
PyCharm: PyCharm 2018.3 Out Now


PyCharm 2018.3 is now available: Windows Subsystem for Linux (WSL) support, multiline TODOs, improved search everywhere, and more.

Download PyCharm 2018.3

New in PyCharm

  • For those of you on Windows who are developing an application that runs on Linux: Windows Subsystem for Linux is a quick and easy way to always have a Linux environment available. PyCharm 2018.3 can now be configured to use a Python interpreter inside WSL.
  • One of the most-requested features in our issue tracker was multiline TODOs, and we’re happy to announce that these are now available in PyCharm 2018.3.
  • Even though search everywhere (double Shift) is not at all a new feature, we’ve made some great usability improvements, and it’s now easier to see how you can narrow down the results.

Read more about new features on our website. You can also find the full release notes here.

Upgrade Now

To get the new version of PyCharm, upgrade in one of the following ways:

Do you have questions, complaints, or suggestions? Please reach out to us! Ask questions to our support team, report bugs and suggestions on our issue tracker, or just connect with us on Twitter.


Trey Hunner: Black Friday Sale: 50% Off 52 weeks of Python Morsels


I launched a weekly Python skill-building service earlier this year called Python Morsels. This week I’m running my first sale, which will also likely be the biggest sale that I run on Python Morsels for the foreseeable future (I don’t want to say forever, but probably forever).

If you’re an experienced programmer and you feel like your Python code could be more Pythonic, Python Morsels is for you.

Before I jump into details, let me explain what Python Morsels is.

The inspiration

I do on-site Python training for teams, which means I work with a lot of developers at a lot of companies. One question I hear all the time is “How can I make my code more Pythonic?”

Most of the folks I teach are not new to programming and they’re usually not new to Python either, but they also aren’t experienced at leveraging the features and idioms that make Python unique. Being a skilled programmer isn’t the same as being a skilled Python programmer.

Late last year this conundrum inspired me to create Python Morsels.

Learning by doing

My training courses and workshops are exercise-driven and I find an exercise-heavy style of teaching very effective.

You don’t learn by putting information into your head, you learn by trying to retrieve information from your head. You can watch talks and read books and read code, but you’ll absorb very little unless you apply what you’ve learned. You learn by doing, which means writing Python code.

That’s why Python Morsels is entirely about writing code and reflecting on the code you’ve written.

Python Morsels: exercise-driven learning

After you sign up for Python Morsels I’ll send you one exercise every week. Not an interview question: a realistic Python exercise inspired by the interesting problems I’ve had to solve in the past. The purpose of these exercises is to inspire you to learn something new about Python each week.

Each exercise includes a number of bonuses so you can choose your own difficulty level. All exercises also include automated tests so you can test your code quickly. After you’ve solved the exercise I’ll send you a number of different solutions to the problem with a discussion about why we might choose one solution over another. These solutions are meant to help you reconsider the way you write your code.

While solving the bonuses is important, the more important thing is that you get into the habit of time-boxed weekly learning. You want to spend your time effectively and the best way to do that is to form a learning habit and time box that habit. I suggest that you dedicate 30 minutes every week to solving the exercise, regardless of the difficulty level you choose, as well as 30 minutes to reflecting on the solutions email I send you.

So how big is this sale?

Python Morsels normally costs $16/month (or $160/year on the annual plan).

From now until Monday I’m offering a 40% discount off the annual plan, which means you’ll get 52 weeks of Python skill-building for $96. That’s effectively $8/month or a 50% discount when compared to the monthly subscription.

I say “52 weeks” instead of 1 year because Python Morsels subscriptions can be “paused” at any time, which allows for breaks during vacations and busy periods and ensures you’ll get all of the 52 weeks you signed up for.

To take advantage of this discount you’ll need to sign up for Python Morsels, verify your email address, go to the Account page, and click the Subscribe button for the 52 Week Plan. The BLACKFRIDAY discount code should be automatically applied from now until the end of the sale on Monday.

Money back guarantee

This is the first sale I’ve ever held so I’m not sure whether it’s common to offer a guarantee on sales, but I’m going to do it for this one because I’m pretty confident in what I’m offering.

If you contact me with concerns but I can’t find something that works for your needs, I’ll send you a full refund. I want you to improve your Python skills, but I don’t want you signing up for something that isn’t for you. If you end up signing up for Python Morsels and you don’t improve your Python skills because of it, you deserve a refund because I’ve wasted your time.

What do the first 52 weeks of Python Morsels exercises cover?

Python Morsels starts small, but the exercises increase in difficulty over time. The first 52 weeks of Python exercises will wander into a lot of interesting topics.

Within one year we’ll:

  • work with and create our own iterators (both generators and iterator classes)
  • make text-parsing programs and command-line programs
  • talk a lot about readability and code style
  • dive into a number of the built-ins and standard library modules
  • use operator overloading to make classes that support arithmetic
  • create our own context managers
  • create our own decorators
  • use properties and descriptors and even make our own descriptor
  • create custom collections (mappings, sequences, strings, sets, etc.)

Haven’t made a descriptor before? By this time next year you will have!

The sale ends on Monday

This sale ends on Monday, November 26, at the end of the day.

To get an effective 50% discount on Python Morsels over the next 52 weeks, sign up to Python Morsels, verify your email, go to the Account page, and subscribe to the 52 Week Plan.

Share this sale with friends and family

If you have a friend or colleague who might benefit from weekly Python practice, let them know about this sale! The BLACKFRIDAY coupon expires on Monday, but there’s no limit on the number of signups, so there’s no reason to keep this sale a secret.

So please share this email or the discount code with anyone you know who might find value in 52 weeks of Python skill-building.

Frequently Asked Questions

These are questions that I’ve been asked at least once (that’s apparently what “frequently” means now).

Is this for someone who is brand new to programming?

No, it isn’t. Python Morsels is for someone who has been using Python for a while and wants to improve their Python coding practices. Many of the folks currently signed up write primarily Python code, but have a background in at least one other programming language. However, there are a handful of folks who are signed up who would call Python their first and only programming language and I do try to accommodate folks in that camp as much as I can.

In general, I recommend Python Morsels for folks who are currently writing Python code regularly.

How is Python Morsels different from a Python course?

During my on-site trainings I’m present as a live instructor and coach. That’s something you won’t get from Python Morsels. During online courses there are videos explaining each topic before it’s practiced. Python Morsels also doesn’t have that.

The focus of Python Morsels is a bit different than a course or a training. If a Python course is like taking a tennis class, Python Morsels is more like weekly tennis practice. A Python Morsels subscriber described it to me as like Hanon’s finger exercises for piano or Kreutzer’s études for violin. Python Morsels is guided deliberate practice in the domain of writing readable and maintainable Python code.

How much time does this require each week?

I expect that you’ll spend about an hour each week on Python Morsels in total.

You’re a busy person who has production code to write and I don’t want to waste your time. The exercise includes bonuses, but I don’t expect you to solve them all each week: instead I want you to time box yourself. I recommend that you set aside 30 minutes to solve the problem each week, including running the tests and solving as many bonuses as you can. I’d also like you to set aside 30 minutes to reflect on your code while reading the solution email I send each week. I often link to related resources to read/watch, but I’d like you to bookmark those for later.

If you have more than one hour to devote each week, you could sit on the solutions for a couple of days and then re-solve the exercise without looking at the solutions email. I don’t expect this though.

What if the exercises are too easy for me and I don’t learn anything new?

If you find the exercises are too easy, email me and I’ll see what I can do. I’ve developed quite a few exercises over the last year and I may be able to work with you to ensure the exercises you get are a good fit for your experience level. If it turns out that Python Morsels simply isn’t for you, I’ll refund you.

What if the exercises suddenly get too hard for me?

If the exercises turn out to be too challenging for you, either immediately or eventually, email me. I plan to create some easier tracks for Python Morsels eventually (there’s certainly demand for this) and I may have some suitable exercises to send to you. If Python Morsels doesn’t suit your needs and I can’t easily fix the problem, I’ll send you a refund.

Ready to start a weekly skill-building habit?

Are you ready to start 52 weeks of Python skill-building for $96 (normally $192)? That’s less than $2/week and about one hour of your time each week (which really is the bigger cost here).

If you have questions that I didn’t address above, please email me and say what you’re thinking/feeling.

If you’re interested in seeing the opinion of someone who has worked through Python Morsels exercises, see the testimonials on the homepage or take a look at what some of my Python Morsels friends have said about it on Twitter (Andrew Pinkham, Pavel Anni, Jason Wattier, Ben Jones).

Ready to sign up? Click here to get 52 weeks of Python Morsels at 50% off.

Continuum Analytics Blog: Deriving Business Value from Data Science Deployments


By Gus Cavanaugh One of the biggest challenges facing organizations trying to derive value from data science and machine learning is deployment. In this post, we’ll take a look at three common approaches to deploying data science projects, and how Anaconda Enterprise simplifies deployment and allows data scientists to focus on building better models that …
Read more →

The post Deriving Business Value from Data Science Deployments appeared first on Anaconda.

Django Weblog: DSF Board Election for 2019


It's that time of year again where we elect the Django Software Foundation Board of Directors.

If you're interested in helping contribute back to Django and the Django community, we encourage you to stand for this year's election.

To run this year, please fill out this election form by November 29th, 2018 AoE.

Not sure if you want to be a Board Member?

Being a DSF Board Member is a great way to contribute time rather than code or money. If there is something in our community you would like to change or improve, being a Board Member puts you in a position to effect change.

While some of the officer positions do require more of a time commitment, the average Member typically spends just a few hours a month helping to direct the DSF. We have one roughly hour-long meeting each month to conduct the main business and correspond via email/Trello/etc. for smaller matters.

Typical meetings involve topics such as:

  • Approval/discussion of conferences
  • Awarding grants for events such as the many DjangoGirls events around the world
  • Policy and Process changes to membership, voting, structure, etc.
  • Fundraising
  • Awarding the Malcolm Tredinnick Memorial Prize
  • Board Member lead initiatives

This year, in particular, we are in need of someone interested in taking on the role of Treasurer, one of the more time-consuming officer positions.

If you have any questions about the Board or being a Board Member please do not hesitate to reach out to me directly at frank@djangoproject.com, any of our current Board Members, or all of us at once at foundation@djangoproject.com.

Kushal Das: Source of colors in Qubes devices menu items


Have you ever wondered where the device lock icon colors in the device applet of Qubes OS come from? I saw those every day but never bothered to think much about it. My guess was the VM where the devices are attached (because those names are there in the list). Yesterday, when Nina asked me for a proper answer, I decided to confirm the idea I had in my mind.

The way I am learning about the internals of Qubes OS is by reading the source code of the tools/services (I do ask questions to developers on IRC too). As most of the tools are written in Python, the code base is super easy to read, follow, and understand.

In this case, the USB devices get the color from the label of sys-usb VM. This is the special VM to which all of the USB devices get attached by default and label is the color attached to the VM (window decoration and other places). The PCI devices get the color of dom0, thus black by default.
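
To see this for yourself, the qubesadmin Python bindings (which the qvm-* command line tools are built on) expose VM labels directly. A minimal sketch, assuming it runs in dom0 where the Admin API is available:

import qubesadmin

app = qubesadmin.Qubes()

# USB device entries take the label of sys-usb, the VM they are attached from...
print(app.domains['sys-usb'].label)

# ...while PCI devices take the label of dom0 (black by default).
print(app.domains['dom0'].label)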

Just for fun, I changed the color to Purple, a few more colors in life always help :)

gamingdirectional: Create the pool object for Enemy Missile class


In this article we will continue by creating a new missile pool object which will be used to recycle the enemy missile objects, just like the pool object used to recycle the enemy ship objects in the previous article. Basically, we will reuse the object pool class we created earlier and add a new obtain-missile method which takes in the x and y coordinates of...

Source

Reuven Lerner: Black Friday sale — improve your Python + Git for 40% off!


Yup — just like everyone else, I’m having a Black Friday sale on all of my online books and courses. Just this weekend, you can use the coupon code BF2018 to improve your Python skills:

Just use the coupon code BF2018 at checkout to get your 40% discount!  But hurry, the sale only lasts through “Cyber” Monday.  Any questions?  Just e-mail me at reuven@lerner.co.il.

The post Black Friday sale — improve your Python + Git for 40% off! appeared first on Lerner Consulting Blog.

Codementor: How to handle switch case in Python.

A short snippet to deal with the switch case requirement in Python more efficiently.
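
The snippet itself is behind the link, but for context, the usual Python approach is a dict dispatch, since the language has no switch statement. A minimal sketch of the general technique (my illustration, not the linked code):

def on_start():
    return 'starting'

def on_stop():
    return 'stopping'

def handle(command):
    # map "cases" to handler functions; dict.get supplies the default branch
    dispatch = {'start': on_start, 'stop': on_stop}
    return dispatch.get(command, lambda: 'unknown command')()

print(handle('start'))  # starting
print(handle('jump'))   # unknown command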

Codementor: High-performance mathematical paradigms in Python

Learn the best practices to evaluate any mathematical expression over a huge data set.

Simple is Better Than Complex: How to Implement Token Authentication using Django REST Framework


In this tutorial you are going to learn how to implement token-based authentication using Django REST Framework (DRF). Token authentication works by exchanging a username and password for a token that will be used in all subsequent requests to identify the user on the server side.

The specifics of how the authentication is handled on the client side vary a lot depending on the technology/language/framework you are working with. The client could be a mobile application using iOS or Android. It could be a desktop application using Python or C++. It could be a Web application using PHP or Ruby.

But once you understand the overall process, it’s easier to find the necessary resources and documentation for your specific use case.

Token authentication is suitable for client-server applications, where the token is safely stored. You should never expose your token, as doing so would be (sort of) equivalent to handing out your username and password.


Setting Up The REST API Project

So let’s start from the very beginning. Install Django and DRF:

pip install django
pip install djangorestframework

Create a new Django project:

django-admin.py startproject myapi .

Navigate to the myapi folder:

cd myapi

Start a new app. I will call my app core:

django-admin.py startapp core

Here is what your project structure should look like:

myapi/
 |-- core/
 |    |-- migrations/
 |    |-- __init__.py
 |    |-- admin.py
 |    |-- apps.py
 |    |-- models.py
 |    |-- tests.py
 |    +-- views.py
 |-- __init__.py
 |-- settings.py
 |-- urls.py
 +-- wsgi.py
manage.py

Add the core app (you created) and the rest_framework app (you installed) to the INSTALLED_APPS, inside the settings.py module:

myapi/settings.py

INSTALLED_APPS = [
    # Django Apps
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',

    # Third-Party Apps
    'rest_framework',

    # Local Apps (Your project's apps)
    'myapi.core',
]

Return to the project root (the folder where the manage.py script is), and migrate the database:

python manage.py migrate

Let’s create our first API view just to test things out:

myapi/core/views.py

from rest_framework.views import APIView
from rest_framework.response import Response


class HelloView(APIView):
    def get(self, request):
        content = {'message': 'Hello, World!'}
        return Response(content)

Now register a path in the urls.py module:

myapi/urls.py

from django.urls import path
from myapi.core import views

urlpatterns = [
    path('hello/', views.HelloView.as_view(), name='hello'),
]

So now we have an API with just one endpoint, /hello/, on which we can perform GET requests. We can use the browser to consume this endpoint, just by accessing the URL http://127.0.0.1:8000/hello/:

Hello Endpoint HTML

We can also ask to receive the response as plain JSON data by passing the format parameter in the querystring like http://127.0.0.1:8000/hello/?format=json:

Hello Endpoint JSON

Both methods are fine to try out a DRF API, but sometimes a command line tool is more handy as we can play more easily with the requests headers. You can use cURL, which is widely available on all major Linux/macOS distributions:

curl http://127.0.0.1:8000/hello/

Hello Endpoint cURL

But usually I prefer to use HTTPie, which is a pretty awesome Python command line tool:

http http://127.0.0.1:8000/hello/

Hello Endpoint HTTPie

Now let’s protect this API endpoint so we can implement the token authentication:

myapi/core/views.py

from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated  # <-- Here


class HelloView(APIView):
    permission_classes = (IsAuthenticated,)  # <-- And here

    def get(self, request):
        content = {'message': 'Hello, World!'}
        return Response(content)

Try again to access the API endpoint:

http http://127.0.0.1:8000/hello/

Hello Endpoint HTTPie Forbidden

And now we get an HTTP 403 Forbidden error. Now let’s implement the token authentication so we can access this endpoint.


Implementing the Token Authentication

We need to add two pieces of information in our settings.py module. First include rest_framework.authtoken to your INSTALLED_APPS and include the TokenAuthentication to REST_FRAMEWORK:

myapi/settings.py

INSTALLED_APPS = [
    # Django Apps
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',

    # Third-Party Apps
    'rest_framework',
    'rest_framework.authtoken',  # <-- Here

    # Local Apps (Your project's apps)
    'myapi.core',
]

REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': [
        'rest_framework.authentication.TokenAuthentication',  # <-- And here
    ],
}

Migrate the database to create the table that will store the authentication tokens:

python manage.py migrate

Migrate Auth Token

Now we need a user account. Let’s just create one using the manage.py command line utility:

python manage.py createsuperuser --username vitor --email vitor@example.com

The easiest way to generate a token, just for testing purposes, is to use the command line utility again:

python manage.py drf_create_token vitor

drf_create_token

This piece of information, the random string 9054f7aa9305e012b3c2300408c3dfdf390fcddf is what we are going to use next to authenticate.
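
Side note: if you'd rather create the token programmatically (inside a python manage.py shell, for example), the authtoken app exposes a regular Django model. A small sketch, assuming the default User model:

from django.contrib.auth.models import User
from rest_framework.authtoken.models import Token

user = User.objects.get(username='vitor')
# get_or_create returns the existing token if this user already has one
token, created = Token.objects.get_or_create(user=user)
print(token.key)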

But now that we have the TokenAuthentication in place, let’s try to make another request to our /hello/ endpoint:

http http://127.0.0.1:8000/hello/

WWW-Authenticate Token

Notice how our API is now providing some extra information to the client on the required authentication method.

So finally, let’s use our token!

http http://127.0.0.1:8000/hello/ 'Authorization: Token 9054f7aa9305e012b3c2300408c3dfdf390fcddf'

REST Token Authentication

And that’s pretty much it. From now on, all subsequent requests should include the header Authorization: Token 9054f7aa9305e012b3c2300408c3dfdf390fcddf.

The formatting looks a little odd, and how to set this header is a common point of confusion. The details depend on the client and on how it sets HTTP request headers.

For example, if we were using cURL, the command would be something like this:

curl http://127.0.0.1:8000/hello/ -H 'Authorization: Token 9054f7aa9305e012b3c2300408c3dfdf390fcddf'

Or if it was a Python requests call:

import requests

url = 'http://127.0.0.1:8000/hello/'
headers = {'Authorization': 'Token 9054f7aa9305e012b3c2300408c3dfdf390fcddf'}
r = requests.get(url, headers=headers)

Or if we were using Angular, you could implement an HttpInterceptor and set a header:

import { Injectable } from '@angular/core';
import { HttpRequest, HttpHandler, HttpEvent, HttpInterceptor } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable()
export class AuthInterceptor implements HttpInterceptor {
  intercept(request: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
    const user = JSON.parse(localStorage.getItem('user'));
    if (user && user.token) {
      request = request.clone({
        setHeaders: {
          Authorization: `Token ${user.token}`
        }
      });
    }
    return next.handle(request);
  }
}

User Requesting a Token

DRF provides an endpoint for users to request an authentication token using their username and password.

Include the following route to the urls.py module:

myapi/urls.py

from django.urls import path
from rest_framework.authtoken.views import obtain_auth_token  # <-- Here
from myapi.core import views

urlpatterns = [
    path('hello/', views.HelloView.as_view(), name='hello'),
    path('api-token-auth/', obtain_auth_token, name='api_token_auth'),  # <-- And here
]

So now we have a brand new API endpoint, which is /api-token-auth/. Let’s first inspect it:

http http://127.0.0.1:8000/api-token-auth/

API Token Auth

It doesn’t handle GET requests. Basically it’s just a view to receive a POST request with username and password.

Let’s try again:

http post http://127.0.0.1:8000/api-token-auth/ username=vitor password=123

API Token Auth POST

The response body is the token associated with this particular user. From this point on you store this token and apply it to future requests.
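
For example, sticking with Python, a requests.Session can hold the token for you. A minimal sketch of that flow (my illustration, not part of the original tutorial; the username/password are the test values from above):

import requests

session = requests.Session()
response = session.post('http://127.0.0.1:8000/api-token-auth/',
                        data={'username': 'vitor', 'password': '123'})
token = response.json()['token']

# every request made through this session now carries the header
session.headers['Authorization'] = 'Token {}'.format(token)
print(session.get('http://127.0.0.1:8000/hello/').json())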

Then, again, the way you are going to make the POST request to the API depends on the language/framework you are using.

If this was an Angular client, you could store the token in localStorage; if this was a desktop CLI application, you could store it in a dot file in the user’s home directory.


Conclusions

Hopefully this tutorial provided some insights into how token authentication works. I will try to follow up this tutorial with some concrete examples of Angular applications, command line applications, and Web clients as well.

It is important to note that the default token implementation has some limitations, such as allowing only one token per user and providing no built-in way to set an expiry date on the token.
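
If you do need an expiry date, one common workaround is to subclass TokenAuthentication and reject old tokens. A rough sketch (my illustration, not from the tutorial; the 7-day lifetime is an arbitrary example):

from datetime import timedelta

from django.utils import timezone
from rest_framework.authentication import TokenAuthentication
from rest_framework.exceptions import AuthenticationFailed

TOKEN_LIFETIME = timedelta(days=7)  # arbitrary example lifetime

class ExpiringTokenAuthentication(TokenAuthentication):
    def authenticate_credentials(self, key):
        user, token = super().authenticate_credentials(key)
        # Token.created is set when the token is first saved
        if token.created < timezone.now() - TOKEN_LIFETIME:
            raise AuthenticationFailed('Token has expired.')
        return user, token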

You can grab the code used in this tutorial at github.com/sibtc/drf-token-auth-example.

Zero-with-Dot (Oleg Żero): Interacting with custom libraries in Google Colaboratory


Introduction

Colaboratory, also known as Colab, is a great tool created by Google for individuals interested in hands-on experience with recent AI developments. It offers a free CPU/GPU quota and a preconfigured virtual machine instance set up to run the Tensorflow and Keras libraries through a Jupyter notebook. In one sentence, it is a perfect “getting started” point for experimentation with neural networks for any part-time hobbyist or computer nerd.

However, unlike standalone Jupyter, its preconfigured settings are targeted at experimentation and, to a lesser extent, software development. This intention somewhat breaks the development routine(s) for many software developers, including myself. Although it allows certain unix commands to be executed (using the ! mark), essentially all interaction with the virtual machine happens through the notebook itself. As, at least for the time being, there seems to be no way to access the backend using e.g. SSH, working across multiple Python files and libraries becomes a bit cumbersome. In the end, this setup leaves the user with having to force all code into the notebook’s cells.

This post is my best attempt to work around at least some of these difficulties, so here I would like to share what I have managed to come up with so far.

(My) usual pattern

First of all, I would not like to sidetrack this post into a discussion of things such as “my favourite text editor or IDE”, as people tend to have their own opinions on this topic. My preference goes to the linux command line interface and vim/tmux, but yours can be different. The point is that, irrespective of the IDE, the general pattern when working with python projects tends to boil down to the following things:

  1. Create a project directory (let’s call it the project’s root).
  2. Initialize or clone a git repository within that directory.
  3. Set up a virtual environment, activate it, and install relevant packages.
  4. Work across multiple files to solve the problem in a logical and structured way.
  5. While working, commit changes through git, and finally open a pull request once a certain feature is ready.

With Google Colab, the primary focus goes on playing around with the models. Therefore, the first and the third point are resolved automatically when you first generate or open the notebook. (In case you need additional installations, !pip install <package> or !apt-get install <package> would do the job.)

Point number 2 is somewhat solved. You can use the GUI to import or save the notebook itself using GitHub. You may also use the !git clone <your-repo> command to load the stuff from public repositories.

Unfortunately, points 4 and 5 are not quite supported. If you want to modify or version other files… well… it becomes a challenge.

Workaround

Here, I have to make it clear that my workaround is only partial. I have not figured out how to push code to GitHub directly from the Colab notebook (apart from the notebook itself, of course). However, with these simple steps, I managed to make my workflow somewhat more comfortable. Here it goes:

Creating space in Google Drive

First, I keep my work organized in a sub-directory within Google Drive. It is not the “root” directory of the project, but it is where I store all the python files I can edit. Also, mounting Google Drive is very easy in Colaboratory. All you need to do is execute the following lines:

from os.path import join
from google.colab import drive

ROOT = "/content/drive"
drive.mount(ROOT)

Once you respond to the prompt and supply the authentication token, you obtain access to all your files within the Drive.

Fetching the git repository

If you choose to start a new project, you can initialize an empty git repository using:

PROJ = "My Drive/Colab Experimental/Workspace"  # This is a custom path.
PROJECT_PATH = join(ROOT, PROJ)
!mkdir "{PROJECT_PATH}"
!git init "{PROJECT_PATH}"

or fetch the already existing repository using:

GIT_USERNAME = "OlegZero13"
GIT_TOKEN = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
GIT_REPOSITORY = "SomeRepo"
!mkdir "{PROJECT_PATH}"
!git clone https://{GIT_TOKEN}@github.com/{GIT_USERNAME}/{GIT_REPOSITORY}.git "{PROJECT_PATH}"

The lines above assume that you have generated a personal access token on GitHub, which lets you bypass the password prompt. If your repository is public, you will not be asked for a password at all. If you prefer to work with a private repository, however, defining an access token for that particular repository is definitely a safer option than writing the password explicitly.

All in all, this way you will dump the entire project into the Drive.

Figure 1. The project files in Google Drive.

Editing

The final part is to include the project packages into your Colab workspace. This is the fun part. In case you are happy with simply importing content from other python files without editing, you could just as well have cloned the git repository directly to the workspace in Colab (same command without "{PROJECT_PATH}" at the very end).

In many cases, however, you would like to retain the freedom to edit or modify your custom libraries. Now that the files are kept in Google Drive, you can use any text editor such as Anyfile or Drive Notepad. My recommendation? Python Compiler Editor, since it can also execute code.

Figure 2. Example of Python Compiler Editor. Here, we edit the ./utils/somelib.py example library.

Importing

Since our libraries live under Google Drive, simple imports will fail, as Python will not be able to locate the files correctly. Fortunately, the paths to the modules can be handled using Python’s native importlib library with very simple commands.

Again, execute in Colab:

from importlib.machinery import SourceFileLoader

somemodule = SourceFileLoader('somelib', join(PROJECT_PATH, 'utils/somelib.py')).load_module()

This will give you access to all the definitions created in the ./utils/somelib.py file - your custom library. Invoking the functions works like this:

Figure 3. Invoking functions imported from the custom library.

Finally, whenever you edit any of the imported files, you must remember to re-execute the cells where you do the importing. This way you can focus on your main “experimentation thread”, meaning the notebook, while defining your own support functions and classes in separate files. Pretty much just how you would work on a “normal” project.
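
If re-executing the loader cell feels clunky, importlib can also refresh the already loaded module in place. A small sketch; this should work because the SourceFileLoader call above registers the module in sys.modules under the name 'somelib':

import importlib

# re-runs ./utils/somelib.py and updates the existing module object
somemodule = importlib.reload(somemodule)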

Closing Remarks

In this post, we have shown how to take advantage of Google Drive and relative imports to make the Colaboratory workflow more convenient. The post showed how to mount the drive and perform the imports in a way that lets us distribute the code among several files. Versioning the entire project or workspace through git is, unfortunately, still an open question, hence I would greatly appreciate your input in the comments!

PyBites: How to Test Your Django App with Selenium and pytest


In this article I will show you how to test a Django app with pytest and Selenium. We will test our CodeChalleng.es platform comparing the logged out homepage vs the logged in dashboard. We will navigate the DOM matching elements and more. Overall you should learn enough Selenium and pytest to start testing a web page including a login. Sounds exciting? Let's dive straight in!

This article focuses on getting Selenium + pytest working with Django, but as the pytest + Selenium part is applicable to any web app, I ditched the Django / DB part from the final script which I will link to at the end of this article.

Project setup

First we want to make sure we have proper support for pytest in Django, hence after setting up my virtual environment, I installed pytest, pytest-django, selenium and python-dateutil:

$ more requirements.txt
python-dateutil
pytest
pytest-django
selenium

I am going to use a test user account and need to access the DB (which we see in a bit) so I set the following environment variables in my venv/bin/activate:

export DB_HOST=0.0.0.0
export DB_PORT=5432
export DB_NAME=
export DB_USER=
export DB_PASSWORD=
export USER_NAME=Github user
export USER_PASSWORD=

I also unset these in the deactivate function of venv/bin/activate so they don't linger around when I leave the virtual env, a trick I learned when writing Building a Simple Web App With Bottle, SQLAlchemy, and the Twitter API:

...
...
deactivate () {
    ...
    ...
    unset DB_HOST
    unset DB_PORT
    unset DB_NAME
    unset DB_USER
    unset DB_PASSWORD
    unset USER_NAME
    unset USER_PASSWORD
}

pytest setup

Next let's create a pytest.ini file to set the DJANGO_SETTINGS_MODULE environment variable to point to Django's configuration file, by default settings.py inside the main app:

$ cat pytest.ini
[pytest]
DJANGO_SETTINGS_MODULE = mysite.core.settings
# -- recommended but optional:
python_files = tests.py test_*.py *_tests.py

Testing using a DB / conftest.py

Overall I don't need the DB for Selenium testing, but for some tests it would be nice to match up the page elements with what's in the DB, for example the number of Bite exercises shown on the page vs the number of records in the DB.

Another use case I found while writing more Selenium code for our platform was the activation link when users add their email. No real email gets sent from my localhost and/or when testing, so I needed to query the user's object to retrieve the newly generated link.

It took me a bit of trial and error to figure out how to use a real database, because pytest-django takes a conservative approach.

I ended up using a conftest.py file (in the main app folder) as specified in the documentation:

$ cat mysite/core/conftest.py
import os

from django.conf import settings
import pytest

DEFAULT_ENGINE = 'django.db.backends.postgresql_psycopg2'


@pytest.fixture(scope='session')
def django_db_setup():
    settings.DATABASES['default'] = {
        'ENGINE': os.environ.get("DB_ENGINE", DEFAULT_ENGINE),
        'HOST': os.environ["DB_HOST"],
        'NAME': os.environ["DB_NAME"],  # my dedicated test database (!)
        'PORT': os.environ["DB_PORT"],
        'USER': os.environ["DB_USER"],
        'PASSWORD': os.environ["DB_PASSWORD"],
    }

This is a predefined fixture pytest-django provides, which will be triggered if you decorate your test function with @pytest.mark.django_db. As we want to set this up once for the whole test session, I set scope='session' in the fixture's argument.

Test our homepage

Now let's use both pytest and Selenium to test the homepage of our platform logged in vs. logged out. I added the following code to a tests.py file in my main app folder. The pytest.ini settings make sure the pytest command line interface will find it.

Setup work

As per PEP8 first we have some standard library modules, then external ones, then own modules:

from datetime import date
import os
import re

from dateutil.relativedelta import relativedelta
import pytest
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

# here I use the DB/ORM Models to match page elements
from mysite.core.models import Challenge
from bites.models import Bite

HOMEPAGE = 'http://localhost:8000'
TODAY = date.today()

I load in my user from the corresponding env variables:

USER_NAME = os.environ['USER_NAME']
USER_PASSWORD = os.environ['USER_PASSWORD']

And I define a helper function to convert a datetime to an uppercase 3-char month string (SEP/OCT/NOV), we see why in a bit ...

def _make_3char_monthname(dt):
    return dt.strftime('%b').upper()

Selenium driver and pytest's tearDown

First I need to instantiate a Selenium webdriver. As required I have the chromedriver (I am using Chrome) sitting in my ~/bin folder, which is in my $PATH; see the Selenium with Python documentation.

I wrote a second fixture to return a Selenium driver object which will span all tests in my module, so I set scope="module" (for now ... if I needed to re-run this setup for each function, I would leave scope off, defaulting to per-function scope).

One really elegant thing I learned is to simply replace return with yield in a fixture, to have some tearDown code which suited me perfectly here to close out the Chrome browser that Selenium opens while testing:

“pytest supports execution of fixture specific finalization code when the fixture goes out of scope. By using a yield statement instead of return, all the code after the yield statement serves as the teardown code.” (docs)

@pytest.fixture(scope="module")
def driver():
    driver = webdriver.Chrome()
    yield driver
    driver.quit()

Generators are awesome!

Test the logged out homepage

Next a hello world Selenium test: driver.get(HOMEPAGE) reaches out to the platform's homepage and it just checks if the title is as expected. Here is the logged out homepage:

homepage logged out

def test_loggedout_homepage(driver):
    driver.get(HOMEPAGE)
    expected = "PyBites Code Challenges | Hone Your Python Skills"
    assert driver.title == expected

And that is how easy it is to write a Selenium test in pytest!

Note I am using localhost for HOMEPAGE here so prior to this I started my Django app server in a second terminal tab: python manage.py runserver!

Test the logged in dashboard

Let's do something more interesting. Here is the CodeChalleng.es dashboard of my test user:

homepage logged in

Let's see if we can test the following:

  1. The h2 headers are as expected.
  2. The new coding streak calendar at the right bottom shows the last 3 months.
  3. In that widget only one day has the css class today (orange border).
  4. Match the number of Bite of Py links with the number of published Bites in the DB.
  5. Match the number of Blog Challenges links (2nd tab alongside "Bites of Py") with the number of published Challenges in the DB.

DB fixture and login

As I am going to access my DB for steps 4. and 5. I need to decorate my new test function with pytest-django's predefined @pytest.mark.django_db fixture. This will then (magically) reference my django_db_setup in conftest.py (this took me some trial and error).

@pytest.mark.django_db
def test_loggedin_dashboard(driver):
    ...

First I go to HOMEPAGE again and log in using the Sign In With Github button. Initially I located the image and clicked it:

    driver.get(HOMEPAGE)
    login_btn = '//a[img/@src="/static/img/ghlogin.png"]'
    driver.find_element_by_xpath(login_btn).click()

But we improved that on the platform by setting a class attribute on the login button: class="ghLoginBtn" (not an id, because sometimes there are two buttons and id attributes should be unique).

So now I can just do:

    driver.find_element_by_class_name('ghLoginBtn').click()

This takes me to the Github login page and I can login using Selenium's send_keys method. Note the extra return key:

    driver.find_element_by_name('login').send_keys(USER_NAME)
    driver.find_element_by_name('password').send_keys(USER_PASSWORD + Keys.RETURN)

Finding elements

I use Selenium's find_elements_by_tag_name to find all h2 elements (note the s in elements which gets you a list of all), then I check if the expected headers are in there:

    h2s = [h2.text for h2 in driver.find_elements_by_tag_name('h2')]
    expected = [f'Happy Coding, {USER_NAME}!', 'PyBites Platform Updates [all]',
                'PyBites Ninjas (score ≥ 50)', 'Become a better Pythonista!',
                'Keep Calm and Code in Python!    SHARE ON TWITTER']
    for header in expected:
        assert header in h2s, f'{header} not in h2 headers'

Assert calendar headers and a CSS class

You want to learn about dateutil's relativedelta. I use it here because datetime's timedelta does not have a delta of months. Here I calculate the last 3 months, at the time of this writing NOV-, OCT-, and SEP 2018. I then check if these are in the h2 headers:

    # calendar / coding streak feature
    this_month = _make_3char_monthname(TODAY)
    last_month = _make_3char_monthname(TODAY-relativedelta(months=+1))
    two_months_ago = _make_3char_monthname(TODAY-relativedelta(months=+2))
    for month in (this_month, last_month, two_months_ago):
        month_year = f'{month} {TODAY.year}'
        assert month_year in h2s, f'{month_year} not in h2 headers'

Only one day should be marked with the today css class, we can use Selenium's find_elements_by_class_name:

    # only current date is marked active
    assert len(driver.find_elements_by_class_name('today')) == 1

Inspect links

Selenium has a powerful find_elements_by_xpath method that lets me grab all links from the page like so:

    # next test if all bite and challenge links are present
    all_links = driver.find_elements_by_xpath("//a[@href]")

Then I check how many Bites we have in the database and match the link using a regex in a list comprehension:

    expected_num_bites = Bite.objects.filter(published=True).count()
    actual_num_bites = len([link for link in all_links
                            if re.match(r'^Bite \d+\..*',  # no class in html anchors :(
                                        link.text)])
    assert actual_num_bites == expected_num_bites

Ditto for Challenges, but I don't need the regex because they conveniently have a class name of challengeTitle, so I can again use Selenium's find_elements_by_class_name:

    expected_num_challenges = Challenge.objects.filter(published=True).count()
    challenge_titles = driver.find_elements_by_class_name('challengeTitle')
    assert len(challenge_titles) == expected_num_challenges

That's an additional advantage of writing tests: you will find refactoring candidates. Like the Github button earlier we could add a class name to the Bite links to make it easier to target them.


Of course this is only one page and even so it only hits the surface. Other tests we could write for this page:

  1. Resolve a Bite, does your score go up? Cheat a Bite, is only 1 point added to score?
  2. Go from 8 to 10 points, do I earn my first badge?
  3. Go from 48 to 50 points, is my user starting to show up on the leader board (right top)?
  4. Are bitecoins changing from grey to colored when I complete a Bite?
  5. Do coding actions over multiple days get the corresponding green cells in the coding streak / calendar widget?
  6. The NEW background image label for new (< 1 week old) Bites and Challenges.
  7. The Bite Token Counter for the Cherry-Pick Tier this user is on.
  8. Etc. etc.

It is good to start thinking about all these scenarios because as your app grows the permutations of all possible outcomes grow exponentially, so automated testing is paramount.

And with that I hope this gave you a feel how you can test your Django app with pytest and selenium.

The final (stripped down) code

And the result:

pytest selenium result

Check out a simplified version here. I took out the Django requirement, omitting the last two (DB) checks. Without Django's runserver I changed the HOMEPAGE constant to use the live site instead of localhost.

Your turn!

Up for a challenge? We have a dedicated Django + Selenium Code Challenge available on our platform: PCC32 - Test a Simple Django App With Selenium.

Final tip when writing Selenium code

Set a breakpoint in the test you are writing. You can use breakpoint() if on Python >= 3.7, else import pdb; pdb.set_trace().

In the debugger it's easier to try out Selenium's methods on the website in a frozen state. Then you can just copy+paste from debugger to script and vice versa. This will save you a lot of time and makes it more fun :)
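
For example, a throwaway version of the dashboard test might look like this (a sketch of the workflow, not code to keep):

@pytest.mark.django_db
def test_loggedin_dashboard(driver):
    driver.get(HOMEPAGE)
    breakpoint()  # Python >= 3.7; use `import pdb; pdb.set_trace()` before that
    # At the (Pdb) prompt the browser stays open, so you can experiment:
    #     (Pdb) driver.find_element_by_class_name('ghLoginBtn')
    # and paste whatever works back into the test body.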

It takes time to write extended Selenium tests but the exciting part is that you build up your regression suite that will catch future bugs for you, saving you time and assuring you write more reliable code!


Keep Calm and Code|Test in Python!

-- Bob

Python Bytes: #105 Colorizing and Restoring Old Images with Deep Learning

gamingdirectional: Create a pool object for player missile


In this article we will create a pool object inside the player missile manager class which will then be used to recycle the player missile objects. Creating a pool object for the player missile is a lot more complicated than creating one for the enemy missile, because in order to use the player missile pool object we will need to amend a few classes, such as the game manager class in the...

Source

Not Invented Here: Python Argument Surprise


Python function signatures are flexible, complex beasts, allowing for positional, keyword, variable, and variable keyword arguments (and parameters). This can be extremely useful, but sometimes the intersection between these features can be confusing or even surprising, especially on Python 2. What do you expect this to return?

>>> def test(arg1, **kwargs):
...     return arg1
>>> test(**{'arg1': 42})
...

Terminology

Before we move on, let's take a minute to define some terms.

parameter
The name of a variable listed by a function definition. These are sometimes called "formal parameters" or "formal arguments." In def foo(a, b), a and b are parameters.
argument
The expression given to a function application (function call). In foo(1, "str"), 1 and "str" are arguments.
function signature
The set of parameters in a function definition. This is also known as a "parameter list."
binding
The process of associating function call arguments to the parameter names given in the function's signature. In foo(1, "str"), the parameter a will be assigned the value 1, and the parameter b will be assigned the value "str". This is also called "argument filling."
default parameter
In a function signature, a parameter that is assigned a value. An argument for this parameter does not have to be given in a function application; when it is not, the default value is bound to the parameter. In def foo(a, b=42), b=42 creates a default parameter. It can also be said that b has a default parameter value. The function can be called as foo(1).
positional argument
An argument in a function call that's given in order of the parameters in the function signature, from left to right. In foo(1, 2), 1 and 2 are positional arguments that will be bound to the parameters a and b.
keyword argument
An argument in a function call that's given by name, matching the name of a parameter. In foo(a=1), a=1 is a keyword argument, and the parameter a will have the value 1.
variable parameters
A function signature that contains *args (where args is an arbitrary identifier) accepts an arbitrary number of unnamed arguments in addition to any explicit parameters. Extra arguments are bound to args as a tuple. def foo(a, b, *args) creates a function that has variable parameters, and foo(1, 2), foo(1, 2, 3), foo(1, 2, 3, 4) are all valid ways to call it. This is commonly called "varargs," for "variable arguments" (even though it is a parameter definition).
variable positional arguments
Passing an arbitrary (usually unknown from the function call itself) number of arguments to a function by unpacking a sequence. Variable arguments can be given to a function whether or not it accepts variable parameters (if it doesn't, the number of variable arguments must match the number of parameters). This is done using the * syntax: foo(*(1, 2)) is the same as writing foo(1, 2), but more often the arguments are created dynamically. For example, args = (1, 2) if full_moon else (3, 4); foo(*args).
variable keyword parameters
A function signature that contains **kwargs (where kwargs is an arbitrary identifier) accepts an arbitrary number of keyword arguments in addition to any explicit parameters (with or without default values). The definition def foo(a, b, **kwargs) creates a function with a variable keyword parameter. It can be called like foo(1, 2) or foo(1, 2, c=42, d=420).
variable keyword arguments
Similar to variable positional arguments, but using keyword arguments. The syntax is **, and the object to be unpacked must be a mapping; extra arguments are placed in a mapping bound to the parameter identifier. A simple example is foo(**{'b': "str", 'a': 1}).

Some language communities are fairly strict about the usage of these terms, but the Python community is often fairly informal. This is especially true when it comes to the distinction between parameters and arguments (despite it being a FAQ) which helps lead to some of the surprises we discuss below.

Surprises

On to the surprises. These will all come from the intersection of the various terms defined above. Not all of these will surprise everyone, but I would be surprised if most people didn't discover at least one mildly surprising thing.

Non-Default Parameters Accept Keyword Arguments

Any parameter can be called using a keyword argument, whether or not it has a default parameter value:

>>> def test(a, b, c=42):
...     return (a, b, c)
>>> test(1, 2)
(1, 2, 42)
>>> test(1, b='b')
(1, 'b', 42)
>>> test(c=1, b=2, a=3)
(3, 2, 1)

This is surprising because sometimes parameters with a default value are referred to as "keyword parameters" or "keyword arguments," suggesting that only they can be called using a keyword argument. In reality, the parameter just has a default value. It's the function call site that determines whether to use a keyword argument or not.

One consequence of this: the parameter names of public functions, even if they don't have a default, are part of the public signature of the function. If you distribute a library, people can and will call functions using keyword arguments for parameters you didn't expect them to. Changing parameter names can thus break backwards compatibility. (Below we'll see how Python 3 can help with this.)
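
A quick illustration of that compatibility hazard (my example, not from the original):

>>> def get_user(user_id):           # library version 1
...     return {'id': user_id}
>>> get_user(user_id=7)              # a caller binds the parameter by name
{'id': 7}
>>> def get_user(uid):               # version 2 renames the parameter...
...     return {'id': uid}
>>> get_user(user_id=7)              # ...and breaks that caller
Traceback (most recent call last):
...
TypeError: get_user() got an unexpected keyword argument 'user_id'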

Corollary: Variable Keyword Arguments Can Bind Non-Default Parameters

If we introduce variable keyword arguments, we see that this behaviour is consistent:

>>> kwargs = {'a': 'akw', 'b': 'bkw'}
>>> test(**kwargs)
('akw', 'bkw', 42)

Corollary: Positional Parameters Consume Keyword Arguments

Knowing what we know now, we can answer the teaser from the beginning of the article:

>>> def test(arg1, **kwargs):
...     return arg1, kwargs
>>> test(**{'arg1': 42})
(42, {})

The named parameter arg1, even when passed in a variable keyword argument, is still bound by name. There are no extra arguments to place in kwargs.

Default Parameters Accept Positional Arguments

Any parameter can be called using a positional argument, whether or not it has a default parameter value:

>>> def test(a=1, b=2, c=3):
...     return (a, b, c)
>>> test('a', 'b', 'c')
('a', 'b', 'c')

This is the inverse of the previous surprise. It may be surprising for the same reason, the conflation of keyword arguments and default parameter values.

Of course, convention often dictates that default parameters are passed using keyword arguments, but as you can see, that's not a requirement of the language.

Corollary: Variable Positional Arguments Can Bind Default Parameters

Introducing variable positional arguments shows consistent behaviour:

>>> pos_args = ('apos', 'bpos')
>>> test(*pos_args)
('apos', 'bpos', 3)

Mixing Variable Parameters and Keyword Arguments Will Break

Suppose we'd like to define some parameters with default values (expecting them to be passed as keyword arguments by convention), and then also allow for some extra arguments to be passed:

>>> def test(a, b=1, *args):
...     return (a, b, args)

The definition works. Now let's call it in some common patterns:

>>> test('a', 'b')
('a', 'b', ())
>>> test('a', 'b', 'c')
('a', 'b', ('c',))
>>> test('a', b=1)
('a', 1, ())
>>> test('a', b='b', *(1, 2))
Traceback (most recent call last):
...
TypeError: test() got multiple values for argument 'b'

As long as we don't mix keyword and variable (extra) arguments, everything works out. But as soon as we mix the two, the variable positional arguments are bound first, and then we have a duplicate keyword argument left over for b.

This is a common enough source of errors that, as we'll see below, Python 3 added some extra help for it, and linters warn about it.

Functions Implemented In C Can Break The Rules

We generally expect to be able to call functions with keyword arguments, especially when the corresponding parameters have default values, and we expect that the order of keyword arguments doesn't matter. But if the function is not implemented in Python, and instead is a built-in function implemented in C, that may not be the case. Let's look at the built-in function math.tan. On Python 3, if we ask for the function signature, we get something like this:

>>> import math
>>> help(math.tan)
Help on built-in function tan in module math:
<BLANKLINE>
...
    tan(x)
<BLANKLINE>
    Return the tangent of x (measured in radians).
<BLANKLINE>

That sure looks like a parameter with a name, so we expect to be able to call it with a keyword argument:

>>> math.tan(x=1)
Traceback (most recent call last):
...
TypeError: tan() takes no keyword arguments

This is due to how C functions bind Python arguments into C variables, using functions like PyArg_ParseTuple.

In newer versions of Python, this is indicated with a trailing / in the function signature, showing that the preceding parameters are positional-only. (Note that Python has no syntax for this.)

>>> help(abs)
Help on built-in function abs in module builtins:
<BLANKLINE>
abs(x, /)
    Return the absolute value of the argument.

Python 3 Improvements

Python 3 offers ways to reduce some of these surprising characteristics. (For backwards compatibility, it doesn't actually eliminate any of them.)

We've already seen that functions implemented in C can use new syntax in their function signatures to signify positional-only arguments. Plus, more C functions can accept keyword arguments for any arbitrary parameter thanks to new functions and the use of tools like Argument Clinic.

The most important improvements, though, are available to Python functions and are outlined in the confusingly named PEP 3102: Keyword-Only Arguments.

With this PEP, functions are allowed to define parameters that can only be filled by keyword arguments. In addition, this allows functions to accept both variable arguments and keyword arguments without raising TypeError.

This is done by simply moving the variable positional parameter before any parameters that should only be allowed by keyword:

>>> def test(a, *args, b=42):
...     return (a, b, args)
>>> test(1, 2, 3)
(1, 42, (2, 3))
>>> test(1, 2, 3, b='b')
(1, 'b', (2, 3))
>>> test(1, 2, 3, b='b', c='c')
Traceback (most recent call last):
...
TypeError: test() got an unexpected keyword argument 'c'
>>> test()
Traceback (most recent call last):
...
TypeError: test() missing 1 required positional argument: 'a'

What if you don't want to allow arbitrary unnamed arguments? In that case, simply omit the variable argument parameter name:

>>> def test(a, *, b=42):
...     return (a, b)
>>> test(1, b='b')
(1, 'b')

Trying to pass extra arguments will fail:

>>> test(1, 2, b='b')
Traceback (most recent call last):
...
TypeError: test() takes 1 positional argument but 2 positional arguments (and 1 keyword-only argument) were given

Finally, what if you want to require certain parameters to be passed by name? In that case, you can simply leave off the default value for the keyword-only parameters:

>>> def test(a, *, b):
...     return (a, b)
>>> test(1)
Traceback (most recent call last):
...
TypeError: test() missing 1 required keyword-only argument: 'b'
>>> test(1, b='b')
(1, 'b')

The above examples all produce a SyntaxError on Python 2. Much of the functionality can be achieved on Python 2 using variable arguments and variable keyword arguments plus manual argument binding, but that's slower and uglier than what's available on Python 3. Let's look at an example of implementing the first function from this section in Python 2:

>>> def test(*args, **kwargs):
...     "test(a, *args, b=42) -> tuple"  # docstring signature for Sphinx
...     # This raises an IndexError instead of a TypeError if 'a'
...     # is missing; that's easy to fix, but it's a programmer error
...     a = args[0]
...     args = args[1:]
...     b = kwargs.pop('b', 42)
...     if kwargs:
...         raise TypeError("Got extra keyword args %s" % (list(kwargs)))
...     return (a, b, args)
>>> test(1, 2, 3)
(1, 42, (2, 3))
>>> test(1, 2, 3, b='b')
(1, 'b', (2, 3))
>>> test(1, 2, 3, b='b', c='c')
Traceback (most recent call last):
...
TypeError: Got extra keyword args ['c']
>>> test()
Traceback (most recent call last):
...
IndexError: tuple index out of range

Weekly Python StackOverflow Report: (cliii) stackoverflow python report


Bhishan Bhandari: Top 10 Most Subscribed Youtube Channel Visualization[2013-2018]


Motion visualization is intriguing and explains a lot by itself. I have put together an example visualization of the top 10 YouTubers (by subscribers) from November 2013 to November 2018. I am not sure whether flash-based visualizations are still relevant, but I found Google Motion Chart to have explained it very well […]

The post Top 10 Most Subscribed Youtube Channel Visualization[2013-2018] appeared first on The Tara Nights.

Davy Wybiral: Concurrency on the Internet of Things (Arduino, MicroPython, Espruino)

In this presentation I talk about what concurrency actually is, why it matters for Internet of Things applications, and which platforms are best at handling it.

Codementor: Sentiment analysis on Trump's tweets using Python 🐍

A sentiment analysis on Trump's tweets using Python tutorial.

John Cook: Ellipsoid distance on Earth


globe

To first approximation, Earth is a sphere. But it bulges at the equator, and to second approximation, Earth is an oblate spheroid. Earth is not exactly an oblate spheroid either, but the error in the oblate spheroid model is about 100x smaller than the error in the spherical model.

Finding the distance between two points on a sphere is fairly simple. Here’s a calculator to compute the distance, and here’s a derivation of the formula used in the calculator.

Finding the distance between two points on an ellipsoid is much more complicated. (A spheroid is a kind of ellipsoid.) Wikipedia gives a description of Vincenty’s algorithm for finding the distance between two points on Earth using an oblate spheroid model (specifically WGS-84). I’ll include a Python implementation below.

Comparison with spherical distance

How much difference does it make when you calculate distance on an oblate spheroid rather than a sphere? To address that question I looked at the coordinates of several cities around the world using the CityData function in Mathematica. Latitude is in degrees north of the equator and longitude is in degrees east of the prime meridian.

    |--------------+--------+---------|
    | City         |    Lat |    Long |
    |--------------+--------+---------|
    | Houston      |  29.78 |  -95.39 |
    | Caracas      |  10.54 |  -66.93 |
    | London       |  51.50 |   -0.12 |
    | Tokyo        |  35.67 |  139.77 |
    | Delhi        |  28.67 |   77.21 |
    | Honolulu     |  21.31 | -157.83 |
    | Sao Paulo    | -23.53 |  -46.63 |
    | New York     |  40.66 |  -73.94 |
    | Los Angeles  |  34.02 | -118.41 |
    | Cape Town    | -33.93 |   18.46 |
    | Sydney       | -33.87 |  151.21 |
    | Tromsø       |  69.66 |   18.94 |
    | Singapore    |   1.30 |  103.85 |
    |--------------+--------+---------|

Here are the error extremes.

The spherical model underestimates the distance from London to Tokyo by 12.88 km, and it overestimates the distance from London to Cape Town by 45.40 km.

The relative error is most negative for London to New York (-0.157%) and most positive for Tokyo to Sydney (0.545%).

Python implementation

The code below is a direct implementation of the equations in the Wikipedia article.

Note that longitude and latitude below are assumed to be in radians. You can convert from degrees to radians with SciPy’s deg2rad function.

from scipy import sin, cos, tan, arctan, arctan2, arccos, pi

a = 6378137.0 # equatorial radius in meters
f = 1/298.257223563 # ellipsoid flattening
b = (1 - f)*a
tolerance = 1e-11

def spherical_distance(lat1, long1, lat2, long2):
    phi1 = 0.5*pi - lat1
    phi2 = 0.5*pi - lat2
    t = sin(phi1)*sin(phi2)*cos(long1-long2) + cos(phi1)*cos(phi2)
    return a * arccos(t)

def ellipsoidal_distance(lat1, long1, lat2, long2):
    phi1, phi2 = lat1, lat2
    U1 = arctan((1-f)*tan(phi1))
    U2 = arctan((1-f)*tan(phi2))
    L1, L2 = long1, long2
    L = L2 - L1

    lambda_old = L + 0

    while True:
    
        t = (cos(U2)*sin(lambda_old))**2
        t += (cos(U1)*sin(U2) - sin(U1)*cos(U2)*cos(lambda_old))**2
        sin_sigma = t**0.5
        cos_sigma = sin(U1)*sin(U2) + cos(U1)*cos(U2)*cos(lambda_old)
        sigma = arctan2(sin_sigma, cos_sigma) 
    
        sin_alpha = cos(U1)*cos(U2)*sin(lambda_old) / sin_sigma
        cos_sq_alpha = 1 - sin_alpha**2
        cos_2sigma_m = cos_sigma - 2*sin(U1)*sin(U2)/cos_sq_alpha
        C = f*cos_sq_alpha*(4 + f*(4-3*cos_sq_alpha))/16
    
        t = sigma + C*sin_sigma*(cos_2sigma_m + C*cos_sigma*(-1 + 2*cos_2sigma_m**2))
        lambda_new = L + (1 - C)*f*sin_alpha*t
        if abs(lambda_new - lambda_old) <= tolerance:
            break
        else:
            lambda_old = lambda_new

    u2 = cos_sq_alpha*((a**2 - b**2)/b**2)
    A = 1 + (u2/16384)*(4096 + u2*(-768+u2*(320 - 175*u2)))
    B = (u2/1024)*(256 + u2*(-128 + u2*(74 - 47*u2)))
    t = cos_2sigma_m + 0.25*B*(cos_sigma*(-1 + 2*cos_2sigma_m**2))
    t -= (B/6)*cos_2sigma_m*(-3 + 4*sin_sigma**2)*(-3 + 4*cos_2sigma_m**2)
    delta_sigma = B * sin_sigma * t
    s = b*A*(sigma - delta_sigma)

    return s
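
As a quick sanity check (my addition, not from the original post), here is how the two functions compare for London and Tokyo, using the coordinates from the table above and converting degrees to radians with deg2rad as noted:

from scipy import deg2rad

london = (deg2rad(51.50), deg2rad(-0.12))
tokyo = (deg2rad(35.67), deg2rad(139.77))

d_sphere = spherical_distance(*london, *tokyo)
d_ellipsoid = ellipsoidal_distance(*london, *tokyo)

# Per the comparison above, the spherical model should come out
# about 12.88 km short of the ellipsoidal result.
print((d_ellipsoid - d_sphere) / 1000, "km")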