
Dataquest: Understanding SettingwithCopyWarning in pandas


SettingWithCopyWarning is one of the most common hurdles people run into when learning pandas. A quick web search will reveal scores of Stack Overflow questions, GitHub issues and forum posts from programmers trying to wrap their heads around what this warning means in their particular situation. It’s no surprise that many struggle with this; there are so many ways to index pandas data structures, each with its own particular nuance, and even pandas itself does not guarantee one single outcome for two lines of code that may look identical.

This guide explains why the warning is generated and shows you how to solve it. It also includes under-the-hood details to give you a better understanding of what’s happening and provides some history on the topic, giving you perspective on why it all works this way.
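Here is a minimal sketch (using a made-up DataFrame, not the auction data set) of the chained assignment that typically triggers the warning, together with the usual .loc fix:

import pandas as pd

df = pd.DataFrame({'price': [100, 150, 200], 'bids': [5, 0, 12]})

# Chained indexing: the first [] may return a copy, so pandas raises
# SettingWithCopyWarning and the assignment may never reach the original frame.
df[df['bids'] == 0]['price'] = 99

# Recommended fix: do the selection and the assignment in a single .loc call.
df.loc[df['bids'] == 0, 'price'] = 99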

In order to explore SettingWithCopyWarning, we’re going to use a data set of the prices of Xboxes sold in 3-day auctions on eBay from the book Modelling Online Auctions. Let’s take a look:


Reuven Lerner: Raw strings to the rescue!


Whenever I teach Python courses, most of my students are using Windows. And thus, when it comes time to do an exercise, I inevitably end up with someone who does the following:

for one_line in open('c:\abc\def\ghi'):

    print(one_line)

The above code looks like it should work. But it almost certainly doesn’t. Why? Because backslashes (\) in Python strings are used to insert special characters. For example, \n inserts a newline, and \t inserts a tab. So when we create the above string, we think that we’re entering a simple path — but we’re actually entering a string containing ASCII 7, the alarm bell.
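You can see this for yourself in the interactive interpreter:

>>> '\a'
'\x07'
>>> len('\a')
1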

Experienced programmers are used to looking for \n and \t in their code.  But \a (alarm bell) and \v (vertical tab), for example, tend to surprise many of them. And if you aren’t an experienced programmer? Then you’re totally baffled why the pathname you’ve entered, and copied so precisely from Windows, results in a “file not found” error.

One way to solve this problem is by escaping the backslashes before the problematic characters.  If you want a literal “\n” in your text, then put “\\n” in your string.  By the same token, you can say “\\a” or “\\v”.  But let’s be honest; remembering which characters require a doubled backslash is a matter of time, experience, and discipline.

(And yes, you can use regular, Unix-style forward slashes on Windows.  But that is generally met by even more baffled looks than the notion of a “vertical tab.”)

You might as well double all of the backslashes — but doing that is really annoying.  Which is where “raw strings” come into play in Python.

A “raw string” is basically a “don’t do anything special with the contents” string — a what-you-see-is-what-you-get string. It’s actually not a different type of data, but rather a way to enter strings in which every backslash is treated as a literal character rather than the start of an escape sequence. Just preface the opening quote (single or double) with the “r” character, and the string will be defined with all of its backslashes left intact. For example, if you say:

print("abc\ndef\nghi")

then you’ll see

abc

def

ghi

But if you say:

print(r"abc\ndef\nghi")

then you’ll see

abc\ndef\nghi

I suggest using raw strings whenever working with pathnames on Windows; it allows you to avoid guessing which characters require escaping. I also use them whenever I’m writing regular expressions in Python, especially if I’m using \b (for word boundaries) or using backreferences to groups.
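For example, the problematic path from above works fine as a raw string, and a raw string keeps regex escapes like \b intact as well (the doubled-word pattern below is just an illustration):

for one_line in open(r'c:\abc\def\ghi'):
    print(one_line)

import re
pattern = re.compile(r'\b(\w+)\s+\1\b')   # matches doubled words like "the the"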

Raw strings are one of those super-simple ideas that can have big implications on the readability of your code, as well as your ability to avoid problems.  Avoid such problems; use raw strings for all of your Windows pathnames, and you’ll be able to devote your attention to fixing actual bugs in your code.

The post Raw strings to the rescue! appeared first on Lerner Consulting Blog.

NumFOCUS: Belinda Weaver joins Carpentries as Community Development Lead

Belinda Weaver was recently hired by the Carpentries as their new Community Development Lead. We are delighted to welcome her to the NumFOCUS family! Here, Belinda introduces herself to the community and invites your participation and feedback on her work. I am very pleased to take up the role of Community Development Lead for Software […]

Mike Driscoll: PyDev of the Week on Hiatus


I don’t know if anyone noticed something amiss this week, but the PyDev of the Week series is currently on hiatus. I have been having trouble getting interviewees to complete their interviews in a timely manner over the last month or so, and I actually ended up running out.

While I have a bunch of new interviewees lined up, none of them have actually finished the interview. So I am suspending the series for the month of July 2017. Hopefully I can get several lined up for August and get the series kicked back into gear. If not, then it will be suspended until I have a decent number of interviews done.

If you happen to have any suggestions for Pythonistas that you would like to see featured here in the PyDev of the Week series, feel free to leave a comment or contact me.

Catalin George Festila: Python Qt4 - part 003.

Today's example is simpler than the other PyQt4 tutorials we have done so far.
The main goal is to understand how to use PyQt4 to display an important message.
To make this example I followed these steps in my python program:
  • importing python modules
  • creating the application in PyQt4 as a tray icon class
  • establishing an exit from the application
  • setting up a message to display
  • display the message over a period of time
  • closing the application
  • running the python application
Here is the source code of my python application:
#! /usr/bin/env python
import sys
from PyQt4 import QtGui, QtCore

class SystemTrayIcon(QtGui.QSystemTrayIcon):
    def __init__(self, parent=None):
        QtGui.QSystemTrayIcon.__init__(self, parent)
        self.setIcon(QtGui.QIcon("mess.svg"))
        menu = QtGui.QMenu(parent)
        exitAction = menu.addAction("Exit")
        self.setContextMenu(menu)
        QtCore.QObject.connect(exitAction, QtCore.SIGNAL('triggered()'), self.exit)

    def click_trap(self, value):
        if value == self.Trigger:  # left click!
            self.left_menu.exec_(QtGui.QCursor.pos())

    def welcome(self):
        self.showMessage("Hello user!", "This is a message from PyQT4")

    def show(self):
        QtGui.QSystemTrayIcon.show(self)
        QtCore.QTimer.singleShot(600, self.welcome)

    def exit(self):
        QtCore.QCoreApplication.exit()

def main():
    app = QtGui.QApplication([])
    tray = SystemTrayIcon()
    tray.show()
    app.exec_()

if __name__ == '__main__':
    main()
I used the PyQt4 python module to build the application and the sys python module to exit from it.
About running the application: the main function starts it.
The SystemTrayIcon class only works inside a QApplication, like any Qt application; this is why I created the app variable.
The tray variable runs the class as a tray icon application.
Inside the SystemTrayIcon class I added a few functions to solve my task.
In __init__ I set everything the tray icon application needs: the icon, the exit menu entry and the signal for exiting.
The other functions are:
  • click_trap - catches the user's click;
  • welcome - builds the message to display;
  • show - displays the welcome message;
  • exit - exits the application
The result of my python script is this message:

About showMessage, the following may help you:

QSystemTrayIcon.showMessage (self, QString title, QString msg, MessageIcon icon = QSystemTrayIcon.Information, int msecs = 10000)

Shows a balloon message for the entry with the given title, message and icon for the time specified in millisecondsTimeoutHint. title and message must be plain text strings.
The message can be clicked by the user; the messageClicked() signal will be emitted when this occurs.
Note that the display of messages is dependent on the system configuration and user preferences, and that messages may not appear at all. Hence, it should not be relied upon as the sole means for providing critical information.
On Windows, the millisecondsTimeoutHint is usually ignored by the system when the application has focus.
On Mac OS X, the Growl notification system must be installed for this function to display messages.
This function was introduced in Qt 4.3.

Kushal Das: Installing Python3.6.1 in your Fedora24/25


Yesterday, one of the participants in the dgplug summer training was trying to use Python 3.6.1 on a Fedora 24 box. There were some issues with the installation, and I actually don’t know how the installation was done. So I just suggested building Python 3.6.1 from source, and then creating as many virtual environments as required to learn Python.

The following commands can help anyone to build from source on Fedora 24 or on Fedora 25.

$ sudo dnf install dnf-plugins-core wget make gcc
$ sudo dnf builddep python3
$ wget https://www.python.org/ftp/python/3.6.1/Python-3.6.1.tgz
$ tar -xvf Python-3.6.1.tgz
$ cd Python-3.6.1
$ ./configure --with-pydebug
$ make -j2

After the above commands, you will have a python binary. Now, the next step is to create a virtual environment.

$ ./python -m venv myenv
$ source myenv/bin/activate
(myenv)$ python

Now you have your own Python 3.6.1. Happy hacking :)

Kushal Das: Touch Typing


Learning new things is an essential part of life. While we spend a lot of time learning various tricks about our tools or a particular programming language, many newcomers miss another important common skill.

The art of Touch Typing.

Ever since I started going to conferences, I have met many people who touch type, and they generally type really fast. I found that to be very common in our circles. But when we meet beginners and discuss the things they should learn, we completely miss this point. Most of the beginners I’ve met can’t type well. And most of the errors people ask about are caused by, guess what?

Typing mistakes.

Also, because they type very slow, beginners lag behind in workshops.

GNU Typist

In the dgplug summer training, just after we’re done with the communication sessions, we ask people to start spending some time learning to type.

My favorite tool is GNU Typist. It’s a small command line tool which can help anyone learn touch typing in a few days. Remember that the package name is gtypist.

In the main menu, you choose one of the many courses shown. The “Quick QWERTY” course is powerful enough to give you a start. After a few screens of description, about how to use the tool, you get into a screen like the one shown below.

As you see, any error will be marked by the tool. In the beginning, it is okay if you keep checking where your fingers are. If you spend a week with this tool, you should be able to start typing faster, and with fewer errors. Your muscle memory will kick in, and you’ll be amazed by your new super power :) When I first used the Das Ultimate (no relation :P) or the Kinesis Advantage keyboard, I spent the first few minutes in gtypist to become familiar with them.

KDE Touch

KDE Touch is a GUI application to learn how to type. It will give you similar levels of detail, using various pretty looking graphs & charts. If you do not like the command line tools, you can always start learning using this tool.

There is also the Tux Typing tool, which is aimed more at kids. The tool shows various words and if you can type them properly, you will be able to provide food to the nice little penguin.

People who are reading this will most probably spend the rest of their lives in front of computers (for various reasons). Learn to type well; having that muscle memory is a powerful tool and will be a source of great strength for you.

Talk Python to Me: #119 Python in Engineering

Think about how you learn most technical or detail-oriented subjects? You start at the bottom, lowest level and you create building blocks and work your way into the actual thing you care about. This happens in engineering, in math, and even in programming.

Our guest this week, Dr. Allen Downey, believes that computation and programming can help us turn this inside-out way of teaching right-side out again. Join Allen and me as we discuss programming as a way of thinking and physical modeling and engineering in Python.

Links from the show:
  • Dr. Allen Downey: http://www.olin.edu/faculty/profile/allen-downey/
  • Allen’s web page: http://www.allendowney.com/wp/
  • Allen’s blog: http://allendowney.blogspot.com/
  • Allen on Twitter: https://twitter.com/allendowney
  • Programming as a Way of Thinking: https://blogs.scientificamerican.com/guest-blog/programming-as-a-way-of-thinking/
  • Think Python book: http://greenteapress.com/wp/think-python/
  • Think OS book: http://greenteapress.com/thinkos/
  • Pint package: https://pint.readthedocs.io
  • ModSim14 course (slight older version): https://sites.google.com/site/modsim14/
  • Modeling and Simulation video: http://www.olin.edu/academics/experience/modeling-simulation/
  • Early work on the book and code: https://github.com/AllenDowney/ModSimPython

Sponsored links:
  • Linode: https://talkpython.fm/linode
  • Rollbar: https://talkpython.fm/rollbar
  • Talk Python Courses: https://training.talkpython.fm/
  • MongoDB Course: https://training.talkpython.fm/courses/explore_mongodb_for_python_developers_course/mongodb-for-python-for-developers-featuring-orm-odm-mongoengine

Python Bytes: #33 You should build an Alexa skill

Sponsored by Rollbar! https://pythonbytes.fm/rollbar

Brian #1: Linting as Lightweight Defect Detection for Python (https://dev.to/sethmichaellarson/linting-as-lightweight-defect-detection-for-python)
  • flake8
  • pycodestyle, formerly pep8 tool: https://pycodestyle.readthedocs.io/en/latest/
  • pep257 can be checked with flake8-docstrings
  • pydocstyle: http://www.pydocstyle.org/

Michael #2: You should build an Alexa skill (https://medium.com/@jacquelinewilson/amazon-alexa-skill-recipe-1444e6ee45a6)
  • Jacqueline Wilson wrote Amazon Alexa Skill Recipe with Python 3.6
  • Ingredients:
    • A developer account on https://developer.amazon.com (“Amazon Developer Console”)
    • An AWS account on https://aws.amazon.com (“AWS Console”)
    • Beginner knowledge of Python 3.x syntax
  • Create a “What’s for dinner” bot
  • Amazon calls these utterances:
    • “What should I have for dinner?”
    • “Do you have a dinner idea?”
    • “What’s for dinner?”
  • Tie the commands to an AWS Lambda function (returns a JSON response)
  • Test via the Alexa Skill Testing Tool: https://echosim.io

Brian #3: RISE (https://github.com/damianavila/RISE)
  • Reveal IPython Slide Extension
  • Making slides with Jupyter notebooks

Michael #4: Closer (https://haarcuba.github.io/closer/)
  • Run, monitor and close remote SSH processes automatically
  • Closer was born because I had trouble with killing up processes I set up remotely via SSH. That is, you want to run some SSH process in the background, and then you want to kill it, just like you would a local subprocess.
  • Main features:
    • kill the remote process (either by choice, or automatically at the end of the calling process)
    • capture the remote process’s output
    • live monitoring of remote process output
    • get a callback upon remote process’ death

Brian #5: Checklist for Python libraries APIs (http://python.apichecklist.com/)

Michael #6: Fades (https://fades.readthedocs.io/en/release_6_0/readme.html)
  • Fades is a system that automatically handles the virtualenvs in the cases normally found when writing scripts and simple programs, and even helps to administer big projects.
  • fades will automagically create a new virtualenv (or reuse a previous created one), installing the necessary dependencies, and execute your script inside that virtualenv, with the only requirement of executing the script with fades and also marking the required dependencies.
  • At the moment you execute the script, fades will search a virtualenv with the marked dependencies, if it doesn’t exists fades will create it, and execute the script in that environment.
  • Indicating dependencies (in code or via CLI):

    import somemodule  # fades == 3
    import somemodule  # fades >= 2.1
    import somemodule  # fades >=2.1,<2.8,!=2.6.5

  • Can control the Python version the env is based upon
  • Can ask for a “refresh” on the virtual env
  • You can also configure fades using .ini config files.
  • How to clean up old virtualenvs?

Listener comment, RE: Episode 32 (https://pythonbytes.fm/episodes/show/32/8-ways-to-contribute-to-open-source-when-you-have-no-time#comment-3400891427):

Jan Oglop: Hello Michael and Brian, I wanted to thank you for amazing work you do. And let you know that you have helped me to find the working place from my dreams! My colleagues has similar hobbies and loves python as much as I do!

Thank you again!

Tryton News: Call for board director candidates 2017


Our current board has been running the Foundation for 5 years already. It is time to renew it! The current board has to co-opt new directors based on the candidatures received. Candidates must apply here before the 30th of September 2017.

Board Directors (photo: CC BY 2.0, Toronto Public Library)

The Foundation needs to be funded to pursue its missions, so do not forget to check out our budget for 2017.

Continuum Analytics News: Package Better with Conda Build 3

Friday, July 7, 2017
Michael Sarahan
Continuum Analytics

Handling version compatibility is one of the hardest challenges in building software. Until now, conda-build has provided helpful tools for constraining or pinning versions in recipes. The limitation of this capability was that it entailed editing a lot of recipes.

Conda-build 3 introduces a new scheme for controlling version constraints, which enhances behavior in two ways. First, you can now set versions in an external file and provide lists of versions for conda-build to loop over. Matrix builds are now much simpler and no longer require an external tool, such as conda-build-all. Second, there have been several new jinja2 functions added, which allow recipe authors to express their constraints relative to the versions of packages installed at build time. This dynamic expression greatly cuts down on the need for editing recipes.

Each of these developments has enabled interesting new capabilities for cross-compiling, as well as improved package compatibility through more intelligent constraints.

This document is intended as a quick overview of new features in conda-build 3. For more information, see the docs.

These demos use conda-build's python API to render and build recipes. That API currently does not have a docs page, but is pretty self explanatory. See the source at: https://github.com/conda/conda-build/blob/master/conda_build/api.py

This jupyter notebook itself is included in conda-build's tests folder. If you're interested in running this notebook yourself, see the tests/test-recipes/variants folder in a git checkout of the conda-build source. Tests are not included with conda packages of conda-build.

from conda_build import api 
import os 
from pprint import pprint

First, set up some helper functions that will output recipe contents in a nice-to-read way:

def print_yamls(recipe, **kwargs):
    yamls = [api.output_yaml(m[0])
             for m in api.render(recipe, verbose=False, permit_unsatisfiable_variants=True, **kwargs)]
    for yaml in yamls:
        print(yaml)
        print('-' * 50)

def print_outputs(recipe, **kwargs):
    pprint(api.get_output_file_paths(recipe, verbose=False, **kwargs))

Most of the new functionality revolves around much more powerful use of jinja2 templates. The core idea is that there is now a separate configuration file that can be used to insert many different entries into your meta.yaml files.

!cat 01_basic_templating/meta.yaml

    package:
      name: abc
      version: 1.0

    requirements:
      build:
       - something {{ something }}
      run:
       - something {{ something }}

# The configuration is hierarchical - it can draw from many config files. One place they can live is alongside meta.yaml:
!cat 01_basic_templating/conda_build_config.yaml
 

    something:
      - 1.0
      - 2.0

 

Since we have one slot in meta.yaml, and two values for that one slot, we should end up with two output packages:

print_outputs('01_basic_templating/')
 
    Returning non-final recipe for abc-1.0-0; one or more dependencies was unsatisfiable:
    Build: something
    Host: None
 
    ['/Users/msarahan/miniconda3/conda-bld/osx-64/abc-1.0-h1332e90_0.tar.bz2',
    '/Users/msarahan/miniconda3/conda-bld/osx-64/abc-1.0-h9f70ef6_0.tar.bz2']

print_yamls('01_basic_templating/')
    package:
        name: abc
        version: '1.0'
    build:
        string: '0'
    requirements:
        build:
            - something 1.0
        run:
            - something 1.0
    extra:
        final: true
 
--------------------------------------------------
    package:
        name: abc
        version: '1.0'
    build:
        string: '0'
    requirements:
        build:
            - something 2.0
        run:
            - something 2.0
    extra:
        final: true
 
--------------------------------------------------

 

OK, that's fun already. But wait, there's more!

We saw a warning about "finalization." That's conda-build trying to figure out exactly what packages are going to be installed for the build process. This is all determined before the build. Doing so allows us to tell you the actual output filenames before you build anything. Conda-build will still render recipes if some dependencies are unavailable, but you obviously won't be able to actually build that recipe.

!cat 02_python_version/meta.yaml
 
    package:
      name: abc
      version: 1.0
 
    requirements:
        build:
            - python
        run:
            - python
 
!cat 02_python_version/conda_build_config.yaml
 
    python:
          - 2.7
          - 3.5
 
print_yamls('02_python_version/')
 
    package:
      name: abc
      version: '1.0'
 
    build:
        string: py27hef4ac7c_0
    requirements:
      build:
          - readline 6.2 2
          - tk 8.5.18 0
          - pip 9.0.1 py27_1
          - setuptools 27.2.0 py27_0
          - openssl 1.0.2l 0
          - sqlite 3.13.0 0
          - python 2.7.13 0
          - wheel 0.29.0 py27_0
          - zlib 1.2.8 3
      run:
          - python >=2.7,<2.8
 
    extra:
      final: true
 
--------------------------------------------------
    package:
        name: abc
        version: '1.0'
    build:
        string: py35h6785551_0
    requirements:
        build:
          - readline 6.2 2
          - setuptools 27.2.0 py35_0
          - tk 8.5.18 0
          - openssl 1.0.2l 0
          - sqlite 3.13.0 0
          - python 3.5.3 1
          - pip 9.0.1 py35_1
          - xz 5.2.2 1
          - wheel 0.29.0 py35_0
          - zlib 1.2.8 3
        run:
          - python >=3.5,<3.6
    extra:
        final: true
 
--------------------------------------------------

 

Here you see that we have many more dependencies than we specified, and we have much more detailed pinning. This is a finalized recipe. It represents exactly the state that would be present for building (at least on the current platform).

So, this new way to pass versions is very fun, but there's a lot of code out there that uses the older way of doing things—environment variables and CLI arguments. Those still work. They override any conda_build_config.yaml settings.

# Setting environment variables overrides the conda_build_config.yaml. This preserves older, well-established behavior.
os.environ["CONDA_PY"] = "3.4"
print_yamls('02_python_version/')
del os.environ['CONDA_PY']
 
    package:
        name: abc
        version: '1.0'
    build:
        string: py34h31af026_0
    requirements:
        build:
          - readline 6.2 2
          - python 3.4.5 0
          - setuptools 27.2.0 py34_0
          - tk 8.5.18 0
          - openssl 1.0.2l 0
          - sqlite 3.13.0 0
          - pip 9.0.1 py34_1
          - xz 5.2.2 1
          - wheel 0.29.0 py34_0
          - zlib 1.2.8 3
        run:
          - python >=3.4,<3.5
    extra:
        final: true
 
--------------------------------------------------

# Passing python as an argument (CLI or to the API) also overrides conda_build_config.yaml
print_yamls('02_python_version/', python="3.6")
 
 
    package:
        name: abc
        version: '1.0'
    build:
        string: py36hd0a5620_0
    requirements:
        build:
          - readline 6.2 2
          - wheel 0.29.0 py36_0
          - tk 8.5.18 0
          - python 3.6.1 2
          - openssl 1.0.2l 0
          - sqlite 3.13.0 0
          - pip 9.0.1 py36_1
          - xz 5.2.2 1
          - setuptools 27.2.0 py36_0
          - zlib 1.2.8 3
        run:
          - python >=3.6,<3.7
 
    extra:
        final: true
 
--------------------------------------------------

 

Wait a minute—what is that h7d013e7 gobbledygook in the build/string field?

Conda-build 3 aims to generalize pinning/constraints. Such constraints differentiate a package. For example, in the past, we have had things like py27np111 in filenames. This is the same idea, just generalized. Since we can't readily put every possible constraint into the filename, we have kept the old ones, but added the hash as a general solution.

There's more information about what goes into a hash at: https://conda.io/docs/building/variants.html#differentiating-packages-built-with-different-variants

Let's take a look at how to inspect the hash contents of a built package.

outputs = api.build('02_python_version/', python="3.6",
                    anaconda_upload=False)
pkg_file = outputs[0]
print(pkg_file)
 
    The following NEW packages will be INSTALLED:
        openssl: 1.0.2l-0
        pip: 9.0.1-py36_1
        python: 3.6.1-2
        readline: 6.2-2
        setuptools: 27.2.0-py36_0
        sqlite: 3.13.0-0
        tk: 8.5.18-0
        wheel: 0.29.0-py36_0
        xz: 5.2.2-1
        zlib: 1.2.8-3
 
    source tree in: /Users/msarahan/miniconda3/conda-bld/abc_1498787283909/work
    Attempting to finalize metadata for abc
 
    INFO:conda_build.metadata:Attempting to finalize metadata for abc
 
    BUILD START: ['abc-1.0-py36hd0a5620_0.tar.bz2']
    Packaging abc
 
    INFO:conda_build.build:Packaging abc
 
    The following NEW packages will be INSTALLED:
        openssl: 1.0.2l-0
        pip: 9.0.1-py36_1
        python: 3.6.1-2
        readline: 6.2-2
        setuptools: 27.2.0-py36_0
        sqlite: 3.13.0-0
        tk: 8.5.18-0
        wheel: 0.29.0-py36_0
        xz: 5.2.2-1
        zlib: 1.2.8-3
 
    Packaging abc-1.0-py36hd0a5620_0
 
    INFO:conda_build.build:Packaging abc-1.0-py36hd0a5620_0
 
    number of files: 0
    Fixing permissions
    Fixing permissions
    updating: abc-1.0-py36hd0a5620_0.tar.bz2
    Nothing to test for: /Users/msarahan/miniconda3/conda-bld/osx-64/abc-1.0-py36hd0a5620_0.tar.bz2
    # Automatic uploading is disabled
    # If you want to upload package(s) to anaconda.org later, type:
 
    anaconda upload /Users/msarahan/miniconda3/conda-bld/osx-64/abc-1.0-py36hd0a5620_0.tar.bz2
 
    # To have conda build upload to anaconda.org automatically, use
    # $ conda config --set anaconda_upload yes
 
    anaconda_upload is not set. Not uploading wheels: []
    /Users/msarahan/miniconda3/conda-bld/osx-64/abc-1.0-py36hd0a5620_0.tar.bz2

# Using command line here just to show you that this command exists.
!conda inspect hash-inputs ~/miniconda3/conda-bld/osx-64/abc-1.0-py36hd0a5620_0.tar.bz2
 
    Package abc-1.0-py36hd0a5620_0 does not include recipe. Full hash information is not reproducible.
    WARNING:conda_build.inspect:Package abc-1.0-py36hd0a5620_0 does not include recipe. Full hash information is not reproducible.
    {'abc-1.0-py36hd0a5620_0': {'files': [],
                                'recipe': {'requirements': {'build': ['openssl '
                                                                      '1.0.2l 0',
                                                                      'pip 9.0.1 '
                                                                      'py36_1',
                                                                      'python '
                                                                      '3.6.1 2',
                                                                      'readline '
                                                                      '6.2 2',
                                                                      'setuptools '
                                                                      '27.2.0 '
                                                                      'py36_0',
                                                                      'sqlite '
                                                                      '3.13.0 0',
                                                                      'tk 8.5.18 0',
                                                                      'wheel '
                                                                      '0.29.0 '
                                                                      'py36_0',
                                                                      'xz 5.2.2 1',
                                                                      'zlib 1.2.8 '
                                                                      '3'],
                                                            'run': ['python '
                                                                    '>=3.6,<3.7']}}}}

 

pin_run_as_build is a special extra key in the config file. It is a generalization of the x.x concept that existed for numpy since 2015. There's more information at: https://conda.io/docs/building/variants.html#customizing-compatibility

Each x indicates another level of pinning in the output recipe. Let's take a look at how we can control the relationship of these constraints. Before now you could certainly accomplish pinning; it just took more work. Now you can define your pinning expressions once, and then change your target versions in just one config file.
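As a rough sketch of what such a config entry can look like (the exact keys and defaults are described in the variants documentation linked above, so treat this as illustrative), a pin_run_as_build section in conda_build_config.yaml might read:

    pin_run_as_build:
      libpng:
        max_pin: x.x

With an entry like this, a recipe that depends on libpng at both build and run time would get its libpng run requirement pinned to the same major.minor version that was used at build time.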

!cat 05_compatible/meta.yaml
 
    package:
      name: compatible
      version: 1.0
 
    requirements:
      build:
        - libpng
      run:
        - {{ pin_compatible('libpng') }}

 

This is effectively saying "add a runtime libpng constraint that follows conda-build's default behavior, relative to the version of libpng that was used at build time."

pin_compatible is a new helper function available to you in meta.yaml. The default behavior is: exact version match lower bound ("x.x.x.x.x.x.x"), next major version upper bound ("x").

print_yamls('05_compatible/')
 
    package:
      name: compatible
      version: '1.0'
    build:
      string: h3d53989_0
    requirements:
      build:
        - libpng 1.6.27 0
        - zlib 1.2.8 3
      run:
        - libpng >=1.6.27,<2
    extra:
      final: true
 
--------------------------------------------------

 

These constraints are completely customizable with pinning expressions:

!cat 06_compatible_custom/meta.yaml
 
    package:
      name: compatible
      version: 1.0
 
    requirements:
      build:
        - libpng
      run:
        - {{ pin_compatible('libpng', max_pin='x.x') }}
 
print_yamls('06_compatible_custom/')
 
    package:
      name: compatible
      version: '1.0'
    build:
      string: ha6c6d66_0
    requirements:
      build:
        - libpng 1.6.27 0
        - zlib 1.2.8 3
      run:
        - libpng >=1.6.27,<1.7
    extra:
      final: true
 
--------------------------------------------------

 

Finally, you can also manually specify version bounds. These supersede any relative constraints.

!cat 07_compatible_custom_lower_upper/meta.yaml
 
    package:
      name: compatible
      version: 1.0
 
    requirements:
      build:
        - libpng
      run:
        - {{ pin_compatible('libpng', min_pin=None, upper_bound='5.0') }}
 
print_yamls('07_compatible_custom_lower_upper/')
 
    package:
      name: compatible
      version: '1.0'
    build:
      string: heb31dda_0
    requirements:
      build:
        - libpng 1.6.27 0
        - zlib 1.2.8 3
      run:
        - libpng <5.0
    extra:
      final: true
 
--------------------------------------------------

 

Much of the development of conda-build 3 has been inspired by improving the compiler toolchain situation. Conda-build 3 adds special support for more dynamic specification of compilers.

!cat 08_compiler/meta.yaml
 
    package:
      name: cross
      version: 1.0
 
    requirements:
      build:
        - {{ compiler('c') }}

 

By replacing any actual compiler with this jinja2 function, we're free to swap in different compilers based on the contents of the conda_build_config.yaml file (or other variant configuration). Rather than saying "I need gcc," we are saying "I need a C compiler."

By doing so, recipes are much more dynamic, and conda-build also helps to keep your recipes in line with respect to runtimes. We're also free to keep compilation and linking flags associated with specific "compiler" packages—allowing us to build against potentially multiple configurations (Release, Debug?). With cross compilers, we could also build for other platforms.

!cat 09_cross/meta.yaml
 
    package:
      name: cross
      version: 1.0
 
    requirements:
      build:
        - {{ compiler('c') }}

# But, by adding in a base compiler name, and target platforms, we can make a build matrix
# This is not magic, the compiler packages must already exist. Conda-build is only following a naming scheme.
!cat 09_cross/conda_build_config.yaml
 
 
    c_compiler:
      - gcc
    target_platform:
      - linux-64
      - linux-cos5-64
      - linux-aarch64
 
print_yamls('09_cross/')
 
    Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-64
    Host: None
    WARNING:conda_build.render:
    Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-64
    Host: None
    Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-cos5-64
    Host: None
    WARNING:conda_build.render:Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-cos5-64
    Host: None
    Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-aarch64
    Host: None
    WARNING:conda_build.render:Returning non-final recipe for cross-1.0-0; one or more dependencies was unsatisfiable:
    Build: gcc_linux-aarch64
    Host: None
 
 
    package:
      name: cross
      version: '1.0'
    build:
      string: '0'
    requirements:
      build:
        - gcc_linux-64
    extra:
      final: true
 
--------------------------------------------------
    package:
      name: cross
      version: '1.0'
    build:
      string: '0'
    requirements:
      build:
        - gcc_linux-cos5-64
    extra:
      final: true
 
--------------------------------------------------
    package:
      name: cross
      version: '1.0'
    build:
      string: '0'
    requirements:
      build:
        - gcc_linux-aarch64
    extra:
      final: true
 
--------------------------------------------------

 

Finally, it is frequently a problem to remember to add runtime dependencies. Sometimes the recipe author is not entirely familiar with the lower level code and has no idea about runtime dependencies. Other times, it's just a pain to keep versions of runtime dependencies in line. Conda-build 3 introduces a way of storing the required runtime dependencies on the package providing the dependency at build time.

For example, using g++ in a non-static configuration will require that the end-user have a sufficiently new libstdc++ runtime library available at runtime. Many people don't currently include this in their recipes. Sometimes the system libstdc++ is adequate, but often not. By imposing the downstream dependency, we can make sure that people don't forget the runtime dependency.

# First, a package that provides some library.
# When anyone uses this library, they need to include the appropriate runtime.
 
!cat 10_runtimes/uses_run_exports/meta.yaml
 
 
    package:
      name: package_has_run_exports
      version: 1.0
    build:
      run_exports:
        - {{ pin_compatible('bzip2') }}
    requirements:
      build:
        - bzip2

# This is the simple downstream package that uses the library provided in the previous recipe.
!cat 10_runtimes/consumes_exports/meta.yaml
 
    package:
      name: package_consuming_run_exports
      version: 1.0
    requirements:
      build:
        - package_has_run_exports

# Let's build the former package first.
api.build('10_runtimes/uses_run_exports', anaconda_upload=False)
 
 
    The following NEW packages will be INSTALLED:
        bzip2: 1.0.6-3
    source tree in: /Users/msarahan/miniconda3/conda-bld/package_has_run_exports_1498787302719/work
    Attempting to finalize metadata for package_has_run_exports
 
    INFO:conda_build.metadata:Attempting to finalize metadata for package_has_run_exports
 
    BUILD START: ['package_has_run_exports-1.0-hcc78ab3_0.tar.bz2']
    Packaging package_has_run_exports
 
    INFO:conda_build.build:Packaging package_has_run_exports
 
    The following NEW packages will be INSTALLED:
        bzip2: 1.0.6-3
    number of files: 0
    Fixing permissions
    Fixing permissions
    updating: package_has_run_exports-1.0-hcc78ab3_0.tar.bz2
    Nothing to test for: /Users/msarahan/miniconda3/conda-bld/osx-64/package_has_run_exports-1.0-hcc78ab3_0.tar.bz2
    # Automatic uploading is disabled
    # If you want to upload package(s) to anaconda.org later, type:
 
    anaconda upload /Users/msarahan/miniconda3/conda-bld/osx-64/package_has_run_exports-1.0-hcc78ab3_0.tar.bz2
     
    # To have conda build upload to anaconda.org automatically, use
    # $ conda config --set anaconda_upload yes
 
    anaconda_upload is not set. Not uploading wheels: []
 
 
    ['/Users/msarahan/miniconda3/conda-bld/osx-64/package_has_run_exports-1.0-hcc78ab3_0.tar.bz2']
 
 
print_yamls('10_runtimes/consumes_exports')
 
    package:
      name: package_consuming_run_exports
      version: '1.0'
    build:
      string: h8346d2f_0
    requirements:
      build:
        - package_has_run_exports 1.0 hcc78ab3_0
      run:
        - bzip2 >=1.0.6,<2
    extra:
      final: true
 
--------------------------------------------------

 

In the above recipe, note that bzip2 has been added as a runtime dependency, and is pinned according to conda-build's default pin_compatible scheme. This behavior can be overridden in recipes if necessary, but we hope it will prove useful.

Catalin George Festila: Python Qt4 - part 004.

Another tutorial about PyQt4, this time with the QLCDNumber widget, which displays a number with LCD-like digits.
This tutorial will show you how to deal with this widget.
First, you need to know more about QLCDNumber, so take a look at its documentation here.
The first example is very simple and will show just one digit, see:
import sys
from PyQt4.QtCore import *
from PyQt4.QtGui import *

class Digit(QWidget):

    def __init__(self, parent=None):
        QWidget.__init__(self, parent)
        self.setWindowTitle("One digit")
        lcd = QLCDNumber(self)

app = QApplication(sys.argv)
ls = Digit()
ls.show()
sys.exit(app.exec_())
Now, the next step is to send data to this digit.
One good example uses a slider.
The position of the slider will be sent to the QLCDNumber.
How can we do that? We will need a vbox to hold the QLCDNumber and the slider, and then use a signal and slot.
Let's see the example:
import sys
from PyQt4.QtCore import *
from PyQt4.QtGui import *

class Digit(QWidget):

    def __init__(self, parent=None):
        QWidget.__init__(self, parent)
        # make widgets
        self.setWindowTitle("One digit with slider")
        lcd = QLCDNumber(self)
        slider = QSlider(Qt.Horizontal, self)
        # set layout variable vbox
        vbox = QVBoxLayout()
        # add widgets
        vbox.addWidget(lcd)
        vbox.addWidget(slider)
        # set the vbox to layout
        self.setLayout(vbox)
        # create signal to slot
        self.connect(slider, SIGNAL("valueChanged(int)"), lcd, SLOT("display(int)"))
        self.resize(200, 170)

if __name__ == '__main__':
    app = QApplication(sys.argv)
    ls = Digit()
    ls.show()
    sys.exit(app.exec_())
In the source code of the example, you can see comments that mark the steps of creating and running the script.
Let's try another example, a digital clock:
import sys
from PyQt4 import QtCore, QtGui

class digital_clock(QtGui.QLCDNumber):
    def __init__(self, parent=None):
        super(digital_clock, self).__init__(parent)
        self.setSegmentStyle(QtGui.QLCDNumber.Filled)
        # the default is 5, change to 8 for seconds
        self.setDigitCount(5)
        self.setWindowTitle("Digital Clock")
        self.resize(200, 70)
        timer = QtCore.QTimer(self)
        timer.timeout.connect(self.showTime)
        timer.start(1000)
        self.showTime()

    def showTime(self):
        time = QtCore.QTime.currentTime()
        text = time.toString('hh:mm')
        # if you setDigitCount to 8,
        # uncomment the next line of code
        # text = time.toString('hh:mm:ss')
        if (time.second() % 2) == 0:
            text = text[:2] + ' ' + text[3:]
        self.display(text)

if __name__ == '__main__':
    app = QtGui.QApplication(sys.argv)
    clock = digital_clock()
    clock.show()
    sys.exit(app.exec_())
If you want to see seconds, you need to set the digit count of the LCD to 8 with setDigitCount (it is 5 by default).
You also need to uncomment this line of code: text = time.toString('hh:mm:ss') and comment out the old one.
You can solve multiple problems with this widget, like a stopwatch, a timer, a clock, a countdown timer and so on.

Catalin George Festila: Python Qt4 - part 005.

Here's another simple example with PyQt4 that allows you to view images from the internet.
You can use any image on the internet to display with this python script.
This example is done in two steps:
  • take a single image from the internet - the httplib python module;
  • display it - the PyQt4 python module
from PyQt4 import QtGui
import sys
import httplib

def getTempPNG():
    conn = httplib.HTTPConnection("www.meteoromania.ro")
    conn.request("GET", "/sateliti/img/id814/id814_2017070718.png")
    return conn.getresponse().read()

def main():
    app = QtGui.QApplication(sys.argv)
    png = getTempPNG()
    pixmap = QtGui.QPixmap()
    pixmap.loadFromData(png)
    label = QtGui.QLabel()
    label.setPixmap(pixmap)
    label.setWindowTitle('METEOSAT-10 Thermal Infrared Channel 10.8 micrometers Glowing temperature')
    label.show()
    app.exec_()

if __name__ == '__main__':
    main()
The result can be seen in this screenshot:

Simple is Better Than Complex: Ask Vitor #3: Mocking Emails


Phillip Ahereza asks:

I’m writing unit tests for my django app and I was wondering if there are any packages for mocking email or if there is any way I could mock sending and receiving of emails.


Answer

Basically, what Django does when you run your test suite is switch your EMAIL_BACKEND to django.core.mail.backends.locmem.EmailBackend, so as to prevent your application from sending emails during test execution.

While using this backend, all emails sent are stored in the outbox attribute of the django.core.mail module.

Let’s see one example on how you can use it to test the email outputs and so on.

urls.py

from django.conf.urls import url
from mysite.core import views

urlpatterns = [
    url(r'^send/$', views.send, name='send'),
]

views.py

from django.http import HttpResponse
from django.core.mail import send_mail

def send(request):
    email = request.GET.get('email')
    if email and '@' in email:
        body = 'This is a test message sent to {}.'.format(email)
        send_mail('Hello', body, 'noreply@mysite.com', [email, ])
        return HttpResponse('<h1>Sent.</h1>')
    else:
        return HttpResponse('<h1>No email was sent.</h1>')

This is a simple view that expects a querystring parameter named email with a valid email address. If the email value fulfills our view requirements, an email is sent to this address. If the email is invalid or no email is provided at all, the view just returns a message for the user.

Now, let’s write some unit tests for it. First, a test case in case no email is provided:

tests.py

from django.core import mail
from django.core.urlresolvers import reverse
from django.test import TestCase

class EmailTest(TestCase):
    def test_no_email_sent(self):
        self.response = self.client.get(reverse('send'))
        self.assertEqual(len(mail.outbox), 0)

We can also write a test case and inspect the email contents:

tests.py

from django.core import mail
from django.core.urlresolvers import reverse
from django.test import TestCase

class EmailTest(TestCase):
    def test_no_email_sent(self):
        self.response = self.client.get(reverse('send'))
        self.assertEqual(len(mail.outbox), 0)

    def test_email_sent(self):
        self.response = self.client.get(reverse('send'), {'email': 'test@example.com'})
        self.assertEqual(len(mail.outbox), 1)
        self.assertEqual(mail.outbox[0].body, 'This is a test message sent to test@example.com.')

Final Remarks

Simple as that! You can find more information about the testing tools and email services on the official documentation: Django Testing Tools - Email Services.

You can also find the source code used in this post on Github: github.com/sibtc/askvitor.

NumFOCUS: FEniCS Conference 2017 in Review

Jack Hale of FEniCS Project was kind enough to share his summary of the recent 2017 FEniCS conference, for which NumFOCUS provided some travel funds. Read on below! The FEniCS Conference 2017 brought together 82 participants from around the world for a conference on the FEniCS Project, a NumFOCUS fiscally sponsored project. FEniCS is an open-source computing […]

Weekly Python StackOverflow Report: (lxxxi) stackoverflow python report


A. Jesse Jiryu Davis: Join Me and PyLadies NYC For a PyGotham Proposal Workshop

Python Insider: Python 3.6.2rc2 is now available for testing

Python 3.6.2rc2 is now available.   Python 3.6.2rc2 is the second release candidate for the next maintenance release of Python 3.6.  See the change log for Python 3.6.2rc2 for the changes included in this release and see the What’s New In Python 3.6 document for more information about features included in the 3.6 series.

You can download Python 3.6.2rc2 here.  3.6.2 is now planned for final release on 2017-07-17 with the next maintenance release expected to follow in about 3 months.

Wesley Chun: Creating events in Google Calendar from Python

NOTE: The code covered in this blogpost is also available in a video walkthrough here.

UPDATE (Jan 2016): Tweaked the code to support oauth2client.tools.run_flow() which deprecates oauth2client.tools.run(). You can read more about that change and migration steps here.

Introduction

So far in this series of blogposts covering authorized Google APIs, we've used Python code to access Google Drive and Gmail. Today, we're going to demonstrate the Google Calendar API. While Google Calendar, and calendaring in general, have been around for a long time and are fairly stable, it's somewhat of a mystery as to why so few developers create good calendar integrations, whether it be with Google Calendar, or other systems. We'll try to show it isn't necessarily difficult and hopefully motivate some of you out there to add a calendaring feature in your next mobile or web app.

Earlier posts (link 1, link 2) demonstrated the structure and "how-to" use Google APIs in general, so more recent posts, including this one, focus on solutions and apps, and use of specific APIs. Once you review the earlier material, you're ready to start with authorization scopes then see how to use the API itself.

    Google Calendar API Scopes

    Below are the Google Calendar API scopes of authorization. There are only a pair (at the time of this writing): read-only and read/write. As usual, use the most restrictive scope you possibly can while still allowing your app to do its work. This makes your app more secure and may prevent inadvertently going over any quotas, or accessing, destroying, or corrupting data. Also, users are less hesitant to install your app if it asks only for more restricted access to their calendars. However, it's likely that in order to really use the API to its fullest, you will probably have to ask for read-write so that you can add, update, or delete events in their calendars.
    • 'https://www.googleapis.com/auth/calendar.readonly'— Read-only access to calendar
    • 'https://www.googleapis.com/auth/calendar' — Read/write access to calendar

    Using the Google Calendar API

    We're going to create a sample Python script that inserts a new event into your Google Calendar. Since this requires modifying your calendar, you need the read/write scope above. The API name is 'calendar' which is currently on version 3, so here's the call to apiclient.discovery.build() you'll use:
    GCAL = discovery.build('calendar', 'v3',
    http=creds.authorize(Http()))
    Note that the lines of code above are predominantly boilerplate (explained in earlier posts and videos). Anyway, now that we've got an established service endpoint with build(), we need to come up with the data to create a calendar event: at the very least, an event name plus start and end times.

    Timezone or offset required

    The API requires either a timezone or a GMT offset, the number of hours your timezone is away from Coordinated Universal Time (UTC, more commonly known as GMT). The format is +/-HH:MM away from UTC. For example, Pacific Daylight Time (PDT, also known as Mountain Standard Time, or MST), is "-07:00," or seven hours behind UTC while Nepal Standard Time (NST [or NPT to avoid confusion with Newfoundland Standard Time]), is "+05:45," or five hours and forty-five minutes ahead of UTC. Also, the offset must be in RFC 3339 format, which implements the specifications of ISO 8601 for the Internet. Timestamps look like the following in the required format: "YYYY-MM-DDTHH:MM:SS±HH:MM". For example, September 15, 2015 at 7 PM PDT is represented by this string: "2015-09-15T19:00:00-07:00".

    If you wish to avoid offsets and would rather use timezone names instead, see the next post in this series (link at bottom).

    The script in this post uses the PDT timezone, so we set the GMT_OFF variable to "-07:00". The EVENT body will hold the event name, and start and end times suffixed with the GMT offset:
    GMT_OFF = '-07:00'    # PDT/MST/GMT-7
    EVENT = {
        'summary': 'Dinner with friends',
        'start': {'dateTime': '2015-09-15T19:00:00%s' % GMT_OFF},
        'end': {'dateTime': '2015-09-15T22:00:00%s' % GMT_OFF},
    }
    Use the insert() method of the events() service to add the event. As expected, one required parameter is the ID of the calendar to insert the event into. A special value of 'primary' has been set aside for the currently authenticated user. The other required parameter is the event body. In our request, we also ask the Calendar API to send email notifications to the guests, and that's done by passing in the sendNotifications flag with a True value. Our call to the API looks like this:
    e = GCAL.events().insert(calendarId='primary',
    sendNotifications=True, body=EVENT).execute()
    The one remaining thing is to confirm that the calendar event was created successfully. We do that by checking the return value — it should be an Event object with all the details we passed in a moment ago:
    print('''*** %r event added:
    Start: %s
    End: %s''' % (e['summary'].encode('utf-8'),
    e['start']['dateTime'], e['end']['dateTime']))
    Now, if you really want some proof the event was created, one of the fields that's created is a link to the calendar event. We don't use it in the code, but you can... just use e['htmlLink'].

    Regardless, that's pretty much the entire script save for the OAuth2 code that we're so familiar with from previous posts. The script is posted below in its entirety, and if you run it, depending on the date/times you use, you'll see something like this:
    $ python gcal_insert.py
    *** 'Dinner with friends' event added:
    Start: 2015-09-15T19:00:00-07:00
    End: 2015-09-15T22:00:00-07:00
    It also works with Python 3, with one slight nit/difference being the "b" prefix on the event name due to converting from Unicode to bytes:
    *** b'Dinner with friends' event added:

    Conclusion

    There can be much more to adding a calendar event, such as events that repeat with a recurrence rule, the ability to add attachments for an event, such as a party invitation or a PDF of the show tickets. For more on what you can do when creating events, take a look at the docs for events().insert() as well as the corresponding developer guide. All of the docs for the Google Calendar API can be found here. Also be sure to check out the companion video for this code sample. That's it!

    Below is the entire script for your convenience which runs on both Python 2 and Python 3 (unmodified!):
    from __future__ import print_function
    from apiclient import discovery
    from httplib2 import Http
    from oauth2client import file, client, tools

    SCOPES = 'https://www.googleapis.com/auth/calendar'
    store = file.Storage('storage.json')
    creds = store.get()
    if not creds or creds.invalid:
        flow = client.flow_from_clientsecrets('client_secret.json', SCOPES)
        creds = tools.run_flow(flow, store)
    GCAL = discovery.build('calendar', 'v3', http=creds.authorize(Http()))

    GMT_OFF = '-07:00'    # PDT/MST/GMT-7
    EVENT = {
        'summary': 'Dinner with friends',
        'start': {'dateTime': '2015-09-15T19:00:00%s' % GMT_OFF},
        'end': {'dateTime': '2015-09-15T22:00:00%s' % GMT_OFF},
        'attendees': [
            {'email': 'friend1@example.com'},
            {'email': 'friend2@example.com'},
        ],
    }

    e = GCAL.events().insert(calendarId='primary',
                             sendNotifications=True, body=EVENT).execute()

    print('''*** %r event added:
    Start: %s
    End: %s''' % (e['summary'].encode('utf-8'),
                  e['start']['dateTime'], e['end']['dateTime']))
    You can now customize this code for your own needs, for a mobile frontend, a server-side backend, or to access other Google APIs. If you want to see another example of using the Calendar API (listing the next 10 events in your calendar), check out the Python Quickstart example or its equivalent in Java (server-side, Android), iOS (Objective-C, Swift), C#/.NET, PHP, Ruby, JavaScript (client-side, Node.js), or Go. That's it... hope you find these code samples useful in helping you get started with the Calendar API!

    Code challenge

    To test your skills and challenge yourself, try creating recurring events (such as when you expect to receive your paycheck), events with attachments, or perhaps editing existing events. UPDATE (Jul 2017): If you're ready for the next step, we cover the first and last of those choices in our follow-up post.

    Import Python: Import Python 132 - Python and Assembly, Python Books, PyPy and more

    Worthy Read

    We can’t just copy/paste ASM directly into a Python script. Instead, Python reads the machine code in as a bytearray of shellcode, where the binary data is represented by hex values, each prefixed with \x.
    asm

    Discover the best books in every Python book category.
    books


    Hydrogen is a package for Atom editor that allows interactive programming in different languages. I would call it a bridge, or even a sweet spot, between Jupyter Notebooks and a full blown IDE (like IntelliJ IDEA).
    IDE


    With the concurrent.futures library, Python gives you a simple way to tweak your scripts to use all the CPU cores in your computer at once. Don’t be afraid to try it out. Once you get the hang of it, it’s as simple as using a for loop, but often a whole lot faster.
    futures
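
    A minimal sketch of the idea (the square function and inputs are made up for illustration):

    from concurrent.futures import ProcessPoolExecutor

    def square(n):
        return n * n

    if __name__ == '__main__':
        # map spreads the calls across worker processes (one per CPU core by default)
        # and returns the results in input order.
        with ProcessPoolExecutor() as executor:
            results = list(executor.map(square, range(10)))
        print(results)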


    Imagine a day when we have a neural network which could watch movies and generate its own movies, or listen to songs and compose new ones. This network would learn from what it sees and hears without you explicitly telling it. This way of letting a neural network learn is known as unsupervised learning.
    machine learning


    In the spirit of increasing the Python community’s inclusivity and diversity, PyBay is pleased to announce this year’s conference scholarships. Our scholarships are designed to support members of our community for whom attending PyBay would present a financial challenge.
    conference

    We’ve discussed a few reasons to use Jupyter Notebooks as a GIS user. From visualization of your data to the recent integration with the ArcGIS platform, Jupyter Notebooks are quickly becoming a crucial component of GIS and data science workflows. In spite of these benefits, coming up to speed and getting comfortable with Jupyter Notebooks can be a daunting task for a new user. There is nuance to the way Jupyter Notebooks operate that can take some time to comprehend.
    jupyter

    CheatSheets for Pandas, numpy etc.
    machine learning

    You want to count the number of times each thing occurs in your list. How do you do it? We'll talk about the many ways to solve this problem, concluding with the most Pythonic way: Counter.
    counter
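
    A quick sketch of the Counter approach, with a made-up list:

    from collections import Counter

    colors = ['red', 'blue', 'red', 'green', 'blue', 'red']
    counts = Counter(colors)
    print(counts)            # Counter({'red': 3, 'blue': 2, 'green': 1})
    print(counts['red'])     # 3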




    Jobs

    India



    Projects

    Susanoo - 89 Stars, 12 Fork
    A REST API security testing framework.

    ssl_logger - 88 Stars, 13 Fork
    Decrypts and logs a process's SSL traffic.

    csvtotable - 19 Stars, 0 Fork
    Simple command-line utility to convert CSV files to searchable and sortable HTML table.

    CORStest - 13 Stars, 4 Fork
    A simple CORS misconfigurations checker

    unarcrypto - 5 Stars, 0 Fork
    unarcrypto is an educational tool to depict cryptography usage in zip, rar and 7zip archives

    sukhoi - 4 Stars, 0 Fork
    Minimalist and powerful Web Crawler.

    vault - 0 Stars, 0 Fork
    Python password manager
