Channel: Planet Python

tryexceptpass: Designing Continuous Build Systems: Handling Webhooks with Sanic


After covering how to design a build pipeline and define build directives in the continuous builds series, it’s time to look at handling events from a code repository.

As internet standards have evolved over the years, HTTP has become more prevalent: it’s easier to route, simpler to implement, and more reliable than many alternatives. This ubiquity makes it easier for applications that traverse or live on the public internet to communicate with each other. As a result, the idea of webhooks came to be as an “event-over-HTTP” mechanism.

With GitHub as the repository management platform, we have the advantage of using their webhook system to communicate user actions over the internet and into our build pipeline.
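As a preview of what that looks like, here is a minimal sketch (not the article’s implementation; the route, port, and app name are my own choices) of a Sanic handler receiving GitHub webhook events:

from sanic import Sanic
from sanic.response import json

app = Sanic("build_system")

@app.route("/webhook", methods=["POST"])
async def webhook(request):
    # GitHub names the event type in this header, e.g. "push" or "ping"
    event = request.headers.get("X-GitHub-Event", "ping")
    payload = request.json or {}
    if event == "push":
        # Hand the commit details off to the build pipeline here
        repo = payload.get("repository", {})
        print(f"push to {payload.get('ref')} from {repo.get('full_name')}")
    return json({"status": "accepted"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)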


Roberto Alsina: Episodio 5: Muchos Pythons


A pseudo-sequel to "Puede Fallar" showing several things:

  • Anvil: a way to build full-stack web applications with Python!
  • Skulpt

And much more!

The application I show in the video: on Anvil

The code: you can clone it

Detail: "the Twitter thing" ended up reduced to a button inside the application, but it served as a trigger :-)

Python Engineering at Microsoft: What’s New for Python in Visual Studio (16.3 Preview 2)


Today, we are releasing Visual Studio 2019 (16.3 Preview 2), which contains an updated testing experience for Python developers. We are happy to announce that the popular Python testing framework pytest is now supported. Additionally, we have reworked the unittest experience for Python users in this release.

Continue reading to learn more about how you can enable and configure pytest and/or unittest for your development environment. Even better, each testing framework is supported both in project mode and in Open Folder scenarios.

 

Enabling and Configuring Testing for Projects

Configuring and working with Python tests in Visual Studio is easier than ever before.

 For users who are new to the testing experience within Visual Studio 2019 for Python projects, right-click on the project name and select the ‘Properties’ option. This option opens the project designer, which allows you to configure tests by going to the ‘Test’ tab.

From this tab, simply click the ’Test Framework’ dropdown box to select the testing framework you wish to use:

[Screen capture: walkthrough of the new testing features in Visual Studio 2019 16.3]

  • For unittest, we use the project’s root directory for test discovery. This is a default setting that can be modified to include the path to the folder that contains your tests (if your tests are included in a sub-directory). We also use the unittest framework’s default pattern for test filenames (this also can be modified if you use a different file naming system for your test files). Prior to this release, unittest discovery was automatically initiated for the user. Now, the user is required to manually configure testing.
  • For pytest, you can specify a .ini configuration file, which contains test filename patterns in addition to many other testing options (a minimal sketch follows this list).
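For instance, a minimal pytest configuration file might look like the following sketch (the option names are standard pytest settings; the values are assumptions about your layout):

[pytest]
testpaths = tests
python_files = test_*.py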

Once you select and save your changes in the window, test discovery is initiated in the Test Explorer. If the Test Explorer is not open, navigate to the toolbar and select Test > Windows > Test Explorer. Test discovery can take up to 60 seconds to complete.

Once in the Test Explorer window, you have the ability to re-run your tests (by clicking the ‘Run All’ button or pressing CTRL + R,A) as well as view the status of your test runs.  Additionally, you can see the total number of tests your project contains and the duration of test runs:

[Screenshot: status of test runs in the Test Explorer]

If you wish to keep working while tests are running in the background but want to monitor the progress of your test run, you can go to the Output window and choose ‘Show output from: Tests’:

[Screenshot: selecting ‘Show output from: Tests’ in the Output window]

We have also made it simple for users with pre-existing projects that contain test files to quickly continue working with their code in Visual Studio 2019. When you open a project that contains testing configuration files (e.g. a .ini file for pytest), but you have not installed or enabled pytest, you will be prompted to install the necessary packages and configure them for the Python environment in which you are working:

[Screenshot: infobar prompting to install and enable pytest]

For open folder scenarios (described below), these informational bars will also be triggered if you have not configured your workspace for pytest or unittest.

 

Configuring Tests for Open Folder Scenarios

In this release of Visual Studio 2019, users can configure tests to work in our popular open folder scenario.

To configure and enable tests, navigate to the Solution Explorer, click the “Show All Files” icon to show all files in the current folder, and select the PythonSettings.json file within the ‘Local Settings’ folder. (If this file doesn’t exist, create one in the ‘Local Settings’ folder.) Next, add the field TestFramework: “pytest” or TestFramework: “unittest” to your settings file, depending on the testing framework you wish to use.

[Screenshot: PythonSettings.json settings file for tests]

 

For the unittest framework, if UnitTestRootDirectory and/or UnitTestPattern are not specified in PythonSettings.json, they are added and assigned the default values “.” and “test*.py”, respectively.
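Putting these together, a minimal PythonSettings.json configured for unittest might look like this sketch, with the default values described above spelled out explicitly:

{
    "TestFramework": "unittest",
    "UnitTestRootDirectory": ".",
    "UnitTestPattern": "test*.py"
}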

As in project mode, editing and saving any file triggers test discovery for the test framework you specified. If you already have the Test Explorer window open, pressing CTRL + R,A also triggers discovery.

Note: If your folder contains a ‘src’ directory which is separate from the folder that contains your tests, you’ll need to specify the path to the src folder in your PythonSettings.json with the setting SearchPaths:

[Screenshot: adding the SearchPaths setting to PythonSettings.json]
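As a sketch, assuming your sources live in a src folder at the root of the workspace, the settings file might then look like:

{
    "TestFramework": "pytest",
    "SearchPaths": ["src"]
}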

 

 

Debugging Tests

In this latest release, we’ve updated test debugging to use our new ptvsd 4 debugger, which is faster and more reliable than ptvsd 3. If you run into issues, you can switch to the legacy debugger: go to Tools > Options > Python > Debugging and check the ‘Use Legacy Debugger’ box.

As in previous releases, if you wish to debug a test, set an initial breakpoint in your code, then right-click the test (or a selection) in Test Explorer and select Debug Selected Tests. Visual Studio starts the Python debugger as it would for application code.

[Screenshot: debugging Python tests in Visual Studio 2019 16.3 Preview 2]

Note: debugging does not automatically stop when the test run completes. This is a known issue, and the current workaround is to click ‘Stop Debugging’ (Shift + F5).

There’s also the ‘Test Detail Summary’ view, which shows the stack trace of a failed test and makes troubleshooting failures even easier. To access this view, simply click on the test within the Test Explorer that you wish to inspect, and the ‘Test Detail Summary’ window will appear.

[Screenshot: Test Detail Summary view in Visual Studio 2019 Preview 2]

 

Try it Out!

Be sure to download Visual Studio 2019 (16.3 Preview 2), install the Python Workload, and give feedback or view a list of existing issues on our GitHub repo.


Python Bytes: #143 Spike the robot, powered by Python!

Real Python: An Effective Python Environment: Making Yourself at Home


When you’re first learning a new programming language, a lot of your time and effort go into understanding the syntax, code style, and built-in tooling. This is just as true for Python as it is for any other language. Once you gain enough familiarity to be comfortable with the ins and outs of Python, you can start to invest time into building a Python environment that will foster your productivity.

Your shell is more than a prebuilt program provided to you as-is. It’s a framework on which you can build an ecosystem. This ecosystem will come to fit your needs so that you can spend less time fiddling and more time thinking about the next big project you’re working on.

Although no two developers have the same setup, there are a number of choices everyone faces when cultivating their Python environment. It’s important to understand each of these decisions and the options available to you!

By the end of this article, you’ll be able to answer questions like:

  • What shell should I use? What terminal should I use?
  • What version(s) of Python can I use?
  • How do I manage dependencies for different projects?
  • How can I make my tools do some of the work for me?

Once you’ve answered these questions for yourself, you can embark on the journey of creating a Python environment to call your very own. Let’s get started!


Shells

When you use a command-line interface (CLI), you execute commands and see their output. A shell is a program that provides this (usually text-based) interface to you. Shells often provide their own programming language that you can use to manipulate files, install software, and so on.

There are more unique shells than could be reasonably listed here, so you’ll see a few prominent ones. Others differ in syntax or enhanced features, but they generally provide the same core functionality.

Unix Shells

Unix is a family of operating systems first developed in the early days of computing. Unix’s popularity has lasted through today, heavily inspiring Linux and macOS. The first shells were developed for use with Unix and Unix-like operating systems.

Bourne Shell (sh)

The Bourne shell—developed by Stephen Bourne for Bell Labs in 1979—was one of the first to incorporate the idea of environment variables, conditionals, and loops. It has provided a strong basis for many other shells in use today and is still available on most systems at /bin/sh.

Bourne-Again Shell (bash)

Built on the success of the original Bourne shell, bash introduced improved user-interaction features. With bash, you get Tab completion, history, and wildcard searching for commands and paths. The bash programming language provides more data types, like arrays.

Z Shell (zsh)

zsh combines many of the best features from other shells along with a few of its own tricks into one experience. zsh offers autocorrection of misspelled commands, shorthand for manipulating multiple files, and advanced options for customizing your command prompt.

zsh also provides a framework for deep customization. The Oh My Zsh project supplies a rich set of themes and plugins, and is often used hand in hand with zsh.

macOS will ship with zsh as its default shell starting with Catalina, speaking to the shell’s popularity. Consider acquainting yourself with zsh now so that you’ll be comfortable with it going forward.

Xonsh

If you’re feeling particularly adventurous, you can give Xonsh a try. Xonsh is a shell that combines some features of other Unix-like shells with the power of Python syntax. You can use the language you already know to accomplish tasks on your filesystem and so on.

Although Xonsh is powerful, it lacks the compatibility other shells tend to share. You might not be able to run many existing shell scripts in Xonsh as a result. If you find that you like Xonsh, but compatibility is a concern, then you can use Xonsh as a supplement to your activities in a more widely used shell.

Windows Shells

Similarly to Unix-like operating systems, Windows also offers a number of options when it comes to shells. The shells offered in Windows vary in features and syntax, so you may need to try several to find one you like best.

CMD (cmd.exe)

CMD (short for “command”) is the default CLI shell for Windows. It’s the successor to COMMAND.COM, the shell built for DOS (disk operating system).

Because DOS and Unix evolved independently, the commands and syntax in CMD are markedly different from shells built for Unix-like systems. However, CMD still provides the same core functionality for browsing and manipulating files, running commands, and viewing output.

PowerShell

PowerShell was released in 2006 and also ships with Windows. It provides Unix-like aliases for most commands, so if you’re coming to Windows from macOS or Linux or have to use both, then PowerShell might be great for you.

PowerShell is vastly more powerful than CMD. With PowerShell you can:

  • Pipe the output of one command to the input of another
  • Automate tasks through the exposed Windows management features
  • Use a scripting language to accomplish complex tasks

Windows Subsystem for Linux

Microsoft has released the Windows Subsystem for Linux (WSL) for running Linux directly on Windows. If you install WSL, then you can use zsh, bash, or any other Unix-like shell. If you want strong compatibility across your Windows and macOS or Linux environments, then be sure to give WSL a try. You may also consider dual-booting Linux and Windows as an alternative.

See this comparison of command shells for exhaustive coverage.

Terminal Emulators

Early developers used terminals to interact with a central mainframe computer. These were devices with a keyboard and a screen or printer that would display computed output.

Today, computers are portable and don’t require separate devices to interact with them, but the terminology still remains. Whereas a shell provides the prompt and interpreter you use to interface with text-based CLI tools, a terminal emulator (often shortened to terminal) is the graphical application you run to access the shell.

Almost any terminal you encounter should support the same basic features:

  • Text colors for syntax highlighting in your code or distinguishing meaningful text in command output
  • Scrolling for viewing an earlier command or its output
  • Copy/paste for transferring text in or out of the shell from other programs
  • Tabs for running multiple programs at once or separating your work into different sessions

macOS Terminals

The terminal options available for macOS are all full-featured, differing mostly in aesthetics and specific integrations with other tools.

Terminal

If you’re using a Mac, then you may have used the built-in Terminal app before. Terminal supports all the usual functionality, and you can also customize the color scheme and a few hotkeys. It’s a nice enough tool if you don’t need many bells and whistles. You can find the Terminal app in Applications → Utilities → Terminal on macOS.

iTerm2

I’ve been a long-time user of iTerm2. It takes the developer experience on Mac a step further, offering a much wider palette of customization and productivity options that enable you to:

  • Integrate with the shell to jump quickly to previously entered commands
  • Create custom search term highlighting in the output from commands
  • Open URLs and files displayed in the terminal with Cmd+click

A Python API ships with the latest versions of iTerm2, so you can even improve your Python chops by developing more intricate customizations!

iTerm2 is popular enough to enjoy first-class integration with several other tools, and has a healthy community building plugins and so on. It’s a good choice because of its more frequent release cycle compared to Terminal, which only updates as often as macOS does.

Hyper

A relative newcomer, Hyper is a terminal built on Electron, a framework for building desktop applications using web technologies. Electron apps are heavily customizable because they’re “just JavaScript” under the hood. You can create any functionality that you can write the JavaScript for.

On the other hand, JavaScript is a high-level programming language and won’t always perform as well as low-level languages like Objective-C or Swift. Be mindful of the plugins you install or create!

Windows Terminals

As with the shell options, Windows terminal options vary widely in utility. Some are tightly bound to a particular shell as well.

Command Prompt

Command Prompt is the graphical application you can use to work with CMD in Windows. Like CMD, it’s a bare-bones tool for getting a few small things done. Although Command Prompt and CMD provide fewer features than other alternatives, you can be confident that they’ll be available on nearly every Windows installation and in a consistent place.

Cygwin

Cygwin is a third-party suite of tools for Windows that provides a Unix-like wrapper. This was my preferred setup when I was in Windows, but you may consider adopting the Windows Subsystem for Linux as it receives more traction and polish.

Windows Terminal

Microsoft recently released an open source terminal for Windows 10 called Windows Terminal. It lets you work in CMD, PowerShell, and even the Windows Subsystem for Linux. If you need to do a fair amount of shell work in Windows, then Windows Terminal is probably your best bet! Windows Terminal is still in late beta, so it doesn’t ship with Windows yet. Check the documentation for instructions on getting access.

Python Version Management

With your choice of terminal and shell made, you can focus your attention on your Python environment specifically.

Something you’ll eventually run into is the need to run multiple versions of Python. Projects you use may only run on certain versions, or you may be interested in creating a project that supports multiple Python versions. You can configure your Python environment to accommodate these needs.

macOS and most Unix operating systems come with a version of Python installed by default. This is often called the system Python. The system Python works just fine, but it’s usually out of date. As of this writing, macOS High Sierra still ships with Python 2.7.10 as the system Python.

Note: You’ll almost certainly want to install the latest version of Python at a minimum, so you’ll have at least two versions of Python already.

It’s important that you leave the system Python as the default, because many parts of the system rely on the default Python being a specific version. This is one of many great reasons to customize your Python environment!

How do you navigate this? Tooling is here to help.

pyenv

pyenv is a mature tool for installing and managing multiple Python versions. I recommend installing it with Homebrew. After you’ve got pyenv installed, you can install multiple versions of Python into your Python environment with a few short commands:

$ pyenv versions
* system
$ python --version
Python 2.7.10
$ pyenv install 3.7.3  # This may take some time
$ pyenv versions
* system
  3.7.3

You can manage which Python you’d like to use in your current session, globally, or on a per-project basis as well. pyenv will make the python command point to whichever Python you specify. Note that none of these overrides the default system Python for other applications, so you’re safe to use them however they work best for you within your Python environment:

$ pyenv global 3.7.3
$ pyenv versions
  system
* 3.7.3 (set by /Users/dhillard/.pyenv/version)
$ pyenv local 3.7.3
$ pyenv versions
  system
* 3.7.3 (set by /Users/dhillard/myproj/.python-version)
$ pyenv shell 3.7.3
$ pyenv versions
  system
* 3.7.3 (set by PYENV_VERSION environment variable)
$ python --version
Python 3.7.3

Because I use a specific version of Python for work, the latest version of Python for personal projects, and multiple versions for testing open source projects, pyenv has proven to be a fairly smooth way for me to manage all these different versions within my own Python environment. See Managing Multiple Python Versions with pyenv for a detailed overview of the tool.

conda

If you’re in the data science community, you might already be using Anaconda (or Miniconda). Anaconda is a sort of one-stop shop for data science software that supports more than just Python.

If you don’t need the data science packages or all the things that come pre-packaged with Anaconda, pyenv might be a better lightweight solution for you. Managing Python versions is pretty similar in each, though. You can install Python versions similarly to pyenv, using the conda command:

$ conda install python=3.7.3

You’ll see a verbose list of all the dependent software conda will install, and it will ask you to confirm.

conda doesn’t have a way to set the “default” Python version or even a good way to see which versions of Python you’ve installed. Rather, it hinges on the concept of “environments,” which you can read more about in the following sections.

Virtual Environments

Now you know how to manage multiple Python versions. Often, you’ll be working on multiple projects that need the same Python version.

Because each project has its own set of dependencies, it’s a good practice to avoid mixing them. If all the dependencies are installed together in a single Python environment, then it will be difficult to discern where each one came from. In the worst cases, two different projects may depend on two different versions of a package, but with Python you can only have one version of a package installed at one time. What a mess!

Enter virtual environments. You can think of a virtual environment as a carbon copy of a base version of Python. If you’ve installed Python 3.7.3, for example, then you can create many virtual environments based off of it. When you install a package in a virtual environment, you do it in isolation from other Python environments you may have. Each virtual environment has its own copy of the python executable.

Tip: Most virtual environment tooling provides a way to update your shell’s command prompt to show the current active virtual environment. Make sure to do this if you frequently switch between projects so you’re sure you’re working inside the correct virtual environment.

venv

venv ships with Python versions 3.3+. You can create virtual environments just by passing it a path at which to store the environment’s python, installed packages, and so on:

$ python -m venv ~/.virtualenvs/my-env

You activate a virtual environment by sourcing its activate script:

$ source ~/.virtualenvs/my-env/bin/activate

You exit the virtual environment using the deactivate command, which is made available when you activate the virtual environment:

(my-env)$ deactivate

venv is built on the wonderful work and successes of the independent virtualenv project. virtualenv still provides a few interesting features of its own, but venv is nice because it provides the utility of virtual environments without requiring you to install additional software. You can probably get pretty far with it if you’re working mostly in a single Python version in your Python environment.

If you’re already managing multiple Python versions (or plan to), then it could make sense to integrate with that tooling to simplify the process of making new virtual environments with specific versions of Python. The pyenv and conda ecosystems both provide ways to specify the Python version to use when you create new virtual environments, covered in the following sections.

pyenv-virtualenv

If you’re using pyenv, then pyenv-virtualenv enhances pyenv with a subcommand for managing virtual environments:

// Create virtual environment
$ pyenv virtualenv 3.7.3 my-env

// Activate virtual environment
$ pyenv activate my-env

// Exit virtual environment
(my-env)$ pyenv deactivate

I switch contexts between a large handful of projects on a day-to-day basis. As a result, I have at least a dozen distinct virtual environments to manage in my Python environment. What’s really nice about pyenv-virtualenv is that you can configure a virtual environment using the pyenv local command and have pyenv-virtualenv auto-activate the right environments as you switch to different directories:

$ pyenv virtualenv 3.7.3 proj1
$ pyenv virtualenv 3.7.3 proj2
$ cd /Users/dhillard/proj1
$ pyenv local proj1
(proj1)$ cd ../proj2
$ pyenv local proj2
(proj2)$ pyenv versions
  system
  3.7.3
  3.7.3/envs/proj1
  3.7.3/envs/proj2
  proj1
* proj2 (set by /Users/dhillard/proj2/.python-version)

pyenv and pyenv-virtualenv have provided a particularly fluid workflow in my Python environment.

conda

You saw earlier that conda treats environments, rather than Python versions, as the main method of working. conda has built-in support for managing virtual environments:

// Create virtual environment
$ conda create --name my-env python=3.7.3

// Activate virtual environment
$ conda activate my-env

// Exit virtual environment
(my-env)$ conda deactivate

conda will install the specified version of Python if it isn’t already installed, so you don’t have to run conda install python=3.7.3 first.

pipenv

pipenv is a relatively new tool that seeks to combine package management (more on this in a moment) with virtual environment management. It mostly abstracts the virtual environment management from you, which can be great as long as things go smoothly:

$ cd /Users/dhillard/myproj

// Create virtual environment
$ pipenv install
Creating a virtualenv for this project…
Pipfile: /Users/dhillard/myproj/Pipfile
Using /path/to/pipenv/python3.7 (3.7.3) to create virtualenv…
✔ Successfully created virtual environment!
Virtualenv location: /Users/dhillard/.local/share/virtualenvs/myproj-nAbMEAt0
Creating a Pipfile for this project…
Pipfile.lock not found, creating…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
Updated Pipfile.lock (a65489)!
Installing dependencies from Pipfile.lock (a65489)…
🐍   ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 0/0 — 00:00:00
To activate this project's virtualenv, run pipenv shell.
Alternatively, run a command inside the virtualenv with pipenv run.

// Activate virtual environment (uses a subshell)
$ pipenv shell
Launching subshell in virtual environment…
 . /Users/dhillard/.local/share/virtualenvs/test-nAbMEAt0/bin/activate

// Exit virtual environment (by exiting subshell)
(myproj-nAbMEAt0)$ exit

pipenv does all the heavy lifting of creating a virtual environment and activating it for you. If you look carefully, you can see that it also creates a file called Pipfile. After you first run pipenv install, this file contains just a few things:

[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[dev-packages]

[packages]

[requires]
python_version = "3.7"

In particular, note that it shows python_version = "3.7". By default, pipenv creates a virtual Python environment using the same Python version it was installed under. If you want to use a different Python version, then you can create the Pipfile yourself before running pipenv install and specify the version you want. If you have pyenv installed, then pipenv will use it to install the specified Python version if necessary.

Abstracting virtual environment management is a noble goal of pipenv, but it does get hung up with hard-to-read errors occasionally. Give it a try, but don’t worry if you feel confused or overwhelmed by it. The tool, documentation, and community will grow and improve around it as it matures.

To get an in-depth introduction to virtual environments, be sure to read Python Virtual Environments: A Primer.

Package Management

For many of the projects you work on, you’ll probably need some number of third-party packages. Those packages may have their own dependencies in turn. In the early days of Python, using packages involved manually downloading files and pointing Python at them. Today, we’re fortunate to have a variety of package management tools available to us.

Most package managers work in tandem with virtual environments, isolating the packages you install in one Python environment from another. Using the two together is where you really start to see the power of the tools available to you.

pip

pip (pip installs packages) has been the de facto standard for package management in Python for several years. It was heavily inspired by an earlier tool called easy_install. Python incorporated pip into the standard distribution starting in version 3.4. pip automates the process of downloading packages and making Python aware of them.

If you have multiple virtual environments, then you can see that they’re isolated by installing a few packages in one:

$ pyenv virtualenv 3.7.3 proj1
$ pyenv activate proj1
(proj1)$ pip list
Package    Version
---------- ---------
pip        19.1.1
setuptools 40.8.0
(proj1)$ python -m pip install requests
Collecting requests
  Downloading .../requests-2.22.0-py2.py3-none-any.whl (57kB)
    100% |████████████████████████████████| 61kB 2.2MB/s
Collecting chardet<3.1.0,>=3.0.2 (from requests)
  Downloading .../chardet-3.0.4-py2.py3-none-any.whl (133kB)
    100% |████████████████████████████████| 143kB 1.7MB/s
Collecting certifi>=2017.4.17 (from requests)
  Downloading .../certifi-2019.6.16-py2.py3-none-any.whl (157kB)
    100% |████████████████████████████████| 163kB 6.0MB/s
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests)
  Downloading .../urllib3-1.25.3-py2.py3-none-any.whl (150kB)
    100% |████████████████████████████████| 153kB 1.7MB/s
Collecting idna<2.9,>=2.5 (from requests)
  Downloading .../idna-2.8-py2.py3-none-any.whl (58kB)
    100% |████████████████████████████████| 61kB 26.6MB/s
Installing collected packages: chardet, certifi, urllib3, idna, requests
Successfully installed certifi-2019.6.16 chardet-3.0.4 idna-2.8 requests-2.22.0 urllib3-1.25.3
(proj1)$ pip list
Package    Version
---------- ---------
certifi    2019.6.16
chardet    3.0.4
idna       2.8
pip        19.1.1
requests   2.22.0
setuptools 40.8.0
urllib3    1.25.3

pip installed requests, along with several packages it depends on. pip list shows you all the currently installed packages and their versions.

Warning: You can uninstall packages using pip uninstall requests, for example, but this will only uninstall requests—not any of its dependencies.

A common way to specify project dependencies for pip is with a requirements.txt file. Each line in the file specifies a package name and, optionally, the version to install:

scipy==1.3.0
requests==2.22.0

You can then run python -m pip install -r requirements.txt to install all of the specified dependencies at once. For more on pip, see What is Pip? A Guide for New Pythonistas.

pipenv

pipenv has most of the same basic operations as pip but thinks about packages a bit differently. Remember the Pipfile that pipenv creates? When you install a package, pipenv adds that package to Pipfile and also adds more detailed information to a new lock file called Pipfile.lock. Lock files act as a snapshot of the precise set of packages installed, including direct dependencies as well as their sub-dependencies.

You can see pipenv sorting out the package management when you install a package:

$ pipenv install requests
Installing requests…
Adding requests to Pipfile's [packages]…
✔ Installation Succeeded
Pipfile.lock (444a6d) out of date, updating to (a65489)…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
✔ Success!
Updated Pipfile.lock (444a6d)!
Installing dependencies from Pipfile.lock (444a6d)…
🐍   ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 5/5 — 00:00:00

pipenv will use this lock file, if present, to install the same set of packages. You can ensure that you always have the same set of working dependencies in any Python environment you create using this approach.

pipenv also distinguishes between development dependencies and production (regular) dependencies. You may need some tools during development, such as black or flake8, that you don’t need when you run your application in production. You can specify that a package is for development when you install it:

$ pipenv install --dev flake8
Installing flake8…
Adding flake8 to Pipfile's [dev-packages]…
✔ Installation Succeeded
...

pipenv install (without any arguments) will only install your production packages by default, but you can tell it to install development dependencies as well with pipenv install --dev.

poetry

poetry addresses additional facets of package management, including creating and publishing your own packages. After installing poetry, you can use it to create a new project:

$ poetry new myproj
Created package myproj in myproj
$ ls myproj/
README.rst    myproj    pyproject.toml    tests

Similarly to how pipenv creates the Pipfile, poetry creates a pyproject.toml file. This recent standard contains metadata about the project as well as dependency versions:

[tool.poetry]
name = "myproj"
version = "0.1.0"
description = ""
authors = ["Dane Hillard <github@danehillard.com>"]

[tool.poetry.dependencies]
python = "^3.7"

[tool.poetry.dev-dependencies]
pytest = "^3.0"

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"

You can install packages with poetry add (or as development dependencies with poetry add --dev):

$ poetry add requests
Using version ^2.22 for requests

Updating dependencies
Resolving dependencies... (0.2s)

Writing lock file

Package operations: 5 installs, 0 updates, 0 removals

  - Installing certifi (2019.6.16)
  - Installing chardet (3.0.4)
  - Installing idna (2.8)
  - Installing urllib3 (1.25.3)
  - Installing requests (2.22.0)

poetry also maintains a lock file, and it has a benefit over pipenv because it keeps track of which packages are subdependencies. As a result, you can uninstall requests and its dependencies with poetry remove requests.

conda

With conda, you can use pip to install packages as usual, but you can also use conda install to install packages from different channels, which are collections of packages provided by Anaconda or other providers. To install requests from the conda-forge channel, you can run conda install -c conda-forge requests.

Learn more about package management in conda in Setting Up Python for Machine Learning on Windows.

Python Interpreters

If you’re interested in further customization of your Python environment, you can choose the command line experience you have when interacting with Python. The Python interpreter provides a read-eval-print loop (REPL), which is what comes up when you type python with no arguments in your shell:

$ python
Python 3.7.3 (default, Jun 17 2019, 14:09:05)
[Clang 10.0.1 (clang-1001.0.46.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 2+2
4
>>> exit()

The REPL reads what you type, evaluates it as Python code, and prints the result. Then it waits to do it all over again. This is about as much as the default Python REPL provides, which is sufficient for a good portion of typical work.

IPython

Like Anaconda, IPython is a suite of tools supporting more than just Python, but one of its main features is an alternative Python REPL. IPython’s REPL numbers each command and explicitly labels each command’s input and output. After installing IPython (python -m pip install ipython), you can run the ipython command in place of the python command to use the IPython REPL:

$ ipython
Python 3.7.3
Type 'copyright', 'credits' or 'license' for more information
IPython 6.0.0.dev -- An enhanced Interactive Python. Type '?' for help.

In [1]: 2+2
Out[1]: 4

In [2]: print("Hello!")
Hello!
IPython also supports Tab completion, more powerful help features, and strong integration with other tooling such as matplotlib for graphing. IPython provided the foundation for Jupyter, and both have been used extensively in the data science community because of their integration with other tools.

The IPython REPL is highly configurable too, so while it falls just shy of being a full development environment, it can still be a boon to your productivity. Its built-in and customizable magic commands are worth checking out.

bpython

bpython is another alternative REPL that provides inline syntax highlighting, tab completion, and even auto-suggestions as you type. It provides quite a few of the quick benefits of IPython without altering the interface much. Without the weight of the integrations and so on, bpython might be good to add to your repertoire for a while to see how it improves your use of the REPL.

Text Editors

You spend a third of your life sleeping, so it makes sense to invest in a great bed. As a developer, you spend a great deal of your time reading and writing code, so it follows that you should invest time in setting up your Python environment’s text editor just the way you like it.

Each editor offers a different set of key bindings and model for manipulating text. Some require a mouse to interact with them effectively, whereas others can be controlled with only the keyboard. Some people consider their choice of text editor and customizations some of the most personal decisions they make!

There are so many options to choose from in this arena, so I won’t attempt to cover it in detail here. Check out Python IDEs and Code Editors (Guide) for a broad overview. A good strategy is to find a simple, small text editor for quick changes and a full-featured IDE for more involved work. Vim and PyCharm, respectively, are my editors of choice.

Python Environment Tips and Tricks

Once you’ve made the big decisions about your Python environment, the rest of the road is paved with little tweaks to make your life a little easier. These tweaks each save minutes or seconds alone, but they collectively save you hours of time.

Making a certain activity easier reduces your cognitive load so you can focus on the task at hand instead of the logistics surrounding it. If you notice yourself performing an action over and over, then consider automating it. Use this wonderful chart from XKCD to determine if it’s worth automating a particular task.

Here are a few final tips.

Know your current virtual environment

As mentioned earlier, it’s a great idea to display the active Python version or virtual environment in your command prompt. Most tools will do this for you, but if not (or if you want to customize the prompt), the value is usually contained in the VIRTUAL_ENV environment variable.
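For example, a minimal bash prompt tweak (a sketch only; most environment tools already do something similar for you) could show the active environment's directory name:

# In ~/.bashrc; bash expands the prompt string each time it is displayed
PS1='${VIRTUAL_ENV:+(${VIRTUAL_ENV##*/}) }\w \$ '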

Disable unnecessary, temporary files

Have you ever noticed *.pyc files all over your project directories? These files are pre-compiled Python bytecode—they help Python start your application faster. In production, these are a great idea because they’ll give you some performance gain. During local development, however, they’re rarely useful. Set PYTHONDONTWRITEBYTECODE=1 to disable this behavior. If you find use cases for them later, then you can easily remove this from your Python environment.
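For example, you could add the following to your shell startup file (~/.bashrc or ~/.zshrc are common choices):

export PYTHONDONTWRITEBYTECODE=1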

Customize your Python interpreter

You can affect how the REPL behaves using a startup file. Python will read this startup file and execute the code it contains before entering the REPL. Set the PYTHONSTARTUP environment variable to the path of your startup file. (Mine’s at ~/.pystartup.) If you’d like to hit Up for command history and Tab for completion like your shell provides, then give this startup file a try.
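A minimal sketch of such a startup file (the history file location is my assumption) might look like:

import atexit
import os
import readline
import rlcompleter  # registers Python's completer with readline

# Tab now completes names instead of inserting a tab character
readline.parse_and_bind("tab: complete")

# Persist command history between REPL sessions
history_file = os.path.expanduser("~/.python_history")
if os.path.exists(history_file):
    readline.read_history_file(history_file)
atexit.register(readline.write_history_file, history_file)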

Conclusion

You learned about many facets of the typical Python environment. Armed with this knowledge, you can:

  • Choose a terminal with the aesthetics and enhanced features you like
  • Choose a shell with as many (or as few) customization options as you need
  • Manage multiple versions of Python on your system
  • Manage multiple projects that use a single version of Python, using virtual Python environments
  • Install packages in your virtual environments
  • Choose a REPL that suits your interactive coding needs

When you’ve got your Python environment just so, I hope you’ll share screenshots, screencasts, or blog posts about your perfect setup ✨



Catalin George Festila: Python 3.7.3 : Using the flask - part 014.

Today I worked on a YouTube search project with Flask and the Google API. The source code is simple to understand, and you can test any Google API this way. I created a new Google project with the YouTube API version 3 and an API key, which I use to connect from the Flask application. I also used the isodate Python module. You can see the source code in my GitHub repo named flask_yt.

Kushal Das: DMARC, mailing list, yahoo and gmail


Last Friday late night, I suddenly started getting a lot of bounced emails from the dgplug mailing list. Within a few minutes, I received more than a thousand emails. A quick look inside of the bounce emails showed the following error:

550-5.7.1 Unauthenticated email from yahoo.in is not accepted due to
550-5.7.1 domain's DMARC policy.

Gmail was blocking one person’s email via our list (he sent it using Yahoo from his iPhone client), and this put more than 1700 Gmail users on our list into the nomail block unless they check for mailman’s email and click to re-enable their membership.

I panicked for a couple of minutes and then started manually clicking through the mailman2 UI for each user to unblock them. However, that was too many clicks. Suddenly I remembered Saptak’s suggestion about using JavaScript to do this kind of work. Even though I have tried to learn JavaScript 4 times and failed happily, I thought a bit of searching on DuckDuckGo and some search/replace within example code could help me out.

// Uncheck every checkbox whose name ends with "nomail"
$checkboxes = document.querySelectorAll("[name$=nomail]");
for (var i=0; i<$checkboxes.length; i++)  {
      $checkboxes[i].checked = false;
}

The above small script helped me to uncheck 50 email addresses at a time, and I managed to unblock the email addresses without spending too many hours clicking.

I have also modified the mailing list DMARC settings as suggested. Now, I have to wait and see if this happens again.

Kushal Das: FreeBSD on a Thinkpad 230


From the first-ever conference I attended, I started picking up many tools and habits from other participants, speakers, and friends. It is still the same at the new conferences I go to: meeting new people and learning about new technologies, or sometimes about technologies which are not so new.

I have used Linux as my primary operating system at home for over 15 years now; getting a good Internet connection helped make it happen. It was the same for my servers too. I run different distributions, depending on the kind of work that needs to be done. When I go to language-specific or general technical conferences, I always find discussions about which distribution is good for what. However, whenever I meet Trouble aka Philip Paeps, his lines are always amusing, while also raising questions about how FreeBSD differs from Linux in every possible way. I had FreeBSD running in a few VMs at home, which is okay for understanding the basics. To learn more in detail, I decided to move my primary site https://kushaldas.in over to FreeBSD around a year ago. It has been running fine till now, and as a simple static website, there is not much to do anyway.

Last week during Rootconf, I again met Trouble and a bunch of old friends (who are all regulars in the FreeBSD world). They helped me understand how to upgrade to the latest release and showed me a few more tricks. I wanted to use it more to become familiar with the command line tools.

I got an X230 laptop with CoreBoot and installed FreeBSD 12 on it. The basic installation went very smoothly. Then, I decided to have KDE as a desktop environment on it. I followed the guide; however, I failed to get sddm working. Even though friends at #freebsd and #bsdin tried to help debug, only in the evening did we figure out that I was missing some critical Xorg-related packages.

# pkg install xf86-input-keyboard xf86-input-mouse xf86-input-synaptics xf86-input-libinput xauth

Also, remember to upgrade the system to the latest release:

# freebsd-update fetch
# freebsd-update install

I have installed the regular applications I use on my standard Linux boxes, including FocusWriter. Remember to install the hunspell package and the corresponding dictionary for your language if you want spell checking in FocusWriter.

I am writing this blog post in the same tool on the FreeBSD system. I had completely forgotten how good the old X series ThinkPad keyboards were; it feels nice to type on this one. I will keep using this system for learning purposes, and I hope to write more in the coming days.


Kushal Das: A few bits on tmux


I don’t remember when I started using tmux, but the move from screen to tmux was quick. I have it installed on all of my systems and VMs. I never bothered to have a proper configuration file, which also means that I never used any plugins or other particular configuration. I prefer not to use plugins much for command-line applications (for example, in Vim), as not all systems will have those plugins installed.

[Screenshot: tmux session]

While working on OSCP labs, I wished for a way to keep my tmux sessions logged, as that helps with creating the report or remembering the process in the future. Later, I found that IPPSec has a video on their usage of tmux, which includes a plugin to log tmux sessions. I decided to give it a go and created my tmux.conf based on the same.

$ cat ~/.tmux.conf

# Remap prefix to screen's Ctrl+a
set -g prefix C-a
bind C-a send-prefix
unbind C-b

# Other values
set -g history-limit 100000
set -g allow-rename off

# Join windows
bind-key j command-prompt -p "Join pane from:"  "join-pane -s '%%'"
bind-key s command-prompt -p "Send pane to:"  "join-pane -t '%%'"

# Search mode VI
set-window-option -g mode-keys vi
bind -T copy-mode-vi y send-keys -X copy-pipe-and-cancel 'xclip -in -selection clipboard'

# git clone https://github.com/tmux-plugins/tmux-logging
run-shell /opt/tmux-logging/logging.tmux

Following IPPSec, I have also changed the prefix key to Ctrl+a. This helps when using another tmux on a remote system, where the default Ctrl+b works as the prefix key. I have also moved the default search to vi mode. You can start selecting text by pressing the spacebar, and then press y to copy the text to the primary system clipboard, which makes it easy to paste into any other GUI application. This feature requires the xclip tool from the system packages.

I have also cloned the tmux-logging repository under /opt.

On Twitter, Justin Garrison pointed me to his super amazing awesome-tmux repository, which contains many, many useful resources on tmux. I spent a good amount of time reading The Tao of tmux.

Now, my tmux works the way I want on my Linux systems and also on the FreeBSD laptop (where I am writing this blog post). Btw, if you search for tmux cheatsheet on https://duckduckgo.com, it provides a lovely view of the cheat sheet on the results page.

Kushal Das: Two new federated services for dgplug


Last week we started providing two new services for the dgplug members.

Mastodon service at toots

Having our own instance had been in my head as a plan for some time. I had a personal Mastodon account before, but that instance went down, and I never tried to find a new home. This time, I think that if a few of us (the sys-admins from the group) use this as a regular thing ourselves, it will be much easier to maintain than depending on someone else.

Any regular dgplug member can get an invite link for the instance by joining the IRC channel and asking for the same.

Blogging platform

In our summer training, we spend much time talking about communication, and a significant part of that is focused on blogging. We suggest https://wordpress.com as a starting place to newcomers. At the same time, we found that some people had trouble because they were more focused on the themes or other options than on writing regularly.

I looked at https://write.as before, but as I saw that https://people.kernel.org is now running on WriteFreely, I thought of giving it a try. The UI is much more straightforward, and as it uses Markdown by default, that is a plus point for our use case. Most of this year’s participants already have their own blogs, so we don’t have many people at the beginning, which helps, as it means not too many support requests for us.

Just like the Mastodon instance, if you need a home for your blog, come over to our IRC channel #dgplug on the Freenode server and ask for an account.

Backup of the systems

In my mind, this is the biggest question in providing these services. We set up a very initial backup system, and we will see in the coming weeks how it stands. Maybe we will take down the services, try to restore everything from backup, and see how it goes.

Btw, if you want to follow me over Mastodon, then I am available at https://toots.dgplug.org/@kushal

Kushal Das: Highest used Python code in the Pentesting/Security world

python -c 'import pty;pty.spawn("/bin/bash")'

I think this is the most used Python program in the land of pentesting/security. Almost every blog post or tutorial I read talks about the above-mentioned line to get a proper terminal after getting access to a minimal shell on a remote Linux server.

What does this code do?

We are calling the Python executable with -c and Python statements inside the double quotes. -c executes the Python statements, and as we are running it in non-interactive mode, it parses the entire input before executing it.

The code we pass as the argument to -c has two statements.

import pty
pty.spawn("/bin/bash")

pty is a Python module which defines operations related to the pseudo-terminal concept: it can create another process, and from the controlling terminal, it can read/write to the new process.

The pty.spawn function spawns a new process (/bin/bash in this case) and then connects the IO of the new process to the parent/controlling process.

[Screenshot: demo of getting a proper bash shell]

In most cases, even though you get access to bash using the way mentioned above, TAB completion still does not work. To enable it, press Ctrl+z to suspend the process, and then use the following command in your terminal.

stty raw -echo

stty changes terminal line settings and is part of the GNU coreutils package. To read about the options we set with raw -echo, see the man page of stty.
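Putting the whole sequence together, the usual workflow looks like this sketch (the fg step, which resumes the suspended shell, is typed blind since echo is off):

python -c 'import pty;pty.spawn("/bin/bash")'
# Press Ctrl+z to suspend, then in your local terminal:
stty raw -echo
fg
# Press Enter, and TAB completion should now work in the remote shell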

Many years ago, I watched a documentary about security firms showcasing offensive attacks; that was the first time I saw them using Python scripts to send in the payload and exploit remote systems. Now, I am using similar scripts in the lab to learn and have fun with Python. It is a new world for me, but it also shows the diverse world we serve via Python.

Kushal Das: Setting up WKD


We fetch GPG public keys from the keyservers using the GPG fingerprint (or parts of it). This step is still problematic for most of us, as the servers may not be responding, or the key may be missing (not pushed) from the server. Also, if we only have the email address, there is no easy way to download the corresponding GPG key.

Web Key Directory to rescue

This is where the Web Key Directory comes into the picture. We use WKD to enable others to get our GPG keys for our email addresses very easily. In simple terms:

The Web Key Directory is the HTTPS directory from which keys can be fetched.

Let us first see this in action:

gpg --auto-key-locate clear,wkd --locate-key mail@kushaldas.in

The above will fetch you the key for the email address, and you can also assume that the person who owns the key has access to the https://kushaldas.in server.
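If your GnuPG is recent enough (2.2.12 or later, as far as I know), the gpg-wks-client tool can also print the exact URL a client will fetch for a given address, which is handy for checking a setup:

$ /usr/lib/gnupg/gpg-wks-client --print-wkd-url mail@kushaldas.in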

There are many email clients which will do this for you, for example Thunderbird/Enigmail 2.0 or KMail version 5.6 onwards.

Setting up WKD for your domain

I was going through the steps mentioned in the GNUPG wiki when weasel pointed me to a Makefile that keeps things even more straightforward.

all: update install

update:
        rm -rfv openpgpkey
        mkdir -v openpgpkey
        echo 'A85FF376759C994A8A1168D8D8219C8C43F6C5E1 mail@kushaldas.in' | /usr/lib/gnupg/gpg-wks-client -v --install-key
        chmod -v 0711 openpgpkey/kushaldas.in
        chmod -v 0711 openpgpkey/kushaldas.in/hu
        chmod -v 0644 openpgpkey/kushaldas.in/hu/*
        touch openpgpkey/kushaldas.in/policy

        ln -s kushaldas.in/hu openpgpkey/
        ln -s kushaldas.in/policy openpgpkey/

install: update
        rsync -Pravz --delete ./openpgpkey root@kushaldas.in:/usr/local/www/kushaldas.in/.well-known/

.PHONY: all update install

The above Makefile uses the gpg-wks-client executable and also pushes the changes to the right directory on the server.

Email providers like protonmail already allow users to publish similar information. I hope this small Makefile will help you to set up your domain.

Kushal Das: Using signify tool for sign and verification

$
0
0

We generally use GnuPG to sign and verify files on our systems. There are other tools available to do so; some are written particularly for this purpose. signify is one such tool from the OpenBSD land.

How to install signify?

pkg install signify

I used the above command to install the tool on my FreeBSD system, and you can install it on your Debian system too, where the tool is called signify-openbsd, as Debian already has another tool with the same name. signify is yet to be packaged for Fedora; if you are a Fedora packager, you may want to package this one for all of us.

Creating a public/private key pair

signify -G -s atest.sec -p atest.pub -c "Test key for blog post"

The command will also ask for a password for the secret key. -c allows us to add a comment in our key files. The following is the content of the public keyfile.

untrusted comment: Test key for blog post public key 
RWRjWJ28QRKKQCXxYPqwbnOqgsLYQSwvqfa2WDpp0dRDQX2Ht6Xl4Vz4

As it is very small in size, you can even create a QR code for the same.

Signing a file

In our demo directory, we have a hello.txt file, and we can use the newly generated key to create a signature.

signify -S -s atest.sec -m hello.txt

This will create a hello.txt.sig file as the signature.

Verifying the signature

$ signify -V -p atest.pub -m hello.txt
Signature Verified

This assumes the signature file is in the same directory. You can find the OpenBSD signature files under /usr/local/etc/signify (or in /etc/signify/ if you are on Debian).

To know more about the tool, read this paper.

Kushal Das: Setting up authorized v3 Onion services

$
0
0

Just like v2 Onion services, we can also set up client authorization for v3 Onion services. In simple terms, when you have client authorization set up on an Onion service, only the Tor clients with the private token can access the service. Using this, you can run services (without opening up any port on your system) that only selected people can access, all inside the fully encrypted Tor network. Last month, I did a workshop at Rootconf on the same topic, but I demoed v2 Onion services. In this blog post, I am going to show you how you can do the same with the latest v3 services.

Setting up the Onion service

We assume that we are already running nginx or apache on port 80 of the server. Add the following two lines at the end of the /etc/tor/torrc file of your server.

HiddenServiceDir /var/lib/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:80

Then, restart the tor service.

systemctl restart tor

The above command will create the Onion service in the /var/lib/tor/hidden_service/ directory, and we can see the address in the hostname file.

cat /var/lib/tor/hidden_service/hostname 
cz2eqjwrned6s7zy3nrmkk3fjoudzhvu53ynq6gdny5efdj26zxf4bid.onion

It should also create an authorized_clients directory inside the service directory.

Next, we will create keys of type x25519; there are several implementations, in different languages, that you can use to create the keys.

I used the Rust implementation, and I got the secret and the public key.

secret: "TIICFSKY2PECECM2LOA7XLKQKJWHYTN4WLRSIIJKQFCCL3K2II2Q"
public: "RO7N45JLVI5UXOLALOK4V22JLMMF5ZDC2W6DXVKIAU3C7FNIVROQ"

Now, we will use the public key to create a clientname.auth file in the /var/lib/tor/hidden_service/authorized_clients/ directory; I chose the name kushal.auth.

echo "descriptor:x25519:RO7N45JLVI5UXOLALOK4V22JLMMF5ZDC2W6DXVKIAU3C7FNIVROQ" > /var/lib/tor/hidden_service/authorized_clients/kushal.auth

If you look closely, the file format is like below:

descriptor:x25519:public_key

Now, restart the tor service once again on the server.

systemctl restart tor

Setting up client authorization

The first step is to close my Tor Browser, as I will be manually editing its torrc file. Then, I added the following line to tor-browser_en-US/Browser/TorBrowser/Data/Tor/torrc.

ClientOnionAuthDir TorBrowser/Data/Tor/onion_auth

Next, we will create the directory.

mkdir tor-browser_en-US/Browser/TorBrowser/Data/Tor/onion_auth
chmod 0700 tor-browser_en-US/Browser/TorBrowser/Data/Tor/onion_auth

Then, add the following to a kushal.auth_private file inside the onion_auth directory.

cz2eqjwrned6s7zy3nrmkk3fjoudzhvu53ynq6gdny5efdj26zxf4bid:descriptor:x25519:TIICFSKY2PECECM2LOA7XLKQKJWHYTN4WLRSIIJKQFCCL3K2II2Q

The format of the file:

onion_address_56_chars:descriptor:x25519:private_key

Now, start the Tor Browser, and you should be able to visit the authorized Onion service at cz2eqjwrned6s7zy3nrmkk3fjoudzhvu53ynq6gdny5efdj26zxf4bid.onion.

Use case for students

If you want to demo your web project to a selected group of people but don't want to spend money on a web server or VPS, Onion services are a great way to showcase your work to the world. With authenticated services, you can choose exactly who can view the site or service you are running.

Kushal Das: Using sops with Ansible for vars encryption


Sops is a secrets management tool from Mozilla. According to the official GitHub page, it is defined as:

sops is an editor of encrypted files that supports YAML, JSON, ENV, INI and BINARY formats and encrypts with AWS KMS, GCP KMS, Azure Key Vault and PGP.

In this blog post, I am going to show you how you can use it along with GPG to encrypt secrets in YAML files for your Ansible roles/playbooks.

Installation

Download the official binary from the release page, or you can build and install from the source code.

Basic usage (manual)

First, let us create a .sops.yaml file in the git repository root directory. In this example, I am saying that any .yml file can be encrypted using the two given GPG fingerprints.

creation_rules:
  # If vars file matches "*.yml", then encrypt it more permissively.
    - path_regex: \.yml$
      pgp: "A85FF376759C994A8A1168D8D8219C8C43F6C5E1,2871635BE3B4E5C04F02B848C353BFE051D06C33"

Now to encrypt a file in place, I can use the following command:

sops -e -i host_vars/stg.dgplug.org.yml

If we open the file afterward, we will see something like the following:

mysql_root_password: ENC[AES256_GCM,data:732TA7ps+qE=,iv:3azuZg4tqsLfe5IHDLJDKSUHmVk2c0g1Nl+oIcKOXRw=,tag:yD8iwmxmENww+waTs5Kzxw==,type:str]
mysql_user_password: ENC[AES256_GCM,data:YXBpO54=,iv:fQYATEWw4pv4lW5Ht8xiaBCliat8xdje5qdmb0Sff4Y=,tag:cncwg2Ops35l0lWegCSEJQ==,type:str]
mysql_user: ENC[AES256_GCM,data:cq/VgDlpRBxuHKM+cw==,iv:K+v6fkCIucMrMJ7pDRxFS/aHh0OCxqUcLJhZIgCsfA0=,tag:BD7l662OVOWRaHi2Rtw37g==,type:str]
mysql_db_name: ENC[AES256_GCM,data:hCgrKmU=,iv:/jnypeWdqUbIRy75q7OIODgZnaDpR3oTa0G/L8MRiZA=,tag:0k6cGNoDajUuKpKzrwQBaw==,type:str]
sops:
    kms: []
    gcp_kms: []
    azure_kv: []
    lastmodified: '2019-07-29T04:05:09Z'
    mac: ENC[AES256_GCM,data:qp9yV3qj0tYf/EaO0Q3JdlpPvm5WY4ev1zGCVNoo+Anm/esj0WHlR7M7SNg54xRTUwMhRRnirx7IsEC8EZW1lE+8DObnskemcXm93CJOBfVzQOX/RvCSR4rMp2FgBEPZADCDiba1X2K/9myU96lADME0nkdmX9YjhOLMFJ6ll4o=,iv:2WNKhl81FI/qw6mRKpK5wRYjqK16q1ASeCJYpEeuhj4=,tag:v4PlGT4ZmPUxj7aYIendVg==,type:str]
    pgp:
    -   created_at: '2019-07-29T04:05:06Z'
        enc: |-
            -----BEGIN PGP MESSAGE-----


            wcFMA/uCql0ybaddARAADkHqV/mEgVoTxVHkRKW3mjajReKaQ5Mz/VwMal3GsjKr
            8aGnghCjtgJE01wCBjrTfNKKmlf5JjMSFufy9pG0xE+OFOXt+pnJFDSro26QnPFG
            VlvkvnjTxw4gV/mIvxUXTT6rmijvQSHMXdyEGmyHu3kNprKuuN37xZ4SYSWg7pdO
            vee4DSOaw+XfdgYUF+YSEjKMZ+qevRtzJL4hg9SZvEsHMQObueMBywAc5pyR6LvS
            ZuAS5SS+cPXnyMJLemqfU7L+XMw2NMrmNYFXOWnMMny1Hez5fcmebJp+wjDqWJTX
            j9vJvXLuNAglFvL1yz2rNJYTb0mC9chLT7YxEa9Z9JHWezUB8ZYuOC6vRf18/Hz8
            e6Gd2sncl4rleCxvKZF9qECsmFzs4p7H5M+O31jnjWdnPBD8a84Du3wdeWiI5cRF
            d7/aUXEdGJQy7OVbzGE1alDOSIyDD2S73ou2du7s/79Wb11RwQV898OyGgmjWm0Y
            7hTsBiBXnayQdjtg6qKlvoWIn79PU0YmDYLujMiXDQPJLV3ta82dcK2B1yTCLuSO
            nGfFzNOSNemmniyEuMI16SrfYDsf9l/K3gec+TRNvEdc1GqO4gFblQWptPE7KIIC
            oBVMDLUJSpOf5yF7Izedy5wBb8ZmzJAvpchvTMUls2+En4ifYh90cauXxiP6edPS
            4AHkZMebg44aCefn4zMdKdUhbOFxbuA/4I3hQWLgmuJMFlcI4Orl5tsRjXwCfQ9S
            jOdGAUJ8usV9gT4IXj73WfN1JJHj7DTgoORXFtDMs2Yy/rPPR4H9msSL4reJGZLh
            dEAA
            =LoMY
            -----END PGP MESSAGE-----
        fp: A85FF376759C994A8A1168D8D8219C8C43F6C5E1
    -   created_at: '2019-07-29T04:05:06Z'
        enc: |-
            -----BEGIN PGP MESSAGE-----


            wcFMA0HZfc7pf7hDARAAX6xF6qP9KQJ5wLn0kv5WVf8HgOkk+2ziIuBH411hVEir
            oN4bwwnL+DEYZm2aFvZl5zQElua7nGkQK041DecPbOCCBqThv3QKVMy2jScG5tTj
            jxGgh8W/KxdwIY7teRaCRNDkT18IhtBc4SO2smJEtPeGIptvDHLjETBFbDnZeo6/
            PG4QIA1Rfvm14n1NR56oibWduwvb1wrm4tGYPDx8mVgfugxeIhoqlZ87VrivrN+2
            40S/rwWQ/aEVM1A8q19DGkXYVBcxQA1dGBrKLDPtiCq/LCSDNY4iuzMjzQuHPjgM
            bS0lWv8fvSp6iIZlB3eSRPW9Ia8tRJpEfTLS8jiHlcCZ4Vy3fW6EijBf00iSy5hP
            Y54TCERscXt1/UOW2ACYTNhMfuiO+WuG6Vjdns1NsSUVazqxmf+kBMwl06/HyUwL
            KAYTOB2hipYUm6mlpSBfDgBKjQq8dgvxlWP0Ay0of0p4ZzFv7MepYkJA+gwb0hUt
            rui9GVE/uys8W8buYqfM2ABzIG4GrH3rELh8eW8oPwlu7rgS7YGhyog2xabJyQnj
            BZ65wwu5TzMq+n5v1+878teOzqqD/F+6X5dw0jF95bDHKdA64JR/Uxlj75sp59GH
            e/+3UDm0UwSILDrYJkcVaTnrt3wPjQAw4ynKZuN+k6KvmDCkGquHNaM+2hqvWq/S
            4AHkgpZHzmU14QmfJy7RN2HPduHhSeCh4LrhMzngJuJyH72G4IPlP4WwPJUhrKMt
            UI+Azl61rIZEm6n82oWbY1gIrrIygWLg1OSD/0Ly6KbO4/W1NipWU53w4nAo7abh
            3gkA
            =YXu1
            -----END PGP MESSAGE-----
        fp: 2871635BE3B4E5C04F02B848C353BFE051D06C33
    unencrypted_suffix: _unencrypted
    version: 3.1.1


If you look closely, you can see that sops encrypts only the values, not the keys, in the YAML file.

Decrypting as required at runtime

We can use a small Python script to enable runtime decryption of the files as required in Ansible. Create a vars_plugins directory in the git repository root, and then put the following code in there as sops.py.

from __future__ import (absolute_import, division, print_function)
__metaclass__ = type


DOCUMENTATION = '''
    vars: sops
    version_added: "N/A"
    short_description: In charge of loading SOPS-encrypted vars
    description:
        - Loads SOPS-encrypted YAML vars into corresponding groups/hosts in group_vars/ and host_vars/ directories.
        - Only SOPS-encrypted vars files, with a top-level "sops" key, will be loaded.
        - Extends host/group vars logic from Ansible core.
    notes:
        - SOPS binary must be on path (missing will raise exception).
        - Only supports YAML vars files (JSON files will raise exception).
        - Only host/group vars are supported, other files will not be parsed.
    options: []
'''


import os
import subprocess
import yaml
from ansible.errors import AnsibleError, AnsibleParserError
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.inventory.host import Host
from ansible.inventory.group import Group
from ansible.utils.vars import combine_vars


FOUND = {}


# Import host_group_vars logic for file-walking functions.
# We'll still need to copy/paste and modify the `get_vars` function
# and edit below to insert a call to sops cli.
from ansible.plugins.vars.host_group_vars import VarsModule as HostGroupVarsModule


# All SOPS-encrypted vars files will have a top-level key called "sops".
# In order to determine whether a file is SOPS-encrypted, let's inspect
# such a key if it is found, and expect the following subkeys.
SOPS_EXPECTED_SUBKEYS = [
    "lastmodified",
    "mac",
    "version",
]



class AnsibleSopsError(AnsibleError):
    pass



class VarsModule(HostGroupVarsModule):


    def get_vars(self, loader, path, entities, cache=True):
        """
        Parses the inventory file and assembles host/group vars.


        Lifted verbatim from ansible.plugins.vars.host_group_vars, with a single
        in-line edit to support calling out to the SOPS CLI for decryption.
        Only SOPS-encrypted files will be handled.
        """


        if not isinstance(entities, list):
            entities = [entities]


        super(VarsModule, self).get_vars(loader, path, entities)


        data = {}
        for entity in entities:
            if isinstance(entity, Host):
                subdir = 'host_vars'
            elif isinstance(entity, Group):
                subdir = 'group_vars'
            else:
                raise AnsibleParserError("Supplied entity must be Host or Group, got %s instead" % (type(entity)))


            # avoid 'chroot' type inventory hostnames /path/to/chroot
            if not entity.name.startswith(os.path.sep):
                try:
                    found_files = []
                    # load vars
                    b_opath = os.path.realpath(to_bytes(os.path.join(self._basedir, subdir)))
                    opath = to_text(b_opath)
                    key = '%s.%s' % (entity.name, opath)
                    if cache and key in FOUND:
                        found_files = FOUND[key]
                    else:
                        # no need to do much if path does not exist for basedir
                        if os.path.exists(b_opath):
                            if os.path.isdir(b_opath):
                                self._display.debug("\tprocessing dir %s" % opath)
                                found_files = loader.find_vars_files(opath, entity.name)
                                FOUND[key] = found_files
                            else:
                                self._display.warning("Found %s that is not a directory, skipping: %s" % (subdir, opath))


                    for found in found_files:
                        # BEGIN SOPS-specific logic
                        if self._is_encrypted_sops_file(found):
                            new_data = self._decrypt_sops_file(found)


                            if new_data:  # ignore empty files
                                data = combine_vars(data, new_data)
                        # END SOPS-specific logic


                except Exception as e:
                    raise AnsibleParserError(to_native(e))
        return data


    def _is_encrypted_sops_file(self, path):
        """
        Check whether given filename is likely a SOPS-encrypted vars file.
        Determined by presence of top-level 'sops' key in vars file.


        Assumes file is YAML. Does not support JSON files.
        """
        is_sops_file_result = False
        with open(path, 'r') as f:
            y = yaml.safe_load(f)
            if type(y) == dict:
                # All SOPS-encrypted vars files will have top-level "sops" key.
                if 'sops' in y.keys() and type(y['sops']) == dict:
                    if all(k in y['sops'].keys() for k in SOPS_EXPECTED_SUBKEYS):
                        is_sops_file_result = True
            return is_sops_file_result


    def _decrypt_sops_file(self, path):
        """
        Shells out to `sops` binary and reads decrypted vars from stdout.
        Passes back dict to vars loader.


        Assumes that a file is a valid SOPS-encrypted file. Use function
        `is_encrypted_sops_file` to check.


        Assumes file is YAML. Does not support JSON files.
        """
        cmd = ["sops", "--input-type", "yaml", "--decrypt", path]
        real_yaml = None
        try:
            decrypted_yaml = subprocess.check_output(cmd)
        except OSError:
            msg = "Failed to call SOPS to decrypt file at {}".format(path)
            msg += ", ensure sops is installed in PATH."
            raise AnsibleSopsError(msg)
        except subprocess.CalledProcessError:
            msg = "Failed to decrypt SOPS file at {}".format(path)
            raise AnsibleSopsError(msg)
        try:
            real_yaml = yaml.safe_load(decrypted_yaml)
        except yaml.parser.ParserError:
            msg = "Failed to parse YAML from decrypted SOPS file at {},".format(path)
            msg += " confirm file is YAML format."
            raise AnsibleSopsError(msg)
        return real_yaml


From the next time you try to use any of the encrypted vars files in an Ansible run, it will ask for the GPG passphrase to decrypt the file as required.

Thank you Conor Schaefer for the original version of the Python script.


Kushal Das: Adding directory to path in csh on FreeBSD


While I was trying to install Rust on a FreeBSD box, I figured that I would have to add the ~/.cargo/bin directory to the system path. I added the following line to the ~/.cshrc file to do that.

set path = ( $path /home/kdas/.cargo/bin)

I have yet to learn much about csh, but I can count this as a start.

TechBeamers Python: Append Vs. Extend in Python List


In this tutorial, you’ll explore the difference between the append and extend methods of a Python list. Both methods manipulate lists, but in their own specific way. The append method adds a single item (or a whole sequence as one item) at the tail of a list. On the other hand, the extend method appends each of the input elements to the end of the original list. After reading the above description of append() and extend(), it may still seem a bit confusing, so we’ll explain each of these methods with examples and show the difference between them.
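Here is a quick illustration of the difference described above:

nums = [1, 2, 3]
nums.append(4)          # a single item goes on the end
print(nums)             # [1, 2, 3, 4]

nums.append([5, 6])     # a sequence is added as ONE element
print(nums)             # [1, 2, 3, 4, [5, 6]]

more = [1, 2, 3]
more.extend([4, 5])     # each input element is added individually
print(more)             # [1, 2, 3, 4, 5]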

The post Append Vs. Extend in Python List appeared first on Learn Programming and Software Testing.

Continuum Analytics Blog: Accessing Remote Data with a Generalized File System

ListenData: Python : Learn Object Oriented Programming in 3 Hours

This tutorial outlines object-oriented programming (OOP) in Python with examples. It is a step-by-step guide designed for people who have no programming experience. Object-oriented programming is also available in other popular programming languages besides Python, such as Java, C++ and PHP.

What is Object Oriented Programming?

In object-oriented programming (OOP), you have the flexibility to represent real-world objects like a car, an animal, a person or an ATM in your code. In simple words, an object is something that possesses some characteristics and can perform certain functions. For example, a car is an object that can perform functions like start, stop, drive and brake. These are the functions of a car. Its characteristics are the color of the car, mileage, maximum speed, model year etc.

In the above example, the car is an object. In the OOP world, functions are called methods and characteristics are called attributes (properties). Technically, attributes are variables or values related to the state of the object, whereas methods are functions which have an effect on the attributes of the object.

In Python, everything is an object. Strings, integers, floats, lists, dictionaries, functions, modules etc. are all objects.
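You can check this yourself in an interpreter; even integers and functions carry methods and attributes:

# Built-in values are objects too.
n = 5
print(type(n))           # <class 'int'>
print(n.bit_length())    # 3, a method on an int object

def greet():
    return "hello"

print(greet.__name__)    # greet; functions are objects as well
print(isinstance(greet, object))  # True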

Do Data Scientists Use Object Oriented Programming?

It's one of the most common questions data scientists have before learning OOP. When it comes to data manipulation and machine learning using Python, it is generally advised to study the pandas, NumPy, matplotlib and scikit-learn libraries. These libraries were written by experienced Python developers to automate or simplify most of the tasks related to data science. All these libraries depend on OOP and its concepts. For example, when you build a regression model using the scikit-learn library, you first have to declare your model as an object and then you use its fit method. Without knowing the fundamentals of OOP, you would not be able to understand why you write the code in this manner.
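A minimal sketch of that object-then-fit pattern, assuming scikit-learn is installed (the data below is made up purely for illustration):

from sklearn.linear_model import LinearRegression

X = [[1], [2], [3], [4]]    # toy feature matrix
y = [2, 4, 6, 8]            # toy target values

model = LinearRegression()   # the model is an object (an instance of a class)
model.fit(X, y)              # fit() is a method acting on that object
print(model.predict([[5]]))  # [10.]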

In Python, there are mainly three programming styles: object-oriented programming, functional programming and procedural programming. In simple words, there are three different ways to solve a problem in Python. Functional programming is most popular among data scientists as it offers performance advantages. OOP is useful when you work with large codebases and code maintainability is very important.

Conclusion: It's good to learn the fundamentals of OOP so that you understand what's going on behind the libraries you use. If you aim to be a great Python developer and want to build a Python library, you must learn OOP. At the same time, there are many data scientists who are unaware of OOP concepts and still excel in their jobs.

Basics : OOP in Python

In this section, we will see concepts related to OOP in Python in detail.

Object and Class

A class is the blueprint (architecture) of an object: a proper description of the attributes and methods of the object. For example, the design of a particular type of car is a class. You can create many objects from one class, just like you can build many cars of the same type from one car design.


There are many real-world examples of classes as explained below -

  • Recipe of Omelette is a class. Omelette is an object.
  • Bank Account Holder is a class. Attributes are First Name, Last Name, Date of Birth, Profession, Address etc. Methods can be "Change of address", "Change of profession", "Change of last name" etc. "Change of last name" is generally applicable to women when they change their last name after marriage.
  • Dog is a class. Attributes are Breed, Number of legs, Size, Age, Color etc. Methods can be Eat, Sleep, Sit, Bark, Run etc.

In Python, we can create a class using the keyword class. A method of a class is defined with the keyword def. It is similar to a normal function, but it is defined within a class and is a function of the class. The first parameter in the definition of a method is always self, and the method is called without explicitly passing self.
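A minimal example of that syntax, using the car discussed earlier:

class Car:
    def __init__(self, color, mileage):
        # Attributes describe the state of the object.
        self.color = color
        self.mileage = mileage

    def start(self):
        # A method: a function defined inside the class.
        # Note that the caller does not pass self explicitly.
        return "The {} car has started.".format(self.color)

my_car = Car("red", 35000)  # create an object from the class
print(my_car.color)         # red
print(my_car.start())       # The red car has started.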


Mike Driscoll: PyDev of the Week: Raphael Pierzina


This week we welcome Raphael Pierzina (@hackebrot) as our PyDev of the Week! Raphael is a core developer of pytest, a popular testing framework for Python. You can learn more about Raphael by visiting his blog or checking out his GitHub profile. Let’s take a few moments to get to know Raphael!

Can you tell us a little about yourself (hobbies, education, etc.)?

My background is in 3D visualization and animation. After graduating from university with a Bachelor of Arts in Design, I worked as a software developer for a visual effects company for a few years and built applications for digital artists.

Fast forward to today, after having worked at a few other software companies, I’m now at Mozilla where I work on Firefox Telemetry. I manage projects to reduce Telemetry related blind-spots in our Firefox browser products and support our Software Engineers and Data Engineers in increasing the automated test coverage for the Firefox Telemetry component and our Firefox Data Platform. I wrote about my first year at Mozilla on my blog earlier this year in February, if you’d like to find out more about my work.

For fun, I like to run fast, read books, and enjoy the outdoors. 🏔

Raphael Pierzina

Why did you start using Python?

Back when I worked in VFX, my team developed plugins for several 3D computer graphic applications in whatever scripting language these programs supported:

  • MaxScript in 3ds Max
  • MEL in Maya
  • TCL in Nuke
  • ZScript in ZBrush
  • C# in Unity

We often had to develop similar features for the different programs in the respective languages, which was not only tedious but also felt really unnecessary.

When I first learned about PyPI and the many awesome frameworks, libraries, and CLI apps that the Python community created and published under open-source licenses, I immediately fell in love with Python and started to look for ways to get involved and contribute back to Python projects that seemed welcoming to newcomers, like for example cookiecutter. 🍪

What other programming languages do you know and which is your favorite?

Aside from the scripting languages that I’ve mentioned earlier, I learned C++ and Java at university, but I wouldn’t say I know those as I haven’t used them in years. I’ve done a fair bit in Go for a previous job and for personal projects, but Python is definitely what I feel most proficient in. I recently started learning Rust and really like it so far.

While I don’t always enjoy coding in Python (I’ve worked on adding Python 3 support to way too many projects at this point and still support Python 2 in the majority of the projects that I maintain), Python is still my favorite programming language!

Through my involvement in several open-source Python projects, from attending and speaking at Python conferences and meetups, and interactions on Twitter, I have made a lot of friends in the Python community. If you see me at EuroPython or PyCon DE this year, please say hi!

What projects are you working on now?

I currently work on open-source projects at Mozilla as well as cookiecutter, pytest, and a number of smaller projects like cookiecutter-pytest-plugin, pytest-cookies, pytest-md, and labels.

My priorities have changed over the past year or two and I now focus on mentoring and teaching through speaking at conferences and meetups, writing on my blog and posting on my twitter.

Which Python libraries are your favorite (core or 3rd party)?

I’m a big fan of attrs, click, and pytest. 🚀

What lessons have you learned as a maintainer?

Setting expectations and being able to say “No” is really important. Allow yourself to take breaks or even walk away from your projects. Take care of yourself!

I gave a talk about the challenges of maintaining a popular open-source project at EuroPython in 2018. While this talk might not have quite as many views on YouTube as some of my other talks, it was very important to me to share what I’ve learned from maintaining cookiecutter for several years and I hope it helps folks who find themselves in a similar position.

Thanks for doing the interview, Raphael!

The post PyDev of the Week: Raphael Pierzina appeared first on The Mouse Vs. The Python.
