
The difference between an aggravating CLI tool and a great one can often be made by a few simple changes.

I have built many CLI tools over the years, including pipx, which has nearly half a million downloads. Here I try to capture the important things that make a CLI tool pleasant to use. These are conventions and expectations I’ve come to adopt. Without further ado, here is the checklist.

Use an argument parsing library

Don’t try to manually parse command line arguments. Use a library. Libraries will also autogenerate help text.
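For example, here is a minimal sketch using Python’s standard-library argparse; the program name and options are illustrative, not from a real tool:

# Minimal sketch of argument parsing with Python's standard-library argparse.
# The program name and flags below are made up for illustration.
import argparse

def main() -> None:
    parser = argparse.ArgumentParser(prog="mytool", description="Do something useful.")
    parser.add_argument("path", help="file to operate on")
    parser.add_argument("--verbose", action="store_true", help="print extra output")
    args = parser.parse_args()  # also handles --help and rejects unknown flags
    if args.verbose:
        print(f"processing {args.path}")

if __name__ == "__main__":
    main()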

Set the Exit Code Correctly

On POSIX systems the standard exit code is 0 for success and 1 to 255 for anything else. People calling your tool will want to script around its success or failure, and the exit code of the last command can be checked in the shell with $?. I generally just use 1, but depending on your requirements you may give special meaning to other exit code values. …
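As a sketch in Python, returning the right exit code is a one-liner with sys.exit; the do_work helper here is a stand-in for your tool’s real logic:

# Minimal sketch: exit 0 on success, 1 on failure, so callers can script
# around the result (e.g. `mytool && echo ok` in a shell).
import sys

def do_work() -> bool:
    """Placeholder for the tool's actual logic; returns True on success."""
    return True

def main() -> int:
    return 0 if do_work() else 1

if __name__ == "__main__":
    sys.exit(main())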


So clean!

Unless you enjoy sudo changing permissions on files in /usr and searching for instructions on how to uninstall things cleanly (check out these answers on Stack Overflow), you might want to change up how you install Homebrew on your Mac.

This post outlines a clean way to install brew and programs with brew.

Install

Instead of running curl and piping a shell script, we’re going to clone brew itself and run it directly. Don’t worry, you don’t have to compile anything; just clone it.

>> git clone https://github.com/Homebrew/brew ~/git/brew

You can now run brew directly at ~/git/brew/bin/brew, but that’s not so convenient, so let’s see what else we can do to improve from here. …


If you’re a Python developer, you’ve likely heard of Virtual Environments. A Virtual Environment is “a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.”

Why are they so popular? Well, they solve a problem: no longer are packages installed into a mishmash of global site-packages. Instead, each project can install precise dependency versions into its own “virtual” environment.
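As a quick sketch, creating one takes a single call to the standard-library venv module (the .venv directory name is just a common convention):

# Minimal sketch: create a per-project virtual environment with the
# standard-library venv module. Equivalent to running `python -m venv .venv`.
import venv

venv.create(".venv", with_pip=True)
# .venv/ now contains its own interpreter and site-packages directory,
# isolated from the global installation.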

However, they introduce some problems as well:

  • Learning curve: explaining “virtual environments” to people who just want to jump in and code is not always easy
  • Terminal isolation: Virtual Environments are activated and deactivated on a per-terminal…


Myth 1: “If I don’t use Pipenv or feel like it improves my workflow, I’m doing it wrong!”

Pipenv is designed for a very specific use-case: application dependency management and associated workflows. In other words, it’s a fancy replacement for requirements.txt that automates virtual environment management. Even if your use-case matches what it’s designed for, nobody is saying you must use it, or looking down on you if you don’t. In fact, core Python developers and the Pipenv team have workflows that don’t involve Pipenv. [1][2]

Myth 2: “Pipenv is the officially recommended Python packaging tool from Python.org”

This myth comes from older versions of Pipenv’s documentation. The claim is misleading because it sounds like a recommendation from the core Python team but is actually referring to a recommendation from the Python Packaging Authority (PyPA), a volunteer organization separate from core Python with no qualifications for membership [1]. Pipenv now more accurately markets itself as a tool for application dependency management, not for all Python packaging. Pipenv is still recommended in the PyPA’s documentation (which is slightly controversial) among other tools, but it’s not the packaging tool for all of Python. …


PEP 517 and 518 are related and their combined changes are as follows.

A new standardized file named pyproject.toml will be read by pip for a section called [build-system]. This file specifies two things:

  1. How to build a package (“wheel”) from source code (“sdist”) (PEP 517)
  2. Which packages need to be installed before trying to build. These are the build requirements. (PEP 518)

Here is an example of what pyproject.toml looks like:

[build-system]
requires = ["requests"] # PEP 518 - what is required to build
build-backend = "tool.module:myfunction" # PEP 517 - what function to call to build

Implications

This is a big deal because previously pip was literally hardcoded to run python setup.py to build packages when installing them from source. PEP 517 and 518 break that convention and let pyproject.toml tell pip how to build wheels and install them.
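To make the build-backend line concrete: the string names an importable object that provides the hooks defined by PEP 517, which pip calls instead of python setup.py. Here is a minimal, hypothetical sketch of such a backend; only the hook names and signatures come from PEP 517, everything else is made up:

# Hypothetical PEP 517 build backend module (what "tool.module" in the example
# above might point at). Real backends such as setuptools.build_meta implement
# these same hooks.

def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    """Build a .whl into wheel_directory and return its filename."""
    ...  # the actual build work would happen here
    return "example-0.1.0-py3-none-any.whl"

def build_sdist(sdist_directory, config_settings=None):
    """Build a source distribution into sdist_directory and return its filename."""
    ...
    return "example-0.1.0.tar.gz"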



SSH is awesome. It lets you securely connect to remote computers and work on them as if they were local. It’s so common it’s even a verb:

“ssh into the server and restart it”

But it has some downsides:

  • you need to reconnect if your computer goes to sleep or you switch internet connections
  • each terminal needs to maintain its own connection
  • it’s slow to respond to high bandwidth commands on slow connections

Something that solved these problems would be better than SSH, and in fact some tools that augment SSH do exist. …

About

Chad Smith

Software engineer and open source developer https://github.com/cs01, https://twitter.com/grassfedcode
