
The DevTools that Changed my Life

source link: https://www.tuicool.com/articles/hit/buamMfy


The most impactful developer tools I’ve come across in my career


Every once in a while you encounter a tool that upgrades your abilities with superpowers and makes you forget life before it. When that happens your mind is blown, as something hard is suddenly made easy. This is how I felt when I first encountered dynamic typing and async operations in JavaScript.

In this post I’ll share some of the most life-changing tools I’ve come across in the past decade, during which I was fortunate enough to work on some amazing development teams, ranging from small startups to world-class enterprises.

Most of these tools are terminal-based, stemming from the notion that the terminal is one of the most effective ways to communicate with the machine. This will not be a full-blown guide to each tool, but rather a thoughtful review of how and why these tools changed the way I work from the moment I first discovered them.

Recently I chose to join Bit, a brilliant tool which I believe holds the power to change the lives of many developers. Bit changes the way we build software by making it easier to share and build with smaller components. These are the tools I use to super-charge Bit, and where I draw my inspiration from.

The king of composition | the pipe

At the core of a Unix-like operating system is the notion that the tools we use should be very simple: mono-task tools, composed with pipes.

For example, pipe find to wc -l and you get find . | wc -l, which counts how many files there are. A pipe means that the output of one command becomes the input of another. In short, it is piped.

Some other useful operators in that context are:

1. |& will pipe standard error along with standard output.

2. > will redirect output to a file.

3. < will read input from a file.

4. xargs will turn the standard output of one command into the command-line arguments of the next.
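A quick sketch of these operators in action (the file name notes.txt is just an example):

```shell
# > writes stdout to a file; create a small example file to play with
printf 'alpha\nbeta\ngamma\n' > notes.txt

# < feeds a file into a command's stdin; wc -l counts lines
wc -l < notes.txt

# xargs turns stdout into arguments: wc receives "notes.txt" as an argument
printf 'notes.txt\n' | xargs wc -l

# |& pipes stderr along with stdout (bash); here wc counts the error line
ls no-such-file |& wc -l
```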


So when I run lsof -i :3307 | xargs kill -9, you can understand (assuming you know lsof and kill) that I’m trying to kill the process which occupies port 3307. You can also tell that kill takes arguments and not a stream just by looking at the glue parts (| and xargs).

The best thing to note is that these operators compose, or chain. They were built with this intent in mind, so you can go on from there to more complex commands and scripts. For me this is Unix in a nutshell: your interface becomes a programming language.

This flow allows you to create new functions in your terminal which were never thought of by the original creators. The concept of pipelines runs deeper: the ability to compose very simple (read: pure) tools (read: functions) might remind you of something else you’ve heard of, pure functions.

In the same way that it is (or should be) easy to reason about pure functions and how they compose, hopefully it is easy to understand the new meaning you are creating on the command line (the Linux philosophy).

jq - process JSON

When I call the xkcd API with curl, I do this:

curl -X GET https://xkcd.com/614/info.0.json

and the output is bad. Really bad.

When I pipe the same command to jq, it looks like this:

curl -X GET https://xkcd.com/614/info.0.json | jq .

The difference between these results is shown here in gists, but it mirrors the experience in the terminal. I can test my API and actually enjoy the reading part. I can reorder, change, and format JSON output (from a file or the network) which would otherwise be hard to read. As JSON is such a common medium of information, this tool makes it readable.
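jq goes beyond pretty-printing: it is a small query language. A sketch, using a hand-written JSON string shaped roughly like the xkcd response (the field values here are illustrative, and jq must be installed):

```shell
# Pretty-print any JSON, not just curl output
echo '{"num":614,"title":"Woodpecker","alt":"..."}' | jq .

# Pull out a single field; -r prints the raw string without quotes
echo '{"num":614,"title":"Woodpecker"}' | jq -r .title

# ...or reshape the object entirely
echo '{"num":614,"title":"Woodpecker"}' | jq '{id: .num}'
```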

The Docker revolution

Being the building block of almost any modern cloud system today, Docker is nothing short of amazing. Compared to jq and |, which are extremely lightweight, this is a spaceship.

Docker is a different beast altogether. It introduced immutable applications and lightweight containerization. For those who have never dived into it: you get the same environment every time the application runs.

When you don’t need the application instance (read: container), you dump it and create a new one. Other maintainers can rather quickly decipher what your server configuration looks like from the Dockerfile.

You can go beyond your application’s Dockerfile and start incorporating the Docker way of thinking into your day-to-day work. This is when things click. Here are a few examples:


1. Run new products you want to test inside Docker: a DB machine, a pub-sub broker, that crazy piece of code.

2. Solve Python environment problems with Docker by mounting your dev folder as a volume, with the dependencies already installed via pip.

3. Introduce external dependencies for your microservices via docker-compose before running tests.
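As a sketch of point 3, a hypothetical docker-compose.yml that brings up a database and a cache as external dependencies for a test run (the service names, image tags, and credentials here are illustrative, not from any particular project):

```yaml
# docker-compose.yml - external dependencies for local test runs (illustrative)
version: "3.8"
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: test   # throwaway credentials, for tests only
    ports:
      - "5432:5432"
  cache:
    image: redis:7
    ports:
      - "6379:6379"
```

Then docker-compose up -d before the test suite and docker-compose down after it, and every run starts from the same clean environment.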

As you explore more server products, it makes sense to combine them with Docker, and for the most part you will discover new ways to employ it to your benefit. When I witnessed this for the first time, it gave me a whole new perspective on the Docker way.

Yeah, compared to the other tools here, which are mostly hammers and screwdrivers, Docker is very different and might be better compared to a robotic crane. While you use it through the same interface (the terminal), the results are amazing.

Static one-line HTTP server

python -m SimpleHTTPServer 8000. Small and useful. This runs a static web server which serves files from your working directory, and it’s one line away. Python comes pre-installed on most Linux/Mac systems. For Python 3, use python -m http.server 8000.

Memorize one and stick with it.

Text manipulation and search

Our commands usually output text, and over time that text teaches us its structure. Here are a few ways to take some text and search through it.

grep: cat [file-name] | grep [search-term]. cat will output the file to the screen, and grep will filter to the lines which contain the expression.

tail: tail -f [log-file]. I have an application which spits out logs; I want to see what it prints at runtime, and I don’t care much about its history. tail will print the end of the file to give some context, and the -f flag will keep listening to the log stream as the application runs.

head: head [file-name]. When a server boots, it spits out some information about its load, and when I cat the log file I get a huge dump of everything which happened in that time frame. head allows you to take a peek at the top of the file instead.

ack | ag: both are text-search commands which I use a lot in the context of searching source code. They are both very fast (ag claims to be faster). On Mac the results can open the code file at the proper location with a mouse click, and that’s just awesome.
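These commands compose with everything above. A small sketch on a throwaway log file (app.log is just an example name):

```shell
# Build a small log file to search through
printf 'INFO boot\nERROR disk full\nINFO ready\nERROR disk full\n' > app.log

grep ERROR app.log      # filter: only the lines containing ERROR
grep -c ERROR app.log   # count the matching lines instead of printing them
head -n 2 app.log       # peek at the top of the file
tail -n 1 app.log       # ...or the bottom; add -f to follow a live log
```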

Some worthwhile mentions I don’t use:

Vim: While my dotfiles repo on GitHub might say otherwise, I still haven’t gone down the Vim path. I’m not sold, but I see the power in it. This is a good point to start from.

bat: A recent discovery. cat, but with syntax highlighting, line numbers, and git diff markers.

Remote debugging

You’ve deployed a service to production. It works fine most of the time, but sometimes it crashes when specific requests come in, or the server flaps for an unknown reason. All your tests pass on your machine and they pass in CI. You have no idea where the bug comes from.

If only you could debug production instances… wait a minute, it’s on a virtual host at your cloud provider, inside a Docker container! What do you do? (I really hope it’s not installing your dev environment there.)

The answer is remote debugging .

In my day-to-day work I use node --inspect-brk to debug Node in my browser. Yeah, I sometimes get laughed at by people who debug in their editors/IDEs, but the ability to set up remote debugging is a superpower.

Surely some other measures need to be in place as well, like routing duplicate requests to a separate instance so production usage isn’t disturbed, but this use case and others are solved once you can debug remotely.

You can go on and open a debug port on a REST request, and have Docker expose it by default only to a secure jump server. You can even chain SSH tunnels across subnets to overcome network boundaries. If you know the IP and have access, you should be able to debug it.
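A minimal sketch of that setup, assuming a Node service (server.js) on a remote host reachable through a jump server; every host name and file name here is hypothetical:

```shell
# On the remote host: start the app with the inspector bound to localhost only,
# so the debug port is never exposed directly to the network
node --inspect-brk=127.0.0.1:9229 server.js

# On your machine: forward the debug port over SSH
# (-J chains through the jump server to reach the app host)
ssh -J user@jump-host -L 9229:127.0.0.1:9229 user@app-host

# Then open chrome://inspect in Chrome and attach to localhost:9229
```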


There is a tool made by Chrome Labs called ndb which I use, and which will assist Node developers.

I encourage you to learn how to remote debug in every ecosystem, and to do so with caution.

That’s all I’ve got for now

There are many things I didn’t manage to cover here, like wget/curl, htop and more. I don’t want to make you drunk with power just yet, plus I tried to stick to things which you can run and use from your own machine right away.

Now it’s your turn. Name some of the tools that changed your life and gave you dev superpowers in the comments below to blow everyone away (read: I want to know them all). I want to know what you think.

About the author

Hi, I’m Doron. I’ve been writing code since I was 14 and have been doing it professionally for the past 10 years. I’m a father to Avigail, who is enjoying her first look at the ocean; that’s an awesome first experience. In my spare time I crawl the random web and play roleplaying games.

