Embracing AI Tools
AI Coding Assistants
I think we have entered the golden age of AI developer tools. In AI coding assistants alone, I’m quickly losing track of the options out there, from Windsurf to Cline to Claude Code to Cursor; the list goes on. Recently, Amazon released a new AI IDE called Kiro, which I learned is yet another VS Code fork. This made me curious about why capital keeps being funneled into a seemingly concentrated space already full of viable options. But each AI coding assistant is quite distinct in what it offers. Kiro’s selling point, for instance, is a feature called “Spec,” which lets you build out your prompt by listing requirements, design, and tasks to ensure higher-quality code generation. Other AI IDEs simply take the prompt and run with it, no questions asked - for better or worse. Kiro’s rigorous focus on documentation and maintainability is better suited to the software development life cycle for building production-ready code, and it makes the AI coding experience much smoother (the LLM is less likely to misunderstand your prompt’s intent), at the cost of some added tool complexity.

With all of these AI tools, I think people should really try their hand at a couple and see what they like. I was quite stubbornly using Cursor for a while (I loved its Tab completion), so it took me a long time to give a different AI coding assistant a shot. During my research internship at Argonne, my mentor Jeremy Feinstein was a big proponent of a CLI tool called Aider. It’s an open-source AI coding assistant that lives entirely in your terminal. It is also very easy to configure: you just set up API keys for your favorite LLMs, whether that is DeepSeek, Gemini, or Claude. I realized that AI coding without an IDE can be just as smooth and effective. Aider is also easy to pick up, since it only involves a handful of in-chat commands. I was impressed when I used Aider to quickly run an experiment involving Conditional Variational Autoencoders (I was curious about the role of the encoder’s label conditioning during training on the decoder’s generations), and it was all done in one shot.
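To make the CVAE experiment concrete, here is a minimal sketch of the kind of model involved. This is my own illustrative PyTorch reconstruction, not the code Aider actually generated; the `condition_encoder` flag is a hypothetical knob for ablating the encoder-side conditioning while keeping the decoder conditioned, which is the comparison the experiment was after.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    """Conditional VAE: the one-hot label y conditions the decoder,
    and optionally the encoder, so the two roles can be separated."""
    def __init__(self, x_dim=784, y_dim=10, z_dim=20, h_dim=400,
                 condition_encoder=True):
        super().__init__()
        self.condition_encoder = condition_encoder  # hypothetical ablation flag
        enc_in = x_dim + (y_dim if condition_encoder else 0)
        self.enc = nn.Linear(enc_in, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(
            nn.Linear(z_dim + y_dim, h_dim),
            nn.ReLU(),
            nn.Linear(h_dim, x_dim),
        )

    def forward(self, x, y):
        # Encoder: optionally sees the label alongside the input.
        h = F.relu(self.enc(torch.cat([x, y], dim=1)
                            if self.condition_encoder else x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Decoder: always conditioned on the label.
        x_hat = self.dec(torch.cat([z, y], dim=1))
        return x_hat, mu, logvar

def cvae_loss(x_hat, x, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon = F.binary_cross_entropy_with_logits(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Toy usage with fake data; flip condition_encoder to run the ablation.
x = torch.rand(64, 784)  # batch of flattened "images" in [0, 1]
y = F.one_hot(torch.randint(0, 10, (64,)), num_classes=10).float()
model = CVAE(condition_encoder=True)
x_hat, mu, logvar = model(x, y)
loss = cvae_loss(x_hat, x, mu, logvar)
loss.backward()
```

Training the two variants side by side and sampling from the decoder with fixed labels is enough to see how much the encoder conditioning actually matters to generation.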