Vibe Coding Is So In 🎶 But It's Not All Hunky-Dory

In this Newsletter:

- The solidifying trend of vibe coding

- The dos and don'ts of giving AI control

- Building a personal resume/portfolio website

- The latest happenings in AI

Roughly two months ago, Andrej Karpathy floated the idea of a new kind of coding that has cropped up: "vibe coding."

Vibe coders embrace giving up control to the AI: they share their product vision, give guidelines in plain English, watch the results come in, and "accept all" the code the AI writes for them.

Now, competitions are being organized with millions of dollars in prizes up for grabs for the best vibe coders out there.

Setting the Stage

Most expert developers have been of the opinion that AI just can't write expert-level code (although more and more are changing this view with time).

So, this approach coming from Karpathy may have surprised some, given that he is one of the most respected engineers in the world.

And if you ask Karpathy, who co-founded OpenAI and for a long time led Tesla's AI and self-driving initiatives, LLMs' ability to code will only get better from here.

The thing with vibe-coding is it opens doors for people who have neither any experience with coding nor any interest in learning it.

The second bit is important because it highlights the difference between vibe-coding and general AI-assisted coding.

Coders generally use AI assistance to quickly write repetitive or mundane stuff.

Say, when coding in Django, I frequently need to write view functions that render a template with some data from a model. I just use AI to write these now, saving myself 10-15 minutes each time.
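
To give a feel for the kind of boilerplate I mean, here is a stdlib-only sketch of the "fetch some data, fill a template, return HTML" pattern. All names (`ARTICLES`, `article_list`) are illustrative; in real Django this would be a model query plus `django.shortcuts.render`.

```python
from string import Template

# Stand-in for data that would normally come from a Django model query.
ARTICLES = [
    {"title": "Vibe Coding Is So In", "views": 1200},
    {"title": "Setting the Stage", "views": 800},
]

PAGE = Template("<h1>Articles</h1>\n$items")
ITEM = Template("<p>$title ($views views)</p>")

def article_list(articles):
    """Boilerplate 'view': take data, fill a template, return HTML."""
    items = "\n".join(ITEM.substitute(a) for a in articles)
    return PAGE.substitute(items=items)

print(article_list(ARTICLES))
```

It is exactly this kind of mechanical glue code, written dozens of times with only the names changed, that AI assistance handles well.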

My Snakes game and Artificially Boosted landing page are both vibe-coded, while I would never leave the driver's seat on my main projects, Dzambhala and TushitaAI, even when using a lot of AI assistance.

The Hottest Programming Language To Learn In 2025 Is English

I think it is understandable why top engineers don't find AI particularly helpful for coding in practice. The same would apply to a top journalist thinking AI can't write a good column, or a top tax lawyer thinking it isn't helpful for legal research.

Yet, I think all of these critics put emphasis on the wrong point.

Take me, for example.

I never let AI write my words. Writing is a craft I have honed over a lifetime, and AI just can't compete with me, let alone better me, at conveying my thoughts the way I can, with emotion and empathy (although I love GPT-4.5's attempts). But it does serve as a good aide in finding the right direction or tackling writer's block.

As a writer, I don't need ChatGPT to "vibe write" for me; it would be darn offensive.

Yet, ChatGPT is why my family members don't ask me to write their emails any more, and I am very thankful to it for that.

At the same time, I don't have any issue with AI generating images for me in their entirety, or vibe-coding an entire landing page. Those are not areas I am an expert in or have any emotional attachment to, so it saves me a lot of time, effort and money to have it do those tasks.

So, in a similar vein, while those top developers might not think much of vibe-coding, it will have very real repercussions for entry-level and lower-skilled devs, who are simply not needed any more. AI can code the mundane, the repetitive and, at times, even the very creative stuff just fine.

Why Some People Are Struggling With Vibe-Coding And Giving Up

The biggest challenges non-coders face when starting with vibe-coding are:

1) The codebase growing so large that the AI becomes ineffective at keeping it going.

2) The security of their deployed app.

There are a couple of examples floating around on Twitter of people who took entire vibe-coded projects live and got hammered by cyberattacks.

The first one happens due to limited context windows in models. There is only so much information an LLM can remember at a time.

While most LLMs are seeing their context windows grow, that doesn't necessarily bring a solution.

There are a lot of internal mechanics involved, and the more "context" an LLM has when responding to a query, the more ineffective or imprecise it can become.

The solution here is to keep the code fragmented, and to vibe-code only small projects to begin with.

So, it is completely fine to vibe-code a static website or a game, but don't expect to be able to code an entire social media app or a finance terminal through this process just yet.

And learning Git version control can be an important step here, so you have a better ability to "reverse" the code if something gets ducked up.

Security concerns are even trickier to deal with, and can hit anyone, vibe-coding or not.

I am, for example, going through a subscription-form spam attack on Dzambhala right now, and working on adding more security layers to fix it.

Typically, security issues happen on fronts like telling the LLM your API key, accidentally exposing it directly in the code, or not having enough spam detection or database security.

If you are coding static websites, this is something you have to worry about very little. Entry points for cyberattacks typically come from JavaScript, dynamically rendered HTML or API keys.
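
The API-key leak in particular has an easy habit that prevents it: read the key from an environment variable instead of pasting it into the source (or into a prompt). A minimal sketch, with `MY_SERVICE_API_KEY` as a hypothetical variable name:

```python
import os

def get_api_key():
    """Read the key from the environment so it never appears in source code."""
    key = os.environ.get("MY_SERVICE_API_KEY")
    if key is None:
        raise RuntimeError("Set MY_SERVICE_API_KEY before running the app")
    return key

# Demonstration only: in practice you would export the variable in your shell
# or hosting dashboard, never set it from code.
os.environ["MY_SERVICE_API_KEY"] = "demo-value"
print(get_api_key())
```

This way the key lives only on your machine or host, and nothing secret ends up in the code the AI writes (or in your Git history).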

New Builder's Corner

Early on in my writing career, I paid a dev INR 10,000 (~$116) to code a personal website to showcase my work to potential clients.

Now the same work can be done in less than 10 minutes using AI, and I want to share how.

These are the initial and follow-up prompts I used with Claude 3.7 Sonnet:

Create a personal website for me. I want each section to be vh-100 height, with primary color around teal or aquamarine, secondary color as sth that contrasts it! The hero section should be about me and it features my photo and a basic intro, keep placeholders that I can later edit! For the second section onward, you can use my resume to build each section. At last page, place my social media profiles and a contact me button!

My name is Popeye Dixon

Let's cook up the info instead of using placeholders, happy to have you imagine me as a harvard goer, cool Nvidia employee or sth

Need more color, maybe some gradients in there, also we could use a menu at the top that is collapsed on mobile

Can we use a more icons based approach in skills section? You know just highlight them better?

image - 2025-03-21T192240.052.jpg

Let's use this image for header

The resulting website is available here: https://vibe-code-resume.pages.dev/

You could also keep going with this in Cursor, including creating a portfolio section and more! Tried this and want more help with it? Respond to this email. I will also soon share a way to deploy these websites for free.

From Around The Web

Levelsio is organizing a competition to vibe-code the best game with AI. Judges include Karpathy, the man himself, and a couple of other leading game developers.

Give it a shot if you ended up vibe-coding a game!

Bolt.new is also organizing some hackathons for vibe-coding, with cash prizes!

Latest Happenings

Web Search Comes To Claude: Anthropic has added web-search functionality similar to ChatGPT and Perplexity to Claude.AI for Pro users, with a rollout for free users coming next.

Being able to search the web with AI instead of Google's legacy "10 blue links" is becoming more and more of a common experience.

Perplexity started with the idea of searching the web with LLMs, and ChatGPT and Google followed suit.

I haven't tried Claude's yet, but based on my experience so far, I think search on these platforms will remain a secondary feature, with either Google or disruptor Perplexity getting to eat the cake.

Search is a major user experience, and dedicated products seem to have an edge.

Other Updates:

  • Nvidia introduced the Blackwell Ultra GPU and the Vera Rubin chip at its ongoing GTC event to further developments in AI.

  • OpenAI has added audio chat with different voices to the API, alongside transcription endpoints.

  • Sam Altman's company has also added o1-Pro to the API at a whopping $150 per million input tokens and $600 per million output tokens (stahp killin' me, Sam).

  • SoftBank and Jeff Bezos-backed Perplexity may be seeking an $18 billion valuation in a fresh funding round.

  • Apple leadership may be disgruntled over progress on Siri, especially in light of generative AI advancements, causing it to shake up the team, according to highly reliable analyst Mark Gurman.

I am continually working on ways to improve the value this newsletter brings to audiences of all skill levels when it comes to using AI! If you have any feedback, please share it anytime!
