I Just Canceled My ChatGPT Plus Subscription
The age of GPT wrappers is here
Until about six months ago, I felt very jittery whenever people criticized me for building "GPT wrappers."
“Anyone can do that,” they said.
Times have changed. I am now convinced that the real products will be built at the application layer, segmented by use case.
After paying $20 a month for a ChatGPT Plus subscription for over two years, I just canceled it this month.
And I didn’t do this because it’s any less useful — but because “ChatGPT wrappers” fulfill my needs better.
I am now paying the same $20 for a Cursor subscription.
Imagine having a professional coder who lives right in your IDE and helps you as you type and as you think, so you don't have to constantly engage in a back-and-forth on a separate platform.
What Cursor has built using the Claude, DeepSeek, and OpenAI APIs is incredible. It is the application layer it has built for the coding segment that makes the product astounding.
Think similarly about other products: do you want to prompt ChatGPT to write your email, or do you want that help right inside Gmail's compose window?
Do you want to talk to ChatGPT to sketch out a travel plan, or would you rather go to a dedicated app that is trained to plan and book your entire trip in one go?
Integrated AI Interfaces Are Going To Win Out
Another reason it kinda sucks to pay for a foundational-model interface like ChatGPT is that there isn't a one-size-fits-all model.
For example, while OpenAI has made better models in general, xAI’s Grok is astounding at generating images.

Elon Musk painting on a canvas. Image generated with Grok; this wouldn't have been possible with DALL-E.
There is no way Sam Altman would ever bring Grok’s image functionality to ChatGPT.
Similarly, what if you wanted to transcribe an audio file quickly? Google's Speech-to-Text model excels at that, and it's available at throwaway prices in Google AI Studio!
So, I would much rather pay for a product that is able to seamlessly integrate a plethora of these models into one interface via API calls.
Total win-win. Foundational models make money, interface makes money, user is happy.
Yes, I want an AI interface that calls Grok for image generation or sass, OpenAI for conversational messages, Claude for coding, and Google's API for transcription.
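To make that concrete, here is a minimal sketch of what such a routing layer could look like. Everything in it is a hypothetical placeholder I made up for illustration: the task names, the provider/model pairs, and the `call_model` helper are not any real product's API.

```python
# A minimal sketch of a multi-model "interface" layer that routes each task
# to the provider best suited for it. All names below are hypothetical
# placeholders, not real SDK calls.

# Map each kind of task to the provider and model I would want handling it.
ROUTES = {
    "image": ("xai", "grok-image"),              # image generation
    "chat": ("openai", "gpt-chat"),              # conversational messages
    "code": ("anthropic", "claude-coding"),      # coding help
    "transcribe": ("google", "speech-to-text"),  # audio transcription
}


def call_model(provider: str, model: str, payload: dict) -> str:
    """Placeholder for the actual API call to each provider's SDK or HTTP endpoint."""
    return f"[{provider}/{model}] would handle: {payload}"


def handle(task: str, payload: dict) -> str:
    """Route a user request to whichever model is best for this kind of task."""
    provider, model = ROUTES[task]
    return call_model(provider, model, payload)


if __name__ == "__main__":
    print(handle("image", {"prompt": "Elon Musk painting on a canvas"}))
    print(handle("code", {"prompt": "write a binary search in Python"}))
```

The user sees one interface; the routing table decides which foundational model quietly does the work.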
AI Is Getting More Human
OpenAI just released its latest model, GPT-4.5.
This time, Sam Altman doesn't want you to think about benchmarks or ask it tricky logical or scientific questions. It would fail at all of that.
Instead, it is designed to be more human, and that may mean being worse at math in exchange for more human-like, more thoughtful responses, a higher emotional quotient, and less hallucination.
At $75 per million input tokens and $150 per million output tokens, it is now by far the most expensive publicly available AI model out there.
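To put that pricing in perspective, here is a quick back-of-the-envelope calculation. The token counts are just an assumed example, not real usage data.

```python
# Rough cost of one GPT-4.5 request at the listed prices.
# The token counts below are assumptions for illustration only.
INPUT_PRICE_PER_M = 75.0    # dollars per 1M input tokens
OUTPUT_PRICE_PER_M = 150.0  # dollars per 1M output tokens

input_tokens = 10_000   # e.g., a long prompt with some pasted context
output_tokens = 1_000   # e.g., roughly a one-page reply

cost = (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
     + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M
print(f"${cost:.2f} per request")  # -> $0.90
```

Under those assumptions, a single longish request costs about 90 cents.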
I am not writing much about it, as I haven't tried the model out much beyond serving as an anonymized guinea pig in ChatGPT.
Experts whose opinions I value and who have tried it say it performs better at engaging writing, so I look forward to giving it a shot.
My first impression is that the pricing doesn't seem pragmatic at a time when the wider narrative is about bringing down the cost of AI for customers.
My Favorite AI Tweet of the Day
For the confused, it's actually super easy:
- GPT 4.5 is the new Claude 3.6 (aka 3.5)
- Claude 3.7 is the new o3-mini-high
- Claude Code is the new Cursor
- Grok is the new Perplexity
- o1 pro is the 'smartest', except for o3, which backs Deep Research
Obviously. Keep up.
— Nabeel S. Qureshi (@nabeelqu)
8:13 PM • Feb 27, 2025
Parting Thoughts
This newsletter is written for you, my friend. I am taking time out every day to make sure I jot down my thoughts on what I personally find to be the most exciting technical field of our lifetime.
Please give feedback on what you like and don't like. What can I do to make you better informed about AI?
Reply to this email and let me know, and don't forget to check out our core product, the Nodex.