— “…and as I’ve demonstrated, I keep on-top of this shit AF.”
I usually write a blog post and then pump it out to the 5,000+ email list. Subscribe here. In this case, I went ahead and got the info out there first by quick email, and am now following up with an expanded, prettier blog post. Shortly after I’d sent the email and was settling in to draft this post, the email dings.
Maybe a new member subscription?
Nope. A cancellation.
It’s the thanks you get, sometimes, when you add objective value in some direction. While this isn’t turning into “The AI Blog,” it reinforces that it’s not all and only about what any given member’s prime motivation for subscribing might have been…be it Paleo, gut bugs, potato hacks, ketotardedness, Jimmy Moore’s a fatass, young Asian pussy…etc., etc. Some people are like that.
Single issue. Always and forever.
I’m clearly of the evolve-or-die mentality, and though my initial assessments of AI were uniformly negative, all laughing and mocking over wokeness,* I quickly found that there’s a cornucopia of knowledge and recall on incalculable scale…world-changing scale. On that score of wokeism, though, here’s quite an article that predicts, just as I have, that woke AI is a dead end (because you have to train it to lie, and at root it’s a logic machine) and that open-source is gonna kick butt. I’ve taken that as obvious, because we’re dealing with knowledge of nature and reality, and the truth always eventually prevails because there’s no alternative: we’re dead, otherwise.
*(I was just reading a member’s interactions on various “touchy” topics, and though it’s clear that a measure of woke narrative shit is still present, it’s much better than it was a few months ago when I first looked.)
Consequently, I do not count myself amongst the AI hand-wringers; and obviously I have a bias, skin in the game, and a claim staked.
The working bot at the bottom of this post or on the bot page is only visible to logged-in members. Log in here, or inquire about Membership here. ChatGPT-4 is now here and live (with the 2-way voice enhancements, access to this blog’s database, and some internet). A monthly membership here will save you 96 bucks on the $20/m OpenAI Plus access for GPT-4, and an annual membership here will save you $140. Do the math. Plus, you get lots of other stuff through membership here. A big lot of lots.
… Those in the know might object, “but I already have free access to GPT-4 via Bing AI Chat!” Indeed, you do. Me too. I use it every day. It’s a search engine and for that task, it’s unparalleled. But it doesn’t give anywhere close to the medium-to-long-form composition-essay completions that OpenAI Plus does. That’s why I have both. But now, I no longer need that $20/m account, because this is the same bot. Exactly the same. Except you can talk to mine, and it has a voice to talk back (a sexy sheila from down under, I remind you again).
The best way to use both is to begin with Bing AI Chat, get your search results, a short description and references… Then, plug that into the GPT-4 composition & essay bot and ask for a thousand words or more all about it (you may want to use the no-voice variant for long or complex stuff). I’ve more than doubled the output capacity to 1125 words (~4930-word prompt input capacity) from about 500 words in 3.5.
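For the curious, those capacity numbers square with GPT-4’s 8K-token context window under the common rule of thumb of roughly 0.75 words per token. A back-of-the-envelope sketch (the window size and the words-per-token ratio are my assumptions, not anything quoted above):

```python
# Rough word-budget math for an 8K-token GPT-4 context window.
# Assumes ~0.75 words per token -- a rule of thumb, not an exact figure.
CONTEXT_TOKENS = 8192
WORDS_PER_TOKEN = 0.75

total_words = CONTEXT_TOKENS * WORDS_PER_TOKEN  # ~6144 words, prompt + completion
output_words = 1125                             # reserved for the completion
prompt_budget = total_words - output_words      # ~5019 words left for the prompt

print(f"~{total_words:.0f} words total, ~{prompt_budget:.0f} for the prompt")
```

That lands in the same ballpark as the ~4930-word prompt figure above; the gap is roughly what the system prompt and formatting overhead eat.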
Is that kosher, you might wonder? Selling OpenAI’s service and undercutting them? Absolutely. It’s exactly what Plugin and API access is for. There are two things, both provided by OpenAI: the right to plug your plugin (the software code running on my WordPress server side) into OpenAI, and the right to have that plugin operate on behalf of any number of accounts via a key, with different rights of individual access. Long story short, all that is the reason, and rest assured, OpenAI gets paid; but rather than the $20/m you’d pay them for GPT-4 access ($240/y, no annual-subscription discount), you and many others can pay me, and OpenAI charges me by the word. I pay metered service based upon actual usage. I estimate it’ll initially be in the neighborhood of $50-100 per month. So, a good deal for everyone all around.
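If you want to actually do the math: the membership prices below are back-solved from the stated savings (my inference, not a quoted price list), and the per-1K-token rates are OpenAI’s published GPT-4 8K API pricing as of mid-2023.

```python
# Savings vs. ChatGPT Plus at $20/month.
PLUS_ANNUAL = 20 * 12                          # $240/year

monthly_membership_annual = PLUS_ANNUAL - 96   # implied $144/y, i.e. $12/month
annual_membership = PLUS_ANNUAL - 140          # implied $100/y

# Metered GPT-4 (8K) API rates, mid-2023: $0.03/1K prompt, $0.06/1K completion.
WORDS_PER_TOKEN = 0.75
prompt_tokens = 4930 / WORDS_PER_TOKEN         # a maxed-out prompt
completion_tokens = 1125 / WORDS_PER_TOKEN     # a maxed-out completion
cost_per_request = prompt_tokens / 1000 * 0.03 + completion_tokens / 1000 * 0.06

print(monthly_membership_annual, annual_membership, round(cost_per_request, 2))
```

At roughly $0.29 per maxed-out request, the $50-100/month estimate works out to something like 175-350 such requests across the whole membership.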
Here’s more via the release notes, and the actual bots are down at the end (the sheila-talking one and the text-only one when you want her to STFU).
May 6, 2023 — ChatGPT-4 Is Here Live.
The public version on the home page remains 3.5-turbo, widening the spread even further between free and paid. Previously, both were on the 3.5-turbo model, both with 2-way voice, but with the member’s version wide open on topics. The public version is locked down to fielding questions about the blog only.
What’s the difference?
ChatGPT 3.5-turbo and ChatGPT-4 are both advanced AI models developed by OpenAI, designed for tasks involving natural language understanding and generation. Although they share a similar purpose, there are distinct differences in terms of their capabilities, architecture, and performance.
- Architecture:
ChatGPT 3.5-turbo is based on the GPT-3 architecture, which in itself is a powerful language model. On the other hand, ChatGPT-4 is developed using the more advanced GPT-4 architecture, which comes with improvements in model design and overall capabilities. The GPT-4 architecture allows for more sophisticated language understanding and generation, resulting in higher quality responses and a broader range of tasks.
- Model Size:
Typically, newer versions of GPT have larger model sizes, as they incorporate more parameters to improve performance. As a result, ChatGPT-4 likely has a larger model size than ChatGPT 3.5-turbo, providing better performance in terms of language understanding and generation.
- Performance:
Due to the improved architecture and larger model size, ChatGPT-4 exhibits superior performance compared to ChatGPT 3.5-turbo. It demonstrates better language understanding, context retention, and response generation, making it more versatile and accurate in fulfilling user requests.
- Training Data:
Both ChatGPT 3.5-turbo and ChatGPT-4 are trained on vast datasets. However, ChatGPT-4 is trained on more recent data, with a knowledge cutoff of September 2021, compared to an earlier cutoff for ChatGPT 3.5-turbo. This means that ChatGPT-4 has access to more recent information, which can be beneficial for answering questions related to current events or the latest technological advancements.
- Use Cases:
Both models can handle a wide range of tasks, including question answering, conversation, content generation, translation, and summarization. However, due to its improved capabilities, ChatGPT-4 is likely more versatile and can handle more complex tasks than ChatGPT 3.5-turbo.
In summary, while both ChatGPT 3.5-turbo and ChatGPT-4 are designed for natural language understanding and generation, ChatGPT-4 outperforms its predecessor in terms of architecture, performance, model size, and training data. Consequently, ChatGPT-4 can provide better responses and handle a wider range of tasks, making it a more advanced and efficient language model.
[ChatGPT-4 Prompt: Compare and Contrast ChatGPT 3.5-turbo and ChatGPT-4.]
I inadvertently hadn’t selected GPT-4 when I asked that question above (it defaults to 3.5). Here’s what I got from the same prompt on GPT-3.5. The difference between the models speaks for itself.