The Future Of Artificial Intelligence Web Apps…
Very few artificial intelligence (AI) web apps have developed their own Large Language Model (LLM) due to cost, time and expertise, although that may change in the very near future (https://www.semianalysis.com/p/google-we-have-no-moat-and-neither).
99% of these AI web apps are using an existing LLM such as Watson or GPT and layering their own functionality (often, simply prompts) and user interface on top — I’ve done this for dozens of prototypes.
The problem is that these AI web apps aren’t creating anything new: sure, they remix words to avoid copyright infringement, but the underlying data sets are the same, especially if they are using the same API, such as OpenAI’s.
At best, they are creating convenience (which is valuable).
When you really distil it, all you’re paying for with these AI web apps is:
1) An alternative ChatGPT user interface;
2) Someone’s time to do the prompt engineering behind the software; and
3) *MAYBE* some upload/download/copy-paste functionality not present in ChatGPT currently.
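To make the “thin wrapper” pattern concrete, here is a minimal sketch of points 1 and 2 above. Everything in it is illustrative: `call_llm` is a hypothetical stand-in for a real provider API (OpenAI’s, Watson’s, etc.), stubbed out so the example is self-contained, and `SUMMARY_TEMPLATE` is an invented prompt of the kind such apps hide behind their interface.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a provider API call (stubbed here)."""
    return f"[model response to: {prompt}]"

# The "prompt engineering" layer: a fixed template the end user never sees.
SUMMARY_TEMPLATE = (
    "You are an expert editor. Summarise the following text "
    "in three bullet points:\n\n{text}"
)

def summarise(text: str) -> str:
    """What the alternative UI ultimately calls: the 'product' just
    fills a template and forwards it to the underlying model."""
    return call_llm(SUMMARY_TEMPLATE.format(text=text))
```

In other words, the app’s entire value-add is a template plus a pass-through call; swap the stub for a real client and you have the architecture of most of these products.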
Further, if these AI web apps offer the same service as other AI web apps (plenty do), they are likely using the same AI engine, such as Watson or GPT, so a nicer user interface, better prompt engineering or extra functionality isn’t going to justify higher prices than their competitors. Again, they are commodity products by the very nature of sharing the same “supplier” — which itself is trained on existing, freely available data.
What this means is that 99% of these AI web apps will eventually compete on price, unless, of course, they have a better offer outside the core web app, or a strong brand.
LLMs and generative AI really epitomise the democratisation of information and ideas.
And jobs, mostly white-collar, university-qualified ones based on knowledge or creativity that require industry accreditation (i.e. the “professions”: lawyers, doctors, engineers, etc.), are going to get disrupted too.
It’s a wild time.