Google Stock: Search Is On The Precipice Of Multi-Decade Disruption
February 23, 2023
Beth Kindig
Lead Tech Analyst
This article was originally published on Forbes on Feb 17, 2023, 01:18am EST.
Earlier this month, Google’s stock (Alphabet) tumbled 7% when its chatbot Bard failed to complete a search with 100% accuracy. During a demonstration, Bard returned incorrect information about which telescope was the first to take pictures of a planet outside Earth’s solar system. This was a minor mistake given how far large language models and generative AI have come; rather, it was the timing that was flawed, as OpenAI’s ChatGPT, the chatbot powering competitor Microsoft Bing, had been dominating headlines since its November 30th launch.
Microsoft, being an opportunist, took it a step further and announced Bing would now be powered by a faster and more accurate version of GPT-3.5 one day after Bard’s failed demonstration: “We’re excited to announce the new Bing is running on a new, next-generation OpenAI large language model that is more powerful than ChatGPT and customized specifically for search. It takes key learnings and advancements from ChatGPT and GPT-3.5 – and it is even faster, more accurate and more capable.”
Both companies have been preparing for this moment for many years. Microsoft invested $1 billion into OpenAI a few years ago with a new $10 billion round announced last month. Meanwhile, Google acquired DeepMind in 2014. Google also previously developed conversational neural language models such as LaMDA, which is used by Google’s Bard for its conversational AI technology.
As much fun as the media has had lately poking fun at Bard, there have been similar, compelling reports of the ChatGPT-powered Bing also having accuracy issues.
Point being, both are in the early stages and mistakes are being blown out of proportion, which brings up a more important question for investors: given that technology can require many iterations, what is the right timing for generative AI and chatbots to drive real advertising revenue?
Investors can get burned by being too early. For example, autonomous vehicles (AVs) were promised in 2019, and the Metaverse has not driven any real gains despite a large media push in early 2021. How does AI compare in terms of time to market?
Secondly, Alphabet has a lot of turf to defend. It won’t only be Bing, but also browsers like Opera that will incorporate ChatGPT into their sidebars. From there, it’s easy to imagine other competitors cropping up over time, some replacing search engines entirely with conversational AI applications powered by speech recognition in ways that are hard to imagine today.
We look at these key points below for a 360-degree view on Google’s stock given search is on the precipice of its first major shift in over two decades.
Background on AI-Powered Search
“AI is the most profound technology we are working on today. Our talented researchers, infrastructure and technology make us extremely well positioned, as AI reaches an inflection point.” -Sundar Pichai, Alphabet’s Q4 earnings call.
Despite the mishap with Bard, it would be a human-generated mistake to think Alphabet does not command a place of leadership right now in generative AI. Alphabet was one of the first tech companies to focus on and invest in AI and natural language processing (NLP). We pointed out to our premium research members in July of 2022 that ChatGPT is based on the transformer architecture Google introduced in 2017, when we said:
“Transformers are becoming one of the most popular neural-network models by applying self-attention to detect how data elements in a series influence and depend on one another.
Sequential text, images and video data are used for self-supervised learning and pattern recognition, which results in more data being used to create better models. Prior to transformer models, labeled datasets had to be used to train neural networks.
Transformer models eliminate this need by finding patterns between elements mathematically, which substantially opens up what datasets can be used and how quickly.
Google first introduced transformer models in 2017 and transformers are used in Google and Bing Search. Transformers also led to BERT models (Bidirectional Encoder Representations from Transformers), which are commonly used for text sequences. Transformers are also used in GPT-3 (the “T” in GPT), which scaled from 1.5 billion parameters in GPT-2 to 175 billion parameters in GPT-3. GPT-3 has the ability to report on queries it has not been specifically trained on.”
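To illustrate the self-attention mechanism described above, here is a minimal sketch of single-head scaled dot-product attention in Python/PyTorch. The weight matrices and toy inputs are placeholders for the example, not anything drawn from Google’s or OpenAI’s actual models.

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention over a token sequence."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v                    # project tokens to queries/keys/values
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5  # pairwise influence between tokens
    weights = F.softmax(scores, dim=-1)                    # how strongly each token attends to the others
    return weights @ v                                     # each output mixes information from all tokens

# Toy example: a "sentence" of 4 tokens, each an 8-dimensional embedding (random placeholders).
d_model = 8
tokens = torch.randn(4, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(tokens, w_q, w_k, w_v).shape)  # torch.Size([4, 8])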
Earlier this month, Google’s CEO, Sundar Pichai, gently reminded the AI community of how cutting edge Google’s research is when he stated, “Transformer research project and our field-defining paper in 2017, as well as our important advances in diffusion models, are now the basis of many of the generative AI applications you're starting to see today.”
BERT was designed to help Google better understand search intent, as, despite billions of searches every day, about 15% of those searches are for brand-new terms. This prompted Google engineers to develop a model that could self-learn.
The result is that search results are more accurate because they take into consideration the nuances of language.
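As a rough illustration of what “understanding nuance” means in practice, the sketch below encodes a query with a publicly released BERT checkpoint via the Hugging Face transformers library. The query string is made up for the example, and this is not Google’s production search stack.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Publicly released BERT checkpoint; Google's internal search models differ.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

query = "can you get medicine for someone pharmacy"  # illustrative query only
inputs = tokenizer(query, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Each token receives a context-dependent vector, so words like "for" and
# "someone" are encoded in relation to the whole query rather than as
# isolated keywords -- the nuance BERT brings to ranking and retrieval.
print(outputs.last_hidden_state.shape)  # (1, num_tokens, hidden_size)
```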
Multitask Unified Models (MUMs) are 1,000X More Powerful than BERT
The Multitask Unified Model (MUM) was introduced in 2021 to further address conversational nuances and is 1,000 times more powerful than BERT. MUM will have a large impact on search users because it decreases the effort required to find the desired information. It is not only addressing the 15% of searches based on new terms; rather, it is a powerful iteration that returns search results that more closely resemble how humans interact.
According to Google, it takes an average of eight queries to answer a complex question. With MUM, this is reduced to one. You can theoretically ask, “Should I travel to Hawaii or California this fall?” and MUM will be able to compare travel rates and weather patterns to answer the question with more depth, much as a friend might answer, “Hawaii is more expensive to travel to, but California is prone to wildfires in the fall, so I would go to Hawaii.” Finding that answer would normally take many queries, but with MUM, succinct, human-like responses are provided in a single interaction.
Large language models have been helping to improve search results for many years. Herein lies Google’s moat: not only a deeply ingrained behavior pattern in which search users automatically turn to the multi-decade leader out of pure habit, but also the fact that Google Search genuinely delivers the highest-quality search results today.
Google’s commanding lead in search is not a legacy metric by any means; rather, it symbolizes the lead Google has on data for training large language models.
Bard’s demonstration may have been problematic compared to ChatGPT’s more favorable reviews; however, for now it amounts to nothing more than a mix of bad and good reviews from a limited number of beta users.
TPUs:
This brings us to Google’s TPUs, which are essentially ASICs (application-specific integrated circuits) on the efficiency/flexibility spectrum. I first covered the differences between TPUs and GPUs nearly four years ago, in 2019, for our premium members when I said:
“TensorFlow is rising in popularity as a machine learning framework and TPUs primarily run TensorFlow models. This is one of Google’s more successful experiments. They are cheaper and use less power than GPUs and are specifically focused on machine learning.
TPUs train and run machine learning models and power Google Translate, Photos, Search, Assistant and Gmail – i.e., image recognition, language translation, speech recognition and image generation.”
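For readers who want a concrete picture of how “TPUs primarily run TensorFlow models,” here is a minimal sketch using TensorFlow’s standard TPUStrategy API. The tiny Keras model is a placeholder for illustration and says nothing about how Google trains its production systems.

```python
import tensorflow as tf

# Connect to an available TPU (e.g., in a TPU VM or Colab runtime) and build
# a distribution strategy that places model variables and training steps on
# the TPU cores.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Placeholder model for illustration only.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
# model.fit(...) would then run the training loop on the TPU cores.
```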
Although there are ongoing debates between TPUs and GPUs, the primary difference is that TPUs are application-specific and have been optimized for Google’s AI tools. Meanwhile, Nvidia was the first to break ground in deep learning due to the ease of programming GPUs and the relative speed with which parallel computing can train networks. Nvidia also offers its customers an aggressive product roadmap.
An example of this is the H100 DGX SuperPod, which we covered for our premium members in July when we said:
“Nvidia and Microsoft recently worked on a Mega transformer model with 530 billion parameters and the future for AI engineers is trillion-parameter transformers and applications. The H100 is already prepping for this. According to Nvidia, the training needs for transformer models will increase 275-fold every two years compared to 8-fold for other models. The H100 GPU with its Transformer Engine supports the FP8 format to speed up training to support trillion-parameter models. This leads to transformer models that go from taking 5 days to train to becoming 6X faster to only taking 19 hours to train.”
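As a quick back-of-envelope check of the training-time figures quoted above (an illustration only, not Nvidia’s benchmark methodology):

```python
# 5 days of training at a ~6x speedup from FP8 / the Transformer Engine:
baseline_hours = 5 * 24            # 120 hours
accelerated_hours = baseline_hours / 6
print(accelerated_hours)           # 20.0 hours, consistent with the ~19 hours cited
```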
As of today, TPUs do not necessarily provide Google an advantage over Microsoft’s partnership with Nvidia. When TPUs were first launched, it was expected that they would provide Google an important lead in launches such as Bard. However, Nvidia has proven to be a more difficult competitor than originally expected, and I imagine Microsoft will not stray from this partnership, as the company will instead focus on other areas, such as taking more market share with Bing.
Introduction of Bard powered by LaMDA
LaMDA is the conversational language model that powers Bard. Two years ago, Google launched LaMDA to better mimic open-ended conversations by training the language model on dialogue. The result was a more human-like chatbot that knows you well enough to recommend movies or books, is sensitive enough to change the subject of an uncomfortable conversation, can discuss its own “death” by being turned off, and has machine vision that lets it look at a picture and discuss it intelligibly.
Bard was released this month for beta testers and will be available to the public “in the coming weeks.” As mentioned in the introduction, Bard answers questions with real-time data, whereas ChatGPT is trained on data from 2021 or earlier (note: the new Bing version is rumored to use real-time data; see below).
Bard is also free, and given Google’s search revenue, the company may have incentives to undercut competitors that charge paid plans for conversational AI.
Other ChatGPT alternatives
Anthropic is building a 52-billion-parameter pretrained model called Claude, a potential rival to ChatGPT. Google invested $300 million into Anthropic last year, with an arrangement similar to Microsoft and OpenAI’s that includes a stake in the startup’s R&D. Anthropic was founded by former employees of OpenAI. Whether Claude can actually exceed Google’s own language systems is yet to be determined; perhaps Google is simply spreading its bets and seeking access to its competitor’s former talent. Despite Claude being in closed beta, there’s an excellent write-up here about the differences between ChatGPT and Claude.
DeepMind is also not to be underestimated. Google’s sister company is behind many of Google’s AI product integrations to date. In September of 2022, DeepMind introduced Sparrow, which is trained with human feedback, similar to ChatGPT, but will use up-to-date information from a Google-powered internet. DeepMind’s previous release, Chinchilla, was competitive with GPT-3 before the more advanced GPT-3.5 was released.
There are many other large language models, such as Google’s PaLM and Microsoft’s Megatron, in the 530-to-540-billion-parameter range and also based on the transformer architecture.
Notably, this is not meant to be a comprehensive list but rather a sample of the level of innovation occurring in this space.
AI to Drive Advertising Revenues
The strength in Search highlights the advantage that first-party data provides. Google held an 84% global desktop market share in online search at the end of December 2022, according to data from Statista, down slightly from 88% at the end of December 2015. Meanwhile, Microsoft’s Bing increased its share from 5% to 9% over the same period.
According to data from Statista, YouTube has 2.5 billion monthly active users, ranking second behind Facebook, which has 2.96 billion monthly active users.
According to data from Nielsen, YouTube also holds a leadership position in U.S. streaming. For the month of December 2022, YouTube (including YouTube TV) accounted for 8.7% of TV usage, followed by Netflix with 7.5% and Hulu with 3.4%. With AI, the company is helping advertisers address pain points like frequency and measurement.
Android has a dominant market share in mobile operating systems. According to data from Statcounter, Android accounted for a 71.8% share compared to 27.6% for Apple’s iOS at the end of Q4 2022. Android’s share has come down marginally from 74.5% in Q1 2018.
This vast amount of first-party data from Chrome and Android can be efficiently used to train complex AI models. A few years ago, in an editorial for Forbes, I accurately predicted Apple’s IDFA changes would cause problems for advertising companies:
This is a problem for the ad industry because it goes well beyond personal sentiments and niceties around privacy and slow-moving government regulations and pits tech giant against tech giant in the black box world of ad software, user tracking and engineered loop holes. There is little question who will win as Apple goes up against Google, Facebook and many others. After all, it’s Apple’s device, Apple’s operating system and Apple’s app store. The only question is why this hasn’t happened sooner.
Similarly, Google is a large digital real estate owner with arguably more data than any other tech company in the world. This advantage cannot be overstated when it comes to training large language models (LLMs). In addition to having a strategic data advantage for future development of LLMs, Google can offer advertisers instant ROI.
Philipp Schindler, Senior Vice President and Chief Business Officer said in the earnings call, “Going forward, we are focused on growing revenues on top of this higher base through AI-driven innovation.”
This will be accomplished with AI-driven campaign tools such as Performance Max and Smart Bidding. Smart Bidding uses machine learning to optimize advertisers’ bids; ML models can analyze millions of data signals to better predict future ad conversions. Further advances in AI helped improve bidding performance in 2022.
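To make the idea concrete, here is a hypothetical, heavily simplified sketch of value-based bidding: a bid derived from a predicted conversion probability and a target return on ad spend. It illustrates the general technique, not Google’s actual Smart Bidding implementation.

```python
def suggest_bid(p_conversion: float, conversion_value: float, target_roas: float) -> float:
    """Bid up to the expected conversion value divided by the return-on-ad-spend target.

    p_conversion: model-predicted probability that this auction impression converts.
    conversion_value: expected revenue if it converts.
    target_roas: advertiser's target return on ad spend (e.g., 4.0 = 400%).
    """
    expected_value = p_conversion * conversion_value
    return expected_value / target_roas

# Example: a 4% predicted conversion rate on a $120 order with a 4x ROAS goal
# yields a maximum bid of $1.20 for the impression.
print(round(suggest_bid(p_conversion=0.04, conversion_value=120.0, target_roas=4.0), 2))
```

In a production system, the conversion probability would come from models trained on millions of query, device, and audience signals, which is where the ML advances cited above matter.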
Performance Max will replace Smart Shopping Campaigns. It allows advertisers to access all Google ad channels from a single campaign and uses Smart Bidding to optimize performance against the advertiser’s conversion goals. Advertisers saw a 12% increase in conversion value with Performance Max compared to Smart Shopping Campaigns, and this is likely a drop in the bucket compared to the better ad tools that will follow over the next few years.
Financials
Alphabet’s current revenue growth is one of the lowest in its public history. Last quarter, revenue grew by 1% to $76.05 billion, or 7% on a constant currency basis. For next quarter, analysts expect revenue to grow 1.1% to $68.78 billion in Q1 2023. From there, revenue growth is expected to gradually accelerate.
Google Search revenue declined 1.6% YoY to $42.6 billion and YouTube ads revenue declined 7.8% YoY on the back of the tough macro environment. Per the earnings call, “In YouTube, we are prioritizing continued growth in Shorts engagement and monetization, while also working on other initiatives across our ad-supported products.”
The number of YouTube creators is at an all-time high. This can create a flywheel opportunity as content increases with more creators, which leads to an increase in viewership, which in turn is expected to drive more revenues. In order to reward creators, the company has started revenue sharing with YouTube Shorts, which now averages 50 billion daily views. This is up from 30 billion daily views in Q1 2022.
Google Cloud revenue was up 32% YoY to $7.3 billion. The company is seeing strong momentum from enterprises and governments for digital transformation. Management mentioned in the earnings call, “Google Cloud is making our technological leadership in AI available to customers via our Cloud AI platform, including infrastructure and tools for developers and data scientists like Vertex AI.”
In light of the soft revenue, net income declined to $13.6 billion compared to $20.6 billion in the same period last year. EPS was $1.05, missing estimates by 11.9%. The company also recently announced a reduction of about 12,000 employees to improve long-term profitability.
Risks to consider
Microsoft’s investment in OpenAI is an obvious, well-publicized risk. Google has not faced a similar threat in roughly two decades. Microsoft also recently announced a new version of Bing, which is not yet available to the public. A student named Owen Yin previewed the new Bing before access was shut down. The new version is expected to replace the search bar with a chatbox.
However, users can still search the traditional way by toggling between chat and search in the toolbar. The new Bing is also expected to have access to real-time data, unlike ChatGPT, which is trained on data collected through 2021. The new version is expected to provide detailed answers rather than just links to websites, and users will be able to chat with the bot about their queries and carry on a conversation. It is also expected to perform more creative tasks, such as writing an email or a poem.
Opera also plans to integrate ChatGPT with a “Shorten” button feature, which will provide summaries in the sidebar.
Conclusion:
I would not be surprised if we exit 2023 with a reimagined way to use search engines. The iteration cycle here is likely to move quickly compared to AVs or the Metaverse, as there are real-world applications where AI can be applied without safety issues (AVs) or friction in user adoption (Metaverse/VR headsets). Instead, the scale has already been built, with search being a viral, daily activity used by nearly every human on earth. AI advancements will simply improve what is already in place.
Cutting-edge chatbots can be quickly deployed on the search engines that already exist, and this is a substantial difference from other overhyped, early-stage technologies. Their accuracy may still need time, but they're probably not too far off from being deemed “reliable enough.”
Investors should expect that AI will become a winner(s)-take-all market. In time, differences in how search and other applications perform, in terms of user experience and ROI for advertisers, will help the leaders carve out a larger lead.
Secondly, investors should not forget that the best innovation comes from the private markets, and even if stock-driven media focuses on Google versus Microsoft, there will be a few David-versus-Goliath stories where a smaller team comes seemingly out of nowhere to win the hearts and minds of consumers with a viral market entry. However, back to point number one: look for the Goliaths to court the smaller teams and bring them into the fold rather than compete head-on.
It may be clear that there are some puts and takes with Alphabet, such as search being on the precipice of a multi-decade shift, yet the reality is that ad revenue for the company is flat to declining. Our firm uses a blend of broad market analysis, technicals and fundamentals to time entries, such as when we bought Nvidia at its lowest trading point on October 13th for $108 with a real-time trade alert provided to our Members. Our process helps to reduce risk around stocks and find strong entries. Nvidia is up 100% from that recent entry. Our firm will do something similar with Alphabet, as we believe there is a further drawdown in its future. We hold weekly webinars on Thursdays at market close to go over the exact levels we plan to enter stocks. You can learn more here.
Royston Roche, Equity Analyst at the I/O Fund, contributed to this article.