JP Morgan and Morgan Stanley are leveraging GPT-4 to build in-house knowledge assistants for 16,000 analysts and generative AI-based financial derivatives that attract investors. This article highlights 24 cases of generative AI applications in global finance and B2B AI technology trends, and discusses the rise of B2B AI solutions and their impact on corporate productivity.
Generative AI has so far spread mainly through B2C applications such as ChatGPT. In the B2B field, however, efforts to improve work productivity and derive business insights with generative AI are also under way, and by 2024 real-world cases are emerging in earnest.
McKinsey believes that, among industries, finance has the greatest opportunity from generative AI, estimating that the resulting productivity gains will add $200 billion to $340 billion in annual value (9 to 15 percent of operating profits). According to MIT Technology Review, implementing generative AI could yield cost savings of up to $340 billion per year across the financial services industry. Organizations leveraging AI have seen an average 18% increase in customer satisfaction, productivity, and market share, and an average return of $3.50 per dollar spent on these investments.
We gathered 24 recent examples of global financial institutions utilizing generative AI.
Morgan Stanley was the first firm on Wall Street to partner with OpenAI, and has even appointed an AI chief within the company.
It created a GPT-based knowledge-discovery program over asset management content for its analysts. The program uses GPT-4 to analyze and search hundreds of thousands of pages of scattered PDF files, including capital-markets, industry-analysis, investment-strategy, and market-research reports. It is used by over 16,000 analysts around the world.
According to Bloomberg, JPMorgan Chase launched “IndexGPT” on May 3, 2024. IndexGPT is a thematic investment basket built on OpenAI’s GPT-4 model. Unlike traditional indices that rely on industry or company fundamentals, IndexGPT uses AI-generated keywords extracted from news articles to identify investments based on emerging trends.
JP Morgan uses IndexGPT to capture new themes very quickly: generative AI can rapidly summarize news, extract key keywords, and turn them into products. JP Morgan noted that thematic-product investors have very short holding periods, which suits the speed of generative AI, and says it plans to bring AI to all of its index products in the long term.
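The pipeline described above (news → AI-extracted theme keywords → stock basket) can be sketched roughly as follows. This is a toy illustration, not JPMorgan's actual system: `extract_theme_keywords` stands in for a GPT-4 call, and the keyword lists, tickers, and tags are invented for the example.

```python
# Toy sketch of a thematic-basket pipeline: news -> theme keywords -> basket.
# In a real system, extract_theme_keywords would call an LLM such as GPT-4;
# here it is a deterministic stand-in so the example is self-contained.

def extract_theme_keywords(headlines):
    """Stand-in for an LLM call that distills theme keywords from news."""
    themes = {"ev": ["electric", "battery", "charging"],
              "ai": ["ai", "chip", "datacenter"]}
    scores = {theme: 0 for theme in themes}
    for headline in headlines:
        words = [w.strip(".,!?") for w in headline.lower().split()]
        for theme, keywords in themes.items():
            scores[theme] += sum(w in keywords for w in words)
    # Return themes mentioned at least once, strongest first.
    return [t for t, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0]

def build_basket(theme, universe):
    """Select tickers tagged with the theme (tags are illustrative only)."""
    return [ticker for ticker, tags in universe.items() if theme in tags]

universe = {  # hypothetical ticker -> theme tags
    "EVCO": {"ev"}, "BATT": {"ev"}, "CHIPX": {"ai"}, "OILCO": set(),
}
headlines = ["Battery breakthrough boosts electric vehicles",
             "New charging network announced"]

themes = extract_theme_keywords(headlines)   # strongest theme first
basket = build_basket(themes[0], universe)   # tickers matching that theme
```

The short holding periods mentioned above are exactly why this kind of pipeline matters: the theme extraction step is the slow, manual part that an LLM collapses to seconds.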
In cost-benefit terms, deriving insights through information synthesis is expected to drive innovation in corporate productivity. The LLM technology wave of late 2023 and 2024, in turn, centers on freely attaching external tools to models.
The information synthesis stage focuses on condensing information in a way that saves the user time. The key to maximizing ROI is taking control of the workflow: letting users do more within their applications and easily create new ones for emerging use cases. This locks customers in as they spend more time in the product, and scales easily to additional use cases over time.
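Information synthesis over long documents is commonly implemented as a map-reduce summarization: condense each chunk, then condense the partial summaries. A minimal sketch, where `summarize` is a stand-in for an LLM call (it simply keeps the first sentence) and the sample chunks are invented:

```python
# Map-reduce summarization sketch: condense each chunk, then condense
# the concatenated results. summarize() is a stand-in for an LLM call.

def summarize(text):
    """Stand-in for an LLM summarization call: keep the first sentence."""
    return text.split(". ")[0].rstrip(".") + "."

def synthesize(chunks):
    # Map step: summarize each chunk independently (parallelizable).
    partials = [summarize(c) for c in chunks]
    # Reduce step: summarize the joined partial summaries.
    return summarize(" ".join(partials))

chunks = [
    "Rates rose in Q3. Analysts expect further hikes.",
    "Tech earnings beat estimates. Guidance remains cautious.",
]
summary = synthesize(chunks)
```

The map step is what makes this pattern scale: each chunk fits an LLM's context window even when the full corpus (say, hundreds of thousands of PDF pages) does not.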
Andreessen Horowitz says that, a year on, more AI companies are emphasizing these end-to-end workflows and actually transitioning to Wave 2.
To save users time, the AI needs to do the work for them. LLMs usually operate in a prompt-input → output fashion, and the output can also be converted from text into voice or video. Input and output fit naturally into a workflow: we feed in context and information, then proceed to an output in order to gain insight or make a decision.
However, a chat UX forces the user to work through a conversation with the system rather than going straight from input to output: context and information must be typed in as chat messages, which interrupts the workflow. Andreessen Horowitz proposes letting AI take over the workflow by turning it into a function or feature inside the product that completes with a single click of a button. Another option is to deploy standardized, purpose-optimized prompts based on the business objective of each situation. In fact, one of the most-used capabilities of Allganize is defining workflows for specific use cases and automating the prompts for each one, without putting the burden on the end user.
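A standardized-prompt setup like the one described can be sketched as a small registry that maps each use case to a purpose-built prompt template, so the end user supplies only the raw input and never writes a prompt. The use-case names and templates below are invented for illustration, not Allganize's actual configuration:

```python
# Sketch of a use-case -> prompt-template registry: the user picks a workflow
# and supplies raw input; the prompt engineering is hidden behind one call.

PROMPT_TEMPLATES = {  # hypothetical use cases and templates
    "earnings_summary": "Summarize this earnings call for an analyst:\n{input}",
    "risk_flags": "List compliance risks in the following passage:\n{input}",
}

def build_prompt(use_case, user_input):
    """Render the standardized prompt for a given workflow."""
    template = PROMPT_TEMPLATES.get(use_case)
    if template is None:
        raise KeyError(f"unknown use case: {use_case}")
    return template.format(input=user_input)

prompt = build_prompt("earnings_summary", "Revenue grew 12% year over year.")
```

Wrapping this behind a single button per use case is what turns a chat UX into the one-click workflow feature described above.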
Companies such as FigJam and Macro have actually commercialized this approach, and AI-based CRMs are even being proposed as a new user experience. For more detail, the full article is worth reading.
If 2023 was the year of testing generative AI, this year can be seen as the phase in which generative AI becomes reality.
Allganize is a B2B AI specialist that builds industry-specific models by fine-tuning open-source models, and has strong technology for helping companies make proper use of generative AI through RAG (retrieval-augmented generation).
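At its core, RAG retrieves the document chunks most relevant to a question and injects them into the prompt before the LLM answers, so responses are grounded in company data. A minimal sketch using toy word-overlap scoring in place of a real embedding model and vector store (this is a generic illustration, not Allganize's implementation):

```python
# Minimal RAG sketch: rank chunks against the question by word overlap,
# then assemble a grounded prompt. A real system would use an embedding
# model and a vector store instead of bag-of-words overlap.

def score(question, chunk):
    """Toy relevance score: count of shared lowercase words."""
    q = {w.strip("?.,!") for w in question.lower().split()}
    c = {w.strip("?.,!") for w in chunk.lower().split()}
    return len(q & c)

def build_rag_prompt(question, chunks, top_k=1):
    ranked = sorted(chunks, key=lambda c: score(question, c), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

chunks = [
    "The refund policy allows returns within 30 days.",
    "Our headquarters are located in Houston.",
]
prompt = build_rag_prompt("What is the refund policy?", chunks)
```

Because the LLM only sees retrieved company documents as context, RAG is also how fine-tuned and off-the-shelf models alike are kept accurate on enterprise-specific questions.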
B2B AI solutions are evolving to solve workplace problems by combining LLMs with existing workflows, producing sophisticated and accurate results.
Allganize's Alli LLM App Market, which lets companies select the LLMs best suited to their business and use more than 100 work-automation tools in one place, is likewise evolving into a full-stack AI tool.
If you are curious about AI-native workflow tools, contact Allganize!