Case Study: NTT DOCOMO Leveraging the Alli LLM App Market to Enhance Operational Efficiency with Generative AI
Blog
1/15/2025
NTT DOCOMO enhances operational efficiency with generative AI, adopting Allganize's Alli LLM App Market to create no-code AI applications. Through workshops and training, employees rapidly develop tailored solutions like Q&A generation and internal inquiry management, fostering innovation and streamlining workflows while promoting company-wide adoption and future-ready AI integration.
NTT DOCOMO, a leader in the communication industry, offers a comprehensive range of services. These include consumer communication services like mobile networks (5G, LTE, etc.), optical broadband, and devices. Additionally, their Smart Life business provides financial payment services and lifestyle content (e.g., video, music, e-books, and utilities like DOCOMO Denki). NTT DOCOMO also delivers corporate communication solutions, including ubiquitous services, satellite phones, and office tools like Office Link.
Since 2023, NTT DOCOMO has been actively exploring generative AI and Large Language Models (LLMs) to improve operational efficiency across its departments. As part of these efforts, the company adopted Allganize's Alli LLM App Market, a platform that enables the creation and use of generative AI applications.
To understand the implementation and impact of this initiative, we spoke with Mr. Nakamura, Mr. Fueta, and Ms. Inako, senior managers from the Service Innovation Department, R&D Innovation Headquarters, who are leading NTT DOCOMO's internal generative AI efforts.
What Are Your Roles?
Mr. Nakamura: In the Service Innovation Department, we focus on developing new services by leveraging data from existing operations. My team specializes in business co-creation using LLMs, collaborating with various departments to enhance operational efficiency. This includes providing prompt engineering support, developing generative AI applications using the Alli LLM App Market, and creating other AI-driven tools.
Mr. Fueta: Our responsibilities are organized by teams. I work in the Developer Relations (DevRel) team, where we plan and execute training sessions and events related to generative AI. We also provide hands-on support to regional offices and branches to facilitate the adoption of AI technologies.
Ms. Inako: I am part of the same DevRel team as Mr. Fueta. My role involves organizing internal events to promote the use of generative AI, supporting regional offices, and creating generative AI applications to address specific departmental challenges.
Mr. Nakamura: Additionally, I lead the Use Case Team, which provides targeted support to departments facing complex challenges beyond standard prompt engineering. We develop custom generative AI applications using tools like the Alli LLM App Market to address unique business needs.
Cultivating a Company Culture to Promote Operational Efficiency with Generative AI
Could you share your past efforts related to generative AI and LLMs?
Mr. Nakamura: In August 2023, we launched the "LLM Value-Added Platform" internally across the three companies in the NTT DOCOMO Group: NTT DOCOMO, NTT Communications, and NTT COMWARE. This platform integrates advanced features like Retrieval-Augmented Generation (RAG), developed with proprietary technology, and an ethics-check function to prevent unethical outputs from LLMs. It provides a secure framework for using generative AI in business operations. The platform supports multiple LLMs, including the GPT series from Azure OpenAI Service and "tsuzumi," an in-house model developed by NTT Laboratories. Additionally, we’re expanding its capabilities by integrating it with various services through APIs.
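For readers who want a concrete picture of this kind of setup, the sketch below shows how a gateway might route prompts to multiple model backends and screen outputs before returning them. The backend stubs and the keyword-based check are hypothetical simplifications, not the actual components of the LLM Value-Added Platform.

```python
# Illustrative sketch of a multi-model gateway with a pre-return output check.
# The backend functions and the keyword screen are hypothetical stand-ins,
# not the LLM Value-Added Platform's real components.
from typing import Callable, Dict

def call_gpt(prompt: str) -> str:
    """Placeholder for a call to the GPT series on Azure OpenAI Service."""
    return f"[gpt] draft answer for: {prompt}"

def call_tsuzumi(prompt: str) -> str:
    """Placeholder for a call to the in-house tsuzumi model."""
    return f"[tsuzumi] draft answer for: {prompt}"

BACKENDS: Dict[str, Callable[[str], str]] = {
    "gpt": call_gpt,
    "tsuzumi": call_tsuzumi,
}

BLOCKED_TERMS = ("password", "credit card number")  # toy rule set

def passes_output_check(text: str) -> bool:
    """Very rough stand-in for an ethics-check step: keyword screening."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

def generate(prompt: str, model: str = "gpt") -> str:
    """Route a prompt to the selected backend, then screen the output."""
    answer = BACKENDS[model](prompt)
    if not passes_output_check(answer):
        return "The response was withheld by the output check."
    return answer

if __name__ == "__main__":
    print(generate("Summarize today's support tickets.", model="tsuzumi"))
```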
Mr. Fueta: Alongside developing the platform, we’ve implemented numerous training sessions and events to raise awareness, encourage adoption, and foster a culture of generative AI within the organization. These efforts have been instrumental in promoting its use across teams and departments.
Ms. Inako: To support this cultural shift, we’ve introduced training programs on crafting effective prompts, which are available to all employees. These courses have been highly popular and well-received, with many employees participating to enhance their understanding and skills in generative AI.
Mr. Fueta: Beyond training, we’ve organized activities to increase engagement with generative AI and encourage its application in daily tasks. One notable example is a series of "Prompt-a-thons," where participants tackle specific challenges, such as LLM-powered schedule management or LLM-based survey analysis. Participants submit prompts that are evaluated, and the best submissions are awarded. We’ve successfully completed four rounds of this program, which not only helps participants refine their skills but also enables the sharing of expert knowledge across the company. To further promote operational applications, we host large-scale internal contests where outstanding ideas or use cases involving generative AI are recognized and celebrated.
Trial Adoption of Alli LLM App Market: A Generative AI Platform Usable Without Writing Prompts
Why did you decide to adopt the Alli LLM App Market despite already actively promoting generative AI with your in-house platforms, training, and events?
Mr. Nakamura: In addition to our in-house "LLM Value-Added Platform," we adopted the Alli LLM App Market to further expand generative AI adoption across the company. While our in-house platform features a chat-based UI where users input prompts to interact with LLMs, we found limitations in applying it to diverse business operations. Many users became skilled at using features like prompt libraries, creating and registering prompts, and sharing them. However, for broader adoption, we sought a tool that didn’t require prompt knowledge, simplifying the process for all employees. The Alli LLM App Market meets this need with its intuitive, no-prompt interface.
Mr. Fueta: To promote generative AI adoption, we’ve conducted various workshops, where the Alli LLM App Market has proven invaluable. Its pre-set prompts allow users of any skill level to engage effectively. The platform’s ease of customization and app creation makes it ideal for training sessions, enabling participants to experiment and innovate without needing advanced technical knowledge.
Mr. Nakamura: Another major advantage is the low barrier to entry, which enables non-engineers to create applications with minimal learning effort. This simplicity has allowed us to test a variety of applications. We’ve also integrated the Alli LLM App Market with our systems, including Azure OpenAI Service’s GPT series and tsuzumi from our in-house platform, through APIs. For secure access, we leverage Okta authentication, ensuring a seamless and safe user experience.
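As an illustration of what such an integration could look like, the following sketch obtains an OAuth 2.0 access token from Okta with the client-credentials grant and then calls an internal LLM gateway with it. The Okta domain, scope, endpoint URLs, and request schema are placeholders assumed for the example, not NTT DOCOMO's actual configuration.

```python
# Illustrative sketch of calling an internal LLM gateway protected by Okta.
# All URLs, credentials, scopes, and the request/response schema below are
# placeholders; the client-credentials flow is an assumption for this example.
import requests

OKTA_TOKEN_URL = "https://example.okta.com/oauth2/default/v1/token"  # placeholder
CLIENT_ID = "your-client-id"          # placeholder
CLIENT_SECRET = "your-client-secret"  # placeholder
GATEWAY_URL = "https://llm-gateway.example.com/v1/generate"  # placeholder

def get_access_token() -> str:
    """Obtain an OAuth 2.0 access token via the client-credentials grant."""
    resp = requests.post(
        OKTA_TOKEN_URL,
        auth=(CLIENT_ID, CLIENT_SECRET),
        data={"grant_type": "client_credentials", "scope": "llm.generate"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def ask_gateway(prompt: str) -> str:
    """Send a prompt to the gateway with the bearer token attached."""
    token = get_access_token()
    resp = requests.post(
        GATEWAY_URL,
        headers={"Authorization": f"Bearer {token}"},
        json={"model": "gpt", "prompt": prompt},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json().get("answer", "")

if __name__ == "__main__":
    print(ask_gateway("Draft a reply to a billing inquiry."))
```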
Using the Alli LLM App Market to Rapidly Validate Business Ideas with Generative AI
Can you describe the workshops you’ve hosted utilizing the Alli LLM App Market?
Mr. Fueta: Our workshops, titled "No-Code LLM App Creation," aim to empower employees to turn their ideas into tangible apps. These hands-on sessions last about two hours, providing a quick and practical introduction to building generative AI applications.
Ms. Inako: We organize sessions around specific themes, such as "QA" (Question Answering) and "VLM" (Visual Language Models). In the QA sessions, participants build chatbots that use Retrieval-Augmented Generation (RAG) to generate answers from documents. For VLM sessions, participants create apps that analyze visual data, like graphs, using LLMs. In one session, participants started by copying a default generative AI app and then developed their own automated response generation app using RAG.
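To show the pattern behind the QA theme, here is a minimal RAG sketch: retrieve the passages most relevant to a question, then assemble a grounded prompt for an LLM. The TF-IDF retrieval, placeholder passages, and stubbed model call are simplifications for illustration, not components of the Alli LLM App Market itself.

```python
# Minimal sketch of the RAG pattern behind the QA workshop theme:
# retrieve the most relevant passages, then build a grounded prompt for an LLM.
# The passages are placeholders and the final LLM call is stubbed out.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

PASSAGES = [
    "Placeholder passage about Office Link, a corporate office tool.",
    "Placeholder passage about DOCOMO Denki, a lifestyle utility service.",
    "Placeholder passage about the 5G mobile network.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k passages most similar to the question."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(PASSAGES + [question])
    scores = cosine_similarity(matrix[-1:], matrix[:-1]).ravel()
    top = scores.argsort()[::-1][:k]
    return [PASSAGES[i] for i in top]

def build_prompt(question: str) -> str:
    """Assemble a grounded prompt from the retrieved context."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

if __name__ == "__main__":
    # In a real app the prompt would be sent to an LLM; here we just print it.
    print(build_prompt("Which passage mentions a corporate office tool?"))
```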
Mr. Fueta: The platform’s ability to enable instant app creation allows us to prototype and test ideas quickly. As we continue hosting these workshops, we’re seeing an increase in participants familiar with the Alli LLM App Market, which will likely lead to faster and more practical implementations. We’re also planning an internal generative AI contest this fiscal year to further encourage innovation.
Mr. Nakamura: By leveraging the Alli LLM App Market, employees can validate their ideas and measure their effectiveness in real-time. This enables them to move beyond theoretical discussions and focus on assessing the real business impact of their applications. The ability to quickly iterate and refine ideas makes the platform a key enabler of innovation within the company.
A Flood of Inquiries About Generative AI Continued Even After the Workshops
What was the response to the workshops?
Mr. Fueta: We conducted four workshops, with about 60 participants attending the first session. To accommodate employees' schedules, we kept the workshops short—just two hours each. After the sessions, we set up a dedicated communication channel where participants could ask questions and share ideas. They were also encouraged to use their issued Alli LLM App Market accounts to develop apps for their respective workplaces. The volume of inquiries we received via the chat channel was overwhelming, showing the high level of interest and expectations surrounding the platform.
Mr. Nakamura: Anticipating frequent questions, we used the Alli LLM App Market itself to create a chatbot specifically for workshop inquiries. This allowed us to integrate the platform into the workshops’ operations, streamlining how we managed the influx of queries.
Mr. Fueta: In the community channel, participants asked proactive questions like, "How can I achieve this?" or "Is this type of application possible?" This eagerness to explore new possibilities demonstrated a strong enthusiasm for using generative AI to tackle practical challenges.
Supporting Departments with Quick Test App Creation and Rapid Problem-Solving
Could you share some examples of the apps you’ve developed using the Alli LLM App Market or ones you’re considering?
Mr. Fueta: In regional offices, we developed an app using Retrieval-Augmented Generation (RAG) to respond to general inquiries sent to the administrative department. We are currently testing its accuracy and operational effectiveness. Initially, the app will be used internally by staff managing these inquiries, allowing them to refine the content and register necessary information before a broader company-wide rollout. This app was also highlighted as a case study during one of our workshops.
Mr. Nakamura: We’ve built several applications, including one that generates Q&A from chat logs of inquiries between Docomo Shop staff and the Support Center (Agency Front Support Center), as well as from service manuals. Given the wide range of services we offer—such as communication services and "d-payment"—there is a significant demand for efficient Q&A creation. Automating this process using generative AI has proven highly effective for improving operational efficiency.
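As an illustration of this kind of automation, the sketch below prompts a model to distill reusable Q&A pairs from a short inquiry log and parses the JSON it returns. The sample log, prompt wording, and stubbed model call are assumptions made for the example, not the production app built in the Alli LLM App Market.

```python
# Illustrative sketch of generating Q&A pairs from an inquiry chat log.
# The log, prompt wording, and stubbed LLM call are assumptions for this example.
import json

CHAT_LOG = """\
Shop staff: A customer asks how to check their data usage this month.
Support Center: Guide them to the usage page in the official app.
Shop staff: Understood, thank you."""

def build_qa_prompt(log: str) -> str:
    """Ask the model to distill reusable Q&A pairs from a raw log."""
    return (
        "Extract frequently reusable Q&A pairs from the chat log below.\n"
        "Return a JSON list of objects with 'question' and 'answer' keys.\n\n"
        f"Chat log:\n{log}"
    )

def call_llm(prompt: str) -> str:
    # Stub standing in for a call to the LLM backend.
    return json.dumps([{
        "question": "How can a customer check this month's data usage?",
        "answer": "Guide them to the usage page in the official app.",
    }])

def generate_qa(log: str) -> list[dict]:
    """Run the prompt and parse the JSON the model returns."""
    return json.loads(call_llm(build_qa_prompt(log)))

if __name__ == "__main__":
    for pair in generate_qa(CHAT_LOG):
        print(f"Q: {pair['question']}\nA: {pair['answer']}")
```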
RAG Optimization in Alli LLM App Market Improved Answer Accuracy by 30%
What are your impressions after using the Alli LLM App Market?
Ms. Inako: I’m impressed by the simplicity and speed of app creation. For example, during a one-hour meeting with the Support Center staff, we built an app for generating Q&A from their logs. Using the app builder, I configured the prompts, developed a test app, and had it ready for immediate user testing. This ability to deliver results quickly is a significant advantage.
Mr. Fueta: While LLMs aren’t perfect, the Alli LLM App Market bridges gaps with practical features. For instance, it allows users to preview documents with highlighted sections or prioritize FAQ-based workflows. These features, like those in Tokyo Metro’s use case, improve usability and ensure the platform is genuinely practical for operational needs.
Mr. Nakamura: One standout example is the app we developed for generating Q&A from support logs. This app, which processes chat logs from Docomo Shop staff inquiries to the Support Center, has reduced Q&A creation time by approximately 10 hours per month. Additionally, the RAG functionality in the Alli LLM App Market has been exceptional. When we tested its Retriever optimization, we observed a 30% improvement in answer accuracy. Although our in-house platform also implements RAG, the Alli LLM App Market’s user-friendly approach, which lets users improve accuracy on their own, is a key advantage.
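The interview does not describe how the comparison was measured; one common way to quantify retriever changes is to track the top-k hit rate on a labeled question set, as in the hypothetical sketch below. The evaluation items and retriever functions are stand-ins, and the 30% figure above comes from NTT DOCOMO's own testing.

```python
# Hypothetical sketch of comparing retrieval accuracy before and after tuning.
# The evaluation set and retriever functions are stand-ins for illustration.
from typing import Callable, List, Tuple

# Each item pairs a question with the ID of the passage that answers it.
EVAL_SET: List[Tuple[str, str]] = [
    ("How do I reset my voicemail PIN?", "doc-017"),
    ("What is the monthly fee for plan X?", "doc-042"),
]

def hit_rate(retriever: Callable[[str, int], List[str]], k: int = 3) -> float:
    """Fraction of questions whose gold passage appears in the top-k results."""
    hits = sum(1 for q, gold in EVAL_SET if gold in retriever(q, k))
    return hits / len(EVAL_SET)

def baseline_retriever(question: str, k: int) -> List[str]:
    return ["doc-001", "doc-042", "doc-099"][:k]  # stubbed results

def tuned_retriever(question: str, k: int) -> List[str]:
    return ["doc-017", "doc-042", "doc-003"][:k]  # stubbed results

if __name__ == "__main__":
    before, after = hit_rate(baseline_retriever), hit_rate(tuned_retriever)
    print(f"hit@3 before tuning: {before:.0%}, after tuning: {after:.0%}")
```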
Key to Promoting Company-Wide Use of Generative AI: A Universal Platform and an Engaged Community
With the strong response to the workshops, many requests for generative AI utilization have come from various departments. How do you communicate with these teams?
Mr. Nakamura: The events organized by Mr. Fueta and his team have had a tremendous impact. In addition to training sessions and contests, we recently hosted "Generative AI Day," an event designed to provide comprehensive insights into generative AI. It covered updates on our LLM Value-Added Platform, internal use cases, and initiatives by LLM vendors. We also invited Mr. Ikegami from Allganize to present as a vendor. While we initially anticipated around 200 participants, the event exceeded expectations with over 1,000 attendees, highlighting the growing company-wide interest in generative AI.
Mr. Fueta: We were unprepared for such a large turnout, which speaks to the strong interest in generative AI across the organization. During Generative AI Day, we also showcased the Alli LLM App Market, which inspired participants. Many reached out afterward with requests such as, "I want to build an app like this," demonstrating their eagerness to explore the platform’s potential.
Mr. Nakamura: Beyond large-scale events, we also introduce the platform during individual departmental meetings, creating a more direct and approachable atmosphere. This strategy encourages teams to share ideas or experiment with the platform themselves. By fostering an open and supportive environment, we’ve successfully engaged more departments and generated widespread interest in leveraging generative AI.
Advice for Companies Starting to Use Generative AI: A Universal Platform and Active Community Engagement
What advice would you give to companies looking to promote generative AI usage company-wide?
Mr. Nakamura: It’s essential to create an environment where all employees can access and use generative AI without requiring special approvals. At our company, we’ve ensured the web app for our LLM Value-Added Platform is accessible to all departments. Providing a platform that’s readily available to everyone has been a key factor in driving widespread adoption.
Mr. Fueta: Beyond offering a platform, building a sense of community is equally important. We’ve established a community channel to share information, post quizzes, and encourage active participation. Lowering the barrier to entry and fostering consistent engagement are critical to sustaining interest and encouraging employees to integrate generative AI into their daily workflows.
How has Allganize supported you in promoting the use of the Alli LLM App Market?
Mr. Nakamura: Allganize has been instrumental in providing ongoing support through regular meetings and asynchronous chat, helping us quickly address and resolve any issues. Their responsiveness in implementing requested features, like app editing permissions, has been impressive. Additionally, the seamless integration between the Alli LLM App Market, our LLM Value-Added Platform, and our authentication services has enhanced our overall experience.
Mr. Fueta: I’m particularly looking forward to their individual app creation support services. I believe these services will be a valuable resource for helping our teams develop and implement more tailored solutions.
Expanding Effective Generative AI Applications Across the Company: Embracing New Technologies for Greater Operational Efficiency
What are your future plans?
Mr. Fueta: Our goal is to develop a wide range of applications using the Alli LLM App Market through workshops and expand their adoption across the organization. For example, apps designed to handle internal inquiries can be applied to any department or branch, offering versatility and clear benefits in implementation. By actively sharing successful apps and use cases, we aim to roll them out more broadly, driving even greater operational efficiency throughout the company.
Mr. Nakamura: We plan to continue gathering insights on generative AI usage and providing departments with practical measures to improve efficiency. While inquiry handling remains a primary focus, generative AI is also being used for tasks such as translation and meeting minutes creation, both of which have significantly boosted productivity. Currently, our generative AI initiatives are primarily supported by our in-house LLM Value-Added Platform, but we’ve also started leveraging the Alli LLM App Market to complement our efforts. Moving forward, we will adopt new technologies and pursue initiatives that ensure the most effective solutions are implemented across the company.
Thank you for sharing your valuable insights today!