
Building an AI-First Bank Culture

Key takeaways from JPMorgan Chase CAO Derek Waldron’s conversation with McKinsey on LLM Suite, the two-pillar AI strategy, workforce transformation, and emerging risks in agentic AI.

Source: McKinsey Financial Services Practice, October 2025


The Numbers

  • JPMorgan Chase is set to spend $18 billion on technology in 2025
  • Nearly 250,000 employees have access to LLM Suite, the bank’s flagship gen AI platform
  • Close to half of all JPMorgan employees use gen AI tools every day
  • Since its AI program’s inception, gross benefits attributed to AI have grown at 30-40% per year

Most employees would say 2024 was the year they developed a personal relationship with AI. Waldron describes LLM Suite’s impact as nothing short of a cultural transformation.


LLM Suite: From Chatbot to Ecosystem

LLM Suite launched as a chatbot interface powered by leading third-party LLMs. It has since evolved into what Waldron describes as a full ecosystem — an AI-connected enterprise with intelligence at the center, linked to team knowledge systems, firm-wide data, applications, and tools for presentations, analysis, and reports. Every few weeks a new data set or application connection is added, expanding the problem space that can be addressed.

The rollout was opt-in, not mandated. Phased onboarding created healthy internal competition and a fear-of-missing-out dynamic that accelerated organic adoption. Data security was the paramount consideration before launch; the bank invested in change management and training alongside the technology.


Two-Pillar AI Strategy

  • Top-down: Strategic focus on the domains with the most transformative value: credit, fraud, marketing, technology development, operations, and frontline banker enablement. Cross-functional teams reimagine end-to-end journeys using the right mix of RPA, predictive AI, gen AI, and agentic AI.
  • Bottom-up: Federated, self-service innovation where employees use LLM Suite for day-to-day productivity. The long tail of job families, too numerous for formal prioritization, is addressed through democratized tools where the cost of experimentation is near zero.

Waldron notes that bottom-up productivity gains create organizational capacity and operating leverage, but they don’t automatically translate to cost takeout. True cost reduction or end-to-end metric improvements (like 80% faster response times) require top-down journey reimagination.


Training Segments

  • All Employees: AI Made Easy. A branded, scaled training program covering what LLMs can and cannot do, prompt construction, and practical daily use. Supported by marketing campaigns, town halls, and peer-to-peer sharing (prompt libraries, “prompt of the week” emails).
  • Technologists: Building with agentic AI. New frameworks, capabilities, methodologies, and risk considerations for building sophisticated AI-native applications.
  • Data Scientists: System-level skills. Shifting from building models from scratch to designing, evaluating, and optimizing end-to-end probabilistic systems using powerful third-party models.
  • Executives: Operating model reimagination. Leading cross-functional teams through transformation; value from gen AI requires business leaders to redesign processes and functions, not just deploy tools.

Emerging Roles

Prompt Engineer → Context Engineer

Prompt engineers translate business logic into instructions LLMs can execute — the first new job category to emerge from gen AI.

Context engineers assemble all necessary context into AI systems for correct decision-making — a broader, more strategic evolution of prompt engineering.

Waldron also identifies knowledge management as an emerging job family: curating and structuring institutional knowledge so AI systems can navigate it clearly and avoid errors.
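The context engineer's job can be made concrete with a small sketch: gather firm policy, curated team knowledge, and live data connections into one bundle, then flatten that bundle into the prompt the model actually sees. This is an illustrative design only; the names (`ContextBundle`, `assemble_context`) and structure are assumptions, not part of LLM Suite or any real JPMorgan API.

```python
from dataclasses import dataclass, field

@dataclass
class ContextBundle:
    """Hypothetical container for the context an LLM needs before
    it sees the user's request."""
    system_policy: str                                        # firm-wide guardrails
    team_knowledge: list[str] = field(default_factory=list)   # curated documents
    live_data: dict[str, str] = field(default_factory=dict)   # data/app connections

    def to_prompt(self, user_request: str) -> str:
        """Flatten the bundle into a single tagged prompt string."""
        parts = [f"[POLICY] {self.system_policy}"]
        parts += [f"[KNOWLEDGE] {doc}" for doc in self.team_knowledge]
        parts += [f"[DATA:{name}] {value}" for name, value in self.live_data.items()]
        parts.append(f"[REQUEST] {user_request}")
        return "\n".join(parts)

def assemble_context(user_request: str) -> str:
    # In practice each source would be retrieved dynamically (search,
    # entitlement-filtered data pulls); here they are hard-coded examples.
    bundle = ContextBundle(
        system_policy="Answer only from the supplied documents.",
        team_knowledge=["Q3 credit-risk playbook, section 2"],
        live_data={"portfolio": "total exposure: $1.2B"},
    )
    return bundle.to_prompt(user_request)
```

The point of the sketch is the shift in emphasis: a prompt engineer writes the `[REQUEST]` instruction, while a context engineer designs everything assembled around it.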


Industry Economics

Global banking generated roughly $1.2 trillion in profits in 2024 with a 10.3% return on tangible equity (ROTE), just above the cost of capital. The industry spends about $600 billion annually on technology. McKinsey’s central scenario now estimates $700 billion in cost savings available to banks that adopt AI thoughtfully, up from the $200-340 billion estimated in 2023. However, much of that will be competed away as customers use AI to reduce switching friction.

AI pioneers in banking could see ROTE increase by up to four percentage points, while slow movers are likely to see declines. Entry-level employment (ages 22-25) in AI-exposed occupations saw a 6% decline from late 2022 to mid-2025 according to Stanford/ADP research.


Key Risks

  • Shadow IT: Consumer-grade AI tools lack enterprise guardrails. Without an internal platform like LLM Suite, employees may input sensitive data into uncontrolled tools.
  • Deepfakes: CEO/CFO fraud and spear phishing are growing more sophisticated and frequent. Malicious use of AI is a rising concern for financial institutions.
  • Agent Access: Identity and access management frameworks need to be uplifted for a world where agents access systems, applications, and other agents; credential passing is a real problem.
  • Trust at Scale: Autonomous multi-step analysis raises auditability questions. As systems run for minutes or hours, humans need new ways to verify outputs. Complacency risk increases as accuracy approaches 95%.
  • IP / Training: Ongoing litigation over how data was used to train LLMs. A licensing solution (like ASCAP/BMI for music) may be needed.
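One common mitigation for the credential-passing problem is to avoid handing agents long-lived user credentials at all: the platform instead mints a short-lived token scoped to a single downstream system and a narrow action set. The sketch below is an illustrative pattern, not JPMorgan's actual IAM design; the function names, claims layout, and HMAC signing scheme are assumptions for demonstration.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-signing-key"  # in practice: an HSM/KMS-managed key, never hard-coded

def mint_agent_token(agent_id: str, resource: str, actions: list[str],
                     ttl_seconds: int = 300) -> str:
    """Issue a short-lived token bound to one resource and explicit actions."""
    claims = {
        "sub": agent_id,
        "aud": resource,                       # valid for one system only
        "scope": actions,                      # narrow, explicit permissions
        "exp": int(time.time()) + ttl_seconds, # expires quickly
    }
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_agent_token(token: str, resource: str, action: str) -> bool:
    """Check signature, audience, scope, and expiry before granting access."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return (claims["aud"] == resource
            and action in claims["scope"]
            and claims["exp"] > time.time())
```

The design choice worth noting is the audience (`aud`) claim: even a stolen token cannot be replayed against a different system, which limits the blast radius when agents pass tokens among themselves.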

Waldron predicts that AI will make everyone a manager: the way we work with and oversee AI systems will come to resemble how we manage people today.


The Bottom Line

JPMorgan’s playbook combines top-down strategic focus on high-value domains with bottom-up democratization through self-service tools. The next horizon of value isn’t more adoption — it’s deeper connectivity (linking more data, applications, and systems into the AI ecosystem) and end-to-end journey reimagination. Waldron’s framing: the competitive advantage isn’t the model, it’s the institutional fabric wired around it.
