JFrog - Experts & Thought Leaders

Latest JFrog news & announcements

JFrog launches secure MCP server for AI integration

JFrog Ltd, the Liquid Software company and creators of the award-winning JFrog Software Supply Chain Platform, today unveiled a new Model Context Protocol (MCP) Server. This architecture enables Large Language Models (LLMs) and AI agents to securely interact with tools and data sources within the JFrog Platform directly from MCP clients, including popular agentic coding development environments and IDEs, boosting developer productivity and streamlining workflows.

"The developer tool stack and product architecture have fundamentally changed in the AI era. With the launch of the JFrog MCP Server, we're expanding the open integration capabilities of the JFrog Platform to seamlessly connect with LLMs and agentic tools," said Yoav Landman, Co-Founder and CTO, JFrog. "This allows developers to natively integrate their MCP-enabled AI tools and coding agents with our Platform, enabling self-service AI across the entire development lifecycle, which helps increase productivity and build smarter, more secure applications, faster."

AI automation helps simplify complex queries that once required advanced developer knowledge

The Model Context Protocol (MCP) is an open, industry-standard integration framework designed to connect AI systems with external tools, data, and services. With JFrog's MCP Server, developers can now use natural language commands like "Create a new local repository" or "Do we have this package in our organisation?" to interact with the JFrog Platform directly from their IDE or AI assistant. Teams gain immediate awareness of open-source vulnerabilities and software package usage without context switching, saving developers time. AI automation also helps simplify complex queries that previously required advanced developer knowledge, helping all teams work smarter and faster.
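Under the hood, MCP is a JSON-RPC 2.0 protocol, so a natural-language request like "Do we have this package in our organisation?" is ultimately resolved by the client into a structured tool invocation sent to the server. The sketch below builds such a `tools/call` message in Python; the tool name `check_package` and its argument shape are hypothetical illustrations, not JFrog's actual tool schema.

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialise an MCP 'tools/call' request as a JSON-RPC 2.0 message.

    'tools/call' is the standard MCP method for invoking a server-side
    tool; the specific tool name and arguments are illustrative only.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": tool_name,
            "arguments": arguments,
        },
    }
    return json.dumps(message)

# Hypothetical tool and arguments -- the real JFrog tool schema may differ.
payload = build_tool_call(1, "check_package", {"package": "lodash", "version": "4.17.21"})
print(payload)
```

The MCP server executes the named tool and returns a JSON-RPC response with the same `id`, which the client's LLM then turns back into a natural-language answer.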
JFrog Security Research Team

While remote MCP servers can help facilitate rapid code iteration and improve software reliability, they are not without risk. The JFrog Security Research Team recently discovered vulnerabilities, such as CVE-2025-6514, that could hijack MCP clients and execute remote code, potentially leading to severe consequences. This is another reason why JFrog's MCP Server is designed with security in mind and relies exclusively on trusted connection methods, such as HTTPS.

JFrog's MCP Server securely provides:

- Essential Tools for Gaining Software Package Insights: Users can leverage a base toolset to create and manage projects and repositories, view build status, and query detailed package and vulnerability information.
- Centralised, Cloud-Native MCP Server with Automatic Updates: Available to JFrog SaaS customers and multi-tenant environments, JFrog's MCP Server is implemented as a remote, secure server available in all JFrog cloud environments, providing automatic version updates and improvements with less maintenance.
- Secure OAuth 2.1 Authentication: Enforces modern token-based authorisation with scoped access per tenant and tool, ensuring all operations are authenticated and performed under the identity of the end user.
- Production-Grade Monitoring: Comprehensive logging and event tracking for actionable insights into tool usage.

JFrog's new MCP Server for the JFrog Platform is now available for developers to test and provide feedback during a preview period. For more information and to get started, check out the blog or visit the AWS Marketplace. Interested parties can also check out this step-by-step guide on how to get the JFrog MCP client up and running quickly.
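The two transport rules above (HTTPS-only connections and OAuth bearer tokens) can be sketched on the client side as shown below. This is a minimal illustration using only the Python standard library; the server URL is a placeholder, not a real JFrog endpoint, and obtaining the scoped OAuth 2.1 token itself is out of scope here.

```python
from urllib.parse import urlparse
from urllib.request import Request

def make_mcp_request(server_url: str, access_token: str, body: bytes) -> Request:
    """Prepare an authenticated POST to a remote MCP server.

    Refuses any non-HTTPS URL and attaches the OAuth access token as a
    Bearer credential, mirroring the trusted-connection rules described
    above. The request is only constructed here, not sent.
    """
    if urlparse(server_url).scheme != "https":
        raise ValueError("refusing non-HTTPS MCP server URL")
    return Request(
        server_url,
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",  # scoped OAuth 2.1 token
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder tenant URL and token for illustration.
req = make_mcp_request("https://example.jfrog.io/mcp", "TOKEN", b"{}")
print(req.get_header("Authorization"))
```

Rejecting plain-HTTP URLs up front closes off the downgrade path that a hijacked or misconfigured client might otherwise take.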

JFrog and NVIDIA: Unified AI software innovation

JFrog Ltd, the Liquid Software company and creators of the award-winning JFrog Software Supply Chain Platform, today announced the integration of its foundational DevSecOps tools with the NVIDIA Enterprise AI Factory validated design. JFrog will serve as the cornerstone software artifact repository and secure model registry for the landmark agentic AI architecture.

Following a successful NVIDIA NIM integration with the JFrog Platform, this new collaboration delivers a full-spectrum MLOps solution, designed to ensure scalable, secure, and seamless deployment of AI-powered applications using the NVIDIA Blackwell platform.

"The future of AI depends not only on innovation, but on trust, control, and seamless execution," said Shlomi Ben Haim, CEO and Co-Founder of JFrog. "To deliver AI at scale, enterprises need to adopt the same concepts applied to software: developer-friendly workflows, strong security, robust governance, and full lifecycle management. ML models are binaries, and they must be managed as first-class software artifacts. That's why we're excited to partner with NVIDIA to bring JFrog's Software Supply Chain Platform, as the single source of truth for all software and AI assets, to the NVIDIA Enterprise AI Factory, so organisations can build and scale trusted AI solutions with confidence."

Delivering critical infrastructure to enable future AI innovation

The JFrog Platform provides customers with a "single source of truth" for software components within the NVIDIA Enterprise AI Factory, which contains an integrated and validated suite of software technology solutions enterprises can use to develop, deploy, and manage agentic AI, physical AI, and HPC workloads on-premises.
This validated design aims to give organisations full control of their data and the ability to operate advanced AI agents in a secure environment. Key capabilities include:

- Secure & Governed Software Component Visibility: Enables all ML models, engines, and software artifacts to be scanned for security issues, versioned, governed, and traced across the entire software development lifecycle.
- End-to-End Software Artifact & ML Model Management: Enables the seamless pulling, uploading, and hosting of AI models and datasets, AI containers, Docker containers, and dependencies optimised for the NVIDIA Enterprise AI Factory validated design.
- Rapid, Trusted AI/ML Application Provisioning in Runtime: Simplifies configuration of AI environments by eliminating the need for runtime environments to pull components from outside the organisation, thanks to the universality, proven scalability, and robustness of JFrog Artifactory.
- Future-proofed for Evolving GenAI Applications: Quickly and easily manages ML model versioning and upgrades to new and approved model generations.

"Enterprises building AI factories need to manage the complexity of AI adoption while ensuring performance, governance, and trust," said Justin Boitano, Vice President, Enterprise AI Software Products, NVIDIA. "JFrog's unified software supply chain platform, paired with the NVIDIA Enterprise AI Factory validated design, enables rapid, responsible AI innovation at scale."

NVIDIA Blackwell systems

The integration is designed to enable the JFrog Platform to run natively on NVIDIA Blackwell systems to help reduce latency and process tasks with unparalleled performance, efficiency, and scale.
It supports a wide range of AI-enabled enterprise applications, agentic and physical AI workflows, autonomous decision-making, and real-time data analysis across various industries, including financial services, healthcare, telecommunications, retail, media, and manufacturing. Additionally, the system leverages NVIDIA’s engineering know-how and partner ecosystem to help enterprises accelerate time-to-value and mitigate the risks of AI deployment.
