DMS Utility Is Dying. Have Law Firms Noticed Yet?
There is a quiet crisis unfolding inside law firms, and it is playing out in the most mundane of places: the document management system. For decades, platforms like iManage, NetDocuments, and OpenText eDOCS have served as the foundational nervous system of law firm information management — the canonical repository where work product lives, where matter files are organized, where knowledge is theoretically accumulated and preserved. Partners swear by them. IT departments defend them. Governance committees mandate them. And yet, increasingly, no one actually wants to use them.
The reason is not apathy or laziness. The reason is that the actual knowledge work of lawyering is rapidly migrating to a new generation of AI-native platforms — tools like Harvey, Legora, CoCounsel, Syllo and others — that are operating on a fundamentally different conceptual foundation than the document management paradigm law firms spent the last thirty years building. The DMS, once the center of gravity for law firm information, is becoming an obstacle. In some workflows, it is becoming irrelevant entirely. And it is unclear whether the legal profession, characteristically slow to confront structural disruption, has begun to grapple with what this means.
The Document Management System as We Know It
To understand what is at stake, it helps to understand what traditional document management systems were actually designed to do. DMS platforms emerged in the 1990s to solve a specific, bounded problem: organizing and retrieving electronic documents in a matter-centric filing structure, with version control, access permissions, and audit trails. They were built around the metaphor of the physical file room — a place to store things, retrieve things, and ensure that things did not get lost or improperly accessed.
That metaphor served law firms reasonably well for a long time. iManage Work, which the company reports is relied on by more than one million professionals at 4,000 organizations worldwide, markets its platform as creating "a single source of truth for all documents, emails, and chat conversations" saved in "a clearly organized and flexible folder structure."¹ That is a good description of what a DMS does. It is also a good description of what an AI platform does not do, at least for now.
AI-native platforms entering law firm workflows operate on an entirely different conceptual foundation. They do not treat documents as files to be stored and retrieved. They treat documents as data to be analyzed, synthesized, and reasoned over. Harvey describes its platform as enabling teams to work across "document storage, legal research, deal management, due diligence, fund formation, contract analysis, and complex workflows,"² with documents serving as inputs into AI reasoning systems rather than as records in a filing structure. This is not a minor interface difference. It is an architectural incompatibility that goes to the heart of what a DMS is and what AI legal tools need.
The Integration Promise and Its Practical Limits
To their credit, the major AI legal platforms have invested substantially in attempting to bridge this gap through native integrations with major DMS providers. The published record of those integration efforts is illuminating — both for what it reveals about genuine progress and for what it reveals about the complexity of the problem.
Harvey announced a formal technology partnership and integration with iManage on June 5, 2025, committing to enable users "to seamlessly access documents from within iManage and return Harvey-generated work product back to the [iManage] platform."³ The open beta of that integration launched August 11, 2025, offering users the ability to "securely import documents directly from iManage into Assistant, Workflows, and Vault through an intuitive, embedded interface."⁴ The direct OAuth connection, Harvey's product blog notes, was designed to eliminate the need for "third-party middleware or manual credential management."⁴ Harvey has subsequently confirmed in its December 2025 year-in-review post that its current integration ecosystem spans iManage, NetDocuments, SharePoint, and Google Drive.⁵
Legora, the Sweden-based legal AI platform that raised $550 million in its most recent financing round, advertises on its homepage that it provides access to "your DMS content, all within Legora," citing "partnerships across jurisdictions and integrations with iManage and SharePoint."⁶ Thomson Reuters, through its CoCounsel product, announced the launch of CoCounsel Knowledge Search in July 2025, a capability designed to "access content from HighQ, iManage, NetDocuments, SharePoint, OneDrive, Thomson Reuters content from Westlaw and Practical Law, and third-party sources," addressing what Thomson Reuters characterized as the problem that "knowledge management systems and content repositories are fragmented and often lack basic integration capabilities."⁷
The marketing around these integrations is, as the press releases confirm, enthusiastic. The practical reality, which Harvey's own technical engineering team has documented in striking detail, is considerably more complicated.
In a technical blog post published October 29, 2025, Harvey's engineering team described building its iManage integration as requiring the resolution of significant challenges across "security, scaling, and networking."⁸ The post documented that iManage implements API rate limits that Harvey had to engineer around with a distributed Redis-based rate limiter to prevent large Vault file uploads from consuming API quota in ways that would degrade real-time user experience in other product areas.⁸ For on-premises iManage deployments — which remain common among the largest enterprise customers — Harvey documented the need to work with individual customers' infrastructure and security teams to design custom network paths to allow Harvey's API requests to reach iManage "without being hijacked by SSO redirects or blocked by firewalls."⁸ The most common pattern Harvey identified involved publishing iManage through Azure Application Proxy, a solution that requires meaningful configuration effort by the customer's IT team.⁸
Harvey's separate February 2026 technical post on its file ingestion system described the earlier state of its DMS integration more candidly: prior to building its current system, "customers experienced some issues as adoption grew," including the inability to process "extremely large uploads quickly," users having to "manually select large sets of individual files (not folders)," and the problem that files "eventually became stale" — when documents were updated in iManage, "they had to also update the same files in Harvey."⁹ Solving these problems required building an asynchronous workflow architecture capable of ingesting "hundreds of thousands of files" with automated folder sync, a non-trivial engineering undertaking.⁹
The bilateral sync problem is improving but still architecturally complex. Harvey's August 2025 integration announcement acknowledged that before the integration launched, "accessing and utilizing documents from iManage in Harvey involved manual steps, such as downloading and re-uploading files between the two platforms, slowing down workflows and introducing unnecessary complexity."⁴ The integration addressed this by enabling documents drafted in Harvey to be exported "back into specific iManage matters, complete with full versioning, metadata, and audit trails,"⁴ and CMS Netherlands' Head of Innovation described it as allowing work product to be exported "as a document back to the exact iManage workspace in a single click."⁴
That this is celebrated as a milestone — rather than treated as baseline functionality — is itself revealing. The very existence of that quote, and the engineering investment required to get there, tells the story of how hard genuine DMS integration actually is. Law firms in the market for AI-native, legal-specific software platforms should thus be deeply skeptical of vendor claims promising seamless integration with existing DMS infrastructure. The integrity of those claims should be vigorously tested during the procurement cycle, because the reality of the integration’s functionality may fall well short of “seamless.”
Does the DMS Just Get in the Way?
This brings us to the harder, more uncomfortable question: in an AI-native workflow, does the document management system add value, or does it simply add steps?
The honest answer, at least for certain classes of work, is that it increasingly adds steps. Consider a typical AI-assisted document review workflow in complex litigation. A team is using an AI platform to analyze large document populations for relevance and privilege, to identify key custodians, and to map the factual narrative. The underlying document population almost certainly lives in a review platform — Syllo, Relativity, Reveal, DISCO — not in the DMS. The AI analysis occurs in the AI platform. The work product is generated in the AI platform. The DMS sits to the side, nominally the system of record, while the actual matter intelligence accumulates in a constellation of third-party systems that do not natively communicate with it. Similarly, to the extent the AI-assisted workflow benefits from previously accumulated knowledge that lives within the DMS, that knowledge must be transported from the DMS to the review platform through cumbersome export-and-reimport workflows. Absent a functional and reliable native integration between the two systems, this process undercuts the very efficiency gains that AI adoption promises.
Or consider transactional work. A team using an AI platform to analyze a contract data room — identifying representations and warranties, flagging deviations from market standards, generating a due diligence summary — is working largely within the AI platform's environment. The documents may have come from a virtual data room, not the DMS. The outputs are work product that needs to be filed somewhere. Harvey's own blog for mid-sized law firms frames the problem precisely: "Without seamless connections between systems, lawyers are left manually bridging the gaps — downloading documents from a DMS, uploading them into AI tools, and transferring work product back into their matter files."¹⁰
The DMS, in these workflows, is functioning less as a knowledge management system and more as a compliance and filing system — a place where things go after the real work is done, not a place where work gets done. This is a significant functional demotion from the role it was designed to play. Thomson Reuters characterized this problem in its July 2025 Knowledge Search press release as the need to eliminate "the necessity to search their various document management systems, find and download documents, and then move them to the required portal."⁷ That this problem still required a press release announcement to address just eight months ago illustrates how far the profession remains from a seamlessly integrated multi-platform workflow.
The Security Dimension: A Problem No One Has Fully Solved
Any honest discussion of how information moves between AI-native platforms and the existing DMS has to confront critical data security questions. Those questions are substantial, and they have not yet been adequately addressed by the profession.
When documents move from a law firm DMS into a third-party AI platform, they traverse a security boundary. The DMS lives behind the firm's perimeter — on-premises or in a firm-controlled cloud environment, subject to the firm's information security policies, access controls, audit logging, and breach protocols. The AI platform lives somewhere else: in AWS, in Azure, in a vendor-controlled cloud environment governed by a term of service agreement and a data processing addendum.
The security posture of AI legal platforms varies considerably, but all the major providers are pursuing enterprise certifications. Harvey's security page discloses that it is hosted on Microsoft Azure, offers SOC 2 Type II and ISO 27001 certifications, contractually prohibits model providers from training on customer data, and offers data processing in the EU/Switzerland or Australia for customers with data localization requirements.¹¹ Legora publishes equivalent disclosures: ISO 27001 and SOC 2 Type II certifications, ISO 42001 AI governance certification, AES-256 encryption at rest, and a contractual commitment that "Legora will not use your data to train or fine tune any AI models."¹² Both platforms describe their approach in terms of enterprise-grade security architecture, zero-trust design principles, and robust access controls.
But even the most security-conscious AI platform represents a different risk profile than a firm-controlled DMS. Documents that leave the firm perimeter are subject to new threat surfaces: vendor breach risk, sub-processor exposure, data residency questions, and questions about how customer data flows through AI inference pipelines. This is not hypothetical risk.
The IBM Cost of a Data Breach 2024 Report, the latest annual study conducted by IBM and the Ponemon Institute, reported that the global average cost of a data breach had reached $4.88 million, the largest year-over-year increase since the pandemic.¹³ While the report is not specific to legal AI platforms, its documentation of third-party breach vectors is directly applicable to the integration pathways law firms are now creating between internal DMS infrastructure and external AI platforms.
The legal profession's obligation to protect client confidences under Model Rule 1.6 is well-established. The ABA's Formal Opinion 477R (2017) addressed the obligation to use reasonable efforts to prevent inadvertent disclosure when using electronic communications. More directly applicable is ABA Formal Opinion 512, issued in July 2024, which represents the ABA's first formal ethical guidance specifically addressing lawyers' use of generative AI tools. That opinion addresses the use of generative AI in the context of the duty of competence, confidentiality, supervision, and other ethical obligations under the Model Rules.¹⁴ Among the issues Opinion 512 addresses is the obligation attorneys have when sharing client information with third-party AI vendors — a question the opinion identifies as requiring careful analysis under Rule 1.6 and the supervisory obligations of Rules 5.1 and 5.3.
The compliance frameworks that law firms have built around DMS data governance — data classification, retention schedules, matter-level access controls — do not automatically extend to third-party AI platforms. As Harvey's own technical documentation of its iManage integration acknowledges, meaningful governance required the company to update its data model "such that it does not differentiate between file import sources," ensuring that "imported files and metadata follow the same segregation, retention, and compliance policies as any other customer data."⁸ That Harvey describes this as a design achievement — rather than a baseline assumption — underscores how much integration complexity remains the customer's governance problem to navigate.
The Knowledge Management Consequence
There is a dimension to this disruption that has received insufficient attention: the long-term knowledge management consequences of matter intelligence accumulating in platforms the firm does not control.
One of the theoretical value propositions of the DMS — a value proposition that, admittedly, few firms have ever fully realized — is that it allows a firm's institutional knowledge to accumulate over time. Precedent documents, negotiated terms, successful arguments, deposition outlines that worked — all of this, in theory, lives in the DMS and is searchable and accessible to attorneys working on future matters. Harvey acknowledged this institutional knowledge dimension explicitly in its February 2026 file ingestion post: "The best legal work doesn't happen in a vacuum. It builds on years of accumulated institutional knowledge: prior deal structures, successful motion templates, negotiation playbooks, and matter-specific expertise. Most often, this context lives in files stored in document management systems."⁹
But in an AI-native workflow, the most sophisticated analytical work product is being generated in third-party platforms, and the manner in which those AI sessions are structured — what was surfaced, what was synthesized, how a risk profile was characterized — is not itself a document that files back into the DMS. It is an artifact of the AI session. Harvey's response to this problem is its Vault product and the institutional knowledge sync infrastructure described in its February 2026 post — a continuous synchronization system designed to keep firm knowledge in Harvey current with DMS updates.⁹ But that infrastructure is still being built and extended, and it presupposes that the AI platform itself is where institutional knowledge will ultimately live — not the DMS. It also leaves open the question of who owns that knowledge and how it is transferred (if at all) if the business relationship between the third-party platform and the law firm terminates.
This is a profound knowledge management challenge that the profession has not yet fully confronted. The DMS, for all its limitations, represented an aspiration toward institutional memory. The AI-native workflow, as currently structured, runs the risk of producing extraordinary analytical capability at the individual matter level while simultaneously creating new dependencies on vendor platforms whose long-term continuity cannot be assumed.
What Comes Next
None of this means that AI legal platforms are wrong or that firms should resist them. They should not. The analytical capabilities these tools provide are genuinely transformative, and firms that fail to adopt them will find themselves at a competitive disadvantage that will compound over time. The efficiency gains, the analytical depth, the capacity to work across large document populations with speed and precision — these are real.
But the profession needs to confront, honestly and directly, the infrastructure consequences of this transition. The DMS, as currently architected, is not fit for purpose as the sole organizing structure for an AI-native workflow. The integrations — while meaningfully more capable than they were two years ago, as the Harvey, Legora, and CoCounsel integration announcements demonstrate — still require significant engineering effort, customer-side IT configuration, and governance design to function as genuine bidirectional systems of record rather than one-directional document pipelines. Harvey's own engineers published a detailed technical account of the distributed rate limiting, OAuth architecture, network engineering, and asynchronous workflow orchestration required to make a single DMS integration work properly at enterprise scale.⁸ ⁹ This is not a problem that resolves itself through a simple connector install.
The firms that navigate this transition most successfully will be those that stop pretending the DMS can be patched into relevance through vendor integrations alone and start asking harder questions: What is the actual system of record for AI-assisted work product? How do we ensure that matter intelligence generated in third-party platforms is captured and preserved consistently? What does a security governance framework for multi-platform information workflows actually look like in light of ABA Formal Opinion 512's guidance on confidentiality obligations? How do we satisfy our professional obligations to protect client information across a vendor ecosystem whose security posture requires ongoing due diligence?
These are not technology questions. They are governance questions, professional responsibility questions, and strategic questions. The answers will define which firms are still recognizable as knowledge institutions in a decade and which have simply become document processors with expensive AI subscriptions.
The document management system is not going to die tomorrow. But the workflow it was built to support is already dying around it. The firms that wait for the DMS vendors to solve this problem may well be waiting a long time.
Joshua Upin is a litigation partner and head of the e-discovery practice at Philadelphia-based Royer Cooper Cohen Braunfeld LLC. He also co-chairs the firm's technology and innovation committee.
This article represents the personal views of the author and does not constitute legal advice.
Endnotes
1. iManage, iManage Work: Drive Productivity With Work 10, imanage.com/imanage-products/document-email-management/work/ (last visited Mar. 26, 2026) (quoting published product web copy: "iManage Work is relied on by more than one million professionals at 4,000 organizations around the world").
2. Harvey, Harvey | AI Platform for Legal and Professional Services, harvey.ai (homepage) (last visited Mar. 26, 2026).
3. Harvey / iManage, Harvey Announces Technology Partnership with iManage (June 5, 2025), reported at imanage.com/technology-partners/harvey/ and harvey.ai blog.
4. Harvey, Harvey's iManage Integration, harvey.ai/blog/harveys-imanage-integration (Aug. 11, 2025) (all quoted customer statements from published blog; quoting Winston Burt, Director of Legal Technology, Ropes & Gray; Ashton Batchelor, Chief Innovation & Value Officer, Blank Rome; Bert Vries, Head of Innovation & IT, CMS Netherlands).
5. Harvey, Harvey's Top 5 Product Releases of 2025, harvey.ai/blog/top-5-product-releases-of-2025 (Dec. 30, 2025) ("Internal data from Vault, local file uploads, or our integrations with iManage, NetDocuments, SharePoint, and Google Drive").
6. Legora, Homepage, legora.com (last visited Mar. 26, 2026) ("Access up-to-date information, legal databases, and your DMS content, all within Legora. With partnerships across jurisdictions and integrations with iManage and SharePoint, everything lawyers need is now in one place").
7. Thomson Reuters, Thomson Reuters Launches CoCounsel Knowledge Search – An AI-Powered Experience to Manage Content and Institutional Knowledge (Press Release, July 9, 2025), available at thomsonreuters.com (quoting Rawia Ashraf, head of Product, CoCounsel Transactional and Corporates, Thomson Reuters).
8. Harvey Engineering Team (Reggie Cai, George Tamer, Ken Chen, Elaine Lu, Sandeep Uppaluri, and Abhishek Verma), Building Harvey's iManage Integration, harvey.ai/blog/building-harveys-imanage-integration (Oct. 29, 2025) (quoting John Jovanovski, Head of AI & Cyber, Clayton Utz, on on-premises integration).
9. Reggie Cai, Building a New File Ingestion System to Scale Firm Knowledge, harvey.ai/blog/building-new-file-ingestion-system-to-scale-firm-knowledge (Feb. 11, 2026).
10. Harvey, Why an Integrated AI Platform Matters for Mid-Sized Law Firms, harvey.ai/blog/integrated-ai-platform-for-mid-sized-law-firms (Mar. 18, 2026).
11. Harvey, Security: For the Most Sensitive Matters, harvey.ai/security (last visited Mar. 26, 2026).
12. Legora, Security and Compliance, legora.com/security (last visited Mar. 26, 2026).
13. IBM and Ponemon Institute, Cost of a Data Breach Report 2024, available at ibm.com/reports/data-breach (2024) (reporting global average breach cost of $4.88 million, the largest year-over-year increase since the pandemic).
14. American Bar Association, Formal Opinion 512: Generative Artificial Intelligence Tools (July 29, 2024); see also ABA, Formal Opinion 477R: Securing Communication of Protected Client Information (May 22, 2017); ABA Model Rules of Professional Conduct, Rule 1.6 (Confidentiality of Information), Rules 5.1 and 5.3 (Supervisory Obligations).