MCP Archives - WAV Group Consulting https://www.wavgroup.com/category/mcp/ WAV Group is a leading consulting firm serving the real estate industry. Thu, 22 Jan 2026 23:19:57 +0000 en-US hourly 1 https://wordpress.org/?v=6.9 https://www.wavgroup.com/wp-content/uploads/2017/03/cropped-favicon-32x32.png MCP Archives - WAV Group Consulting https://www.wavgroup.com/category/mcp/ 32 32 MLS Data, AI, and the Line Between Innovation and Risk https://www.wavgroup.com/2026/01/23/mls-data-ai-and-the-line-between-innovation-and-risk/?utm_source=rss&utm_medium=rss&utm_campaign=mls-data-ai-and-the-line-between-innovation-and-risk Fri, 23 Jan 2026 16:00:33 +0000 https://www.wavgroup.com/?p=53874 As AI adoption accelerates across real estate, MLS data sits at the center of both opportunity and risk. MCP is emerging as a key safeguard, helping the industry innovate responsibly while protecting critical data assets.

The post MLS Data, AI, and the Line Between Innovation and Risk appeared first on WAV Group Consulting.

]]>
Where MCP becomes the line of defense for MLS data in an AI-driven world.

 

MLS executives are right to be cautious when agents, brokers, teams, or third-party listing websites connect artificial intelligence to MLS data. That concern is not resistance to innovation. It is stewardship of the MLS data that is fundamental to the brokerage cooperative.

MLS data is not just information. It is the shared intellectual property of the brokerage cooperative and the foundation on which every MLS operates. When AI systems are poorly designed or loosely governed, they can quietly erode that foundation by learning from MLS data and repurposing it in ways that violate copyright, data license agreements, and broker trust.

This tension defines the current moment. MLSs are expected to enable innovation while simultaneously protecting the broker asset they were created to serve naturally and without favor.

Why AI Creates a New Class of Data Sovereignty Risk

Traditional software consumes MLS data in predictable ways. Search, display, analytics, and reporting are governed by long-standing rules around access, storage, and attribution.

AI introduces a fundamentally different risk profile.

When an AI system is allowed to train on MLS data, the data is no longer just being queried. It is being absorbed into the internal weights of a model. Once that happens, the value of the MLS data can be reconstructed, inferred, or redeployed outside the MLS ecosystem, often without visibility or control.

This is the core data sovereignty concern facing MLSs today:

  • MLS data can be transformed into derivative intelligence that lives outside MLS governance
  • Copyright protections become difficult to enforce once data is embedded in a trained model
  • Data license restrictions can be unintentionally violated through model reuse or redistribution
  • The cooperative asset of brokers risks becoming a permanent input to third-party AI platforms

In short, AI can turn a shared broker asset into an uncontained resource if safeguards are not designed from the start.

Innovation Is Not Optional. Exposure Is.

MLSs cannot simply block AI. Many agents and consumers increasingly expect smarter search, conversational interfaces, and more intuitive discovery tools. The challenge is not whether innovation should happen, but how it happens.

This is where architectural intent matters.

A well-designed AI system can enhance consumer experience without ever learning MLS data. A poorly designed one can permanently compromise it.

Natural Language Search, Explained Simply

One of the most visible and valuable AI use cases in real estate is natural language search.

Natural language search allows consumers to search the MLS the way they speak or think, rather than forcing them into rigid filters and dropdowns.

Instead of selecting city, beds, baths, price, and property type manually, a consumer can type or say:

  • “A ranch-style home with a pool near good schools in Austin”
  • “Two-bedroom condos in Arlington and Alexandria close to metro stations”
  • “Homes in Santa Monica within a 15-minute walk to Whole Foods”

The breakthrough is not that the MLS data changes. The breakthrough is that large language models interpret conversational intent and translate it into a structured search query that operates across the MLS dataset. The AI acts as an interpreter, not an owner of the data. This is the method deployed by pioneer Howard Hanna Real Estate Services; at Cribio.com (which is the Broker Public Portal’s industry initiative); and Homes.com.

Conversational Search Without Training the Data

This distinction matters.

In a compliant implementation, the large language model does not study MLS data, store it, or improve itself using it. Instead, it performs a transient task:

  • It receives a short, temporary prompt describing the user’s request
  • It converts that request into a structured search query
  • It passes that query to the MLS-backed search system
  • It forgets everything immediately after execution

The model behaves like a translator with no memory, not a student with a notebook.

A Practical Example: Homes.com Smart Search

Homes.com provides a useful reference point for MLS leaders evaluating how AI can be deployed responsibly.

Homes.com launched its Smart Search feature in October 2025 using a natural language interface built in partnership with Microsoft through the Azure OpenAI Service. From the outset, the system was engineered to comply with IDX rules, MLS data licenses, and broker copyright protections.

Several architectural decisions are worth highlighting.

Data Isolation and Residency

According to Andy Woolley, Homes.com operates Smart Search inside a private Microsoft Azure tenant. MLS listing data never leaves the Homes.com environment and is isolated from the public internet. The AI does not crawl, scrape, or independently access MLS data. It only sees data passed through secure internal APIs for seconds at a time.

No Model Training, Ever

Under Homes.com’s enterprise agreement with Microsoft, MLS data is never used to train, fine-tune, or improve any external third-party AI model. The model is static and frozen. It cannot learn prices, addresses, or patterns across the MLS dataset. This is governance operating at the server level.

Stateless Execution

The Smart Search AI is intentionally designed with amnesia. It has no memory of prior queries and no ability to build cumulative understanding of the MLS. Once a query is processed, the data disappears from the model’s context entirely. Apple’s Siri works the same way. It’s a decision that delivers trust and privacy.

IDX and Attribution Compliance

Search results generated through Smart Search are programmatically contained by the same IDX display rules as traditional search. Broker attribution, display controls, and domain restrictions remain intact, ensuring that AI-enhanced results do not bypass existing MLS governance, IDX policy, or data license restrictions.

The Stewardship Challenge for MLS Leaders

The Homes.com example demonstrates a critical point. AI does not have to threaten MLS data sovereignty. The Homes.com model is a version of the architecture and policy governed rule set that MLSs should model in the delivery of their gateway for agents and brokers to access MLS records using AI. 

The real risk emerges when AI is connected casually, without architectural guardrails, or through consumer-grade tools that were never designed for licensed, copyrighted data. This is happening in abundance today, and MLS records are being shared with AI though unrestricted gateways that live on replicated data sets living outside of the MLS listing infrastructure.

For MLSs, the path forward requires discipline:

  • Demand clarity on whether AI functionality deployed by licensed data recipients allow AI systems to train on MLS data (data leakage)
  • Require stateless, transient processing for conversational AI
  • Ensure data residency and isolation within controlled environments (the “walled garden” approach)
  • Treat MLS data as a protected cooperative asset, not just an input
  • Encourage innovation that enhances search results without extracting data from the dataset

Why MLSs Must Move Quickly on MCP Servers

This discussion ultimately leads to a more urgent conclusion for MLS leadership. MLSs must move quickly to provide Model Context Protocol (MCP) servers as part of their core infrastructure strategy.

Until MLSs provide sanctioned MCP servers, vendors, brokers, teams, and agents who want AI capabilities have little choice but to design their own data architectures downstream of the MLS. Today, there are no hard stated restrictions that forbid vendors from replicating the IDX data to their servers and allowing AI to train on the data. That fragmentation is not just inefficient, it erodes the value of the data by allowing any AI to extract whatever it wants. The MLS never knows about the extraction because it is happening on data repositories that it only controls by the data license agreement.

When AI connections are built outside of MLS-controlled environments, the MLS loses visibility into how data is accessed, processed, and protected. Each independent implementation introduces variability in compliance discipline, security standards, and architectural rigor. Over time, that variability compounds risk.

Perhaps the greatest emerging liability in real estate today is the unharnessed adoption of AI downstream of the MLS.

The Downstream Risk MLSs Cannot Ignore

AI adoption is accelerating whether MLSs are ready or not. Agents and brokers are experimenting with consumer-grade tools. Vendors are racing to differentiate with AI features. Development teams are building AI agent workflows that connect MLS data in new ways.

Without MLS-provided MCP servers:

  • Vendors must replicate MLS data to create their own AI data pipelines to remain competitive
  • MLSs lose the ability to enforce consistent guardrails at the point of AI interaction
  • Data access patterns become opaque and difficult to audit
  • Compliance becomes reactive instead of architectural

The danger is not theoretical. If even a single MLS data feed is accidentally exposed to a training-enabled large language model, the consequences may be irreversible. Once data is learned by a model, it cannot be reliably unlearned. A single leak to one or two models could permanently compromise the value of the cooperative asset.

This is happening today at scale off of data collected by search engine website crawlers that were designed for indexing websites so search engines could link to pages. Microsoft’s own generative AI models and partners like OpenAI can and do use the Bing index for training as well as for real-time retrieval (grounding). 

Here is a breakdown of how AI uses the Bing index:

  • Training Foundation Models: Microsoft has indicated that web content in the Bing Index may be used to train their generative AI foundation models.
  • Retrieval-Augmented Generation (RAG): AI tools like Copilot and ChatGPT use Bing to ground their responses, meaning they search the index in real-time to provide up-to-date, accurate information.
  • Data Usage Controls: Site owners can control this, however. Content without NOCACHE or NOARCHIVE tags can be used for both Bing Chat answers and training. If content is tagged NOCACHE, it may still be used in chat, but only URLs, Titles, and Snippets are used in training. Content tagged NOARCHIVE is not used for either.

If IDX data license agreements required that site owners displaying IDX data deploy NOARCHIVE tags, this consequential data leakage could be resolved. WAV Group believes that the best policy would only allow the listing firm to drop the NOARCHIVE tag on their listings. The listings of other firms would require the NOARCHIVE tag.

MCP Servers as the New Line of Defense

“MCP Guards Data” Access flows only with permission—MCP servers enforce controlled tool usage. SECURITY. PERMISSIONS. GUARDRAIL. CONSENT. SAFE. CONTEXT. TRUST.MCP servers give MLSs a way to reassert control without blocking innovation.

By providing an MLS-controlled interface for AI interaction, MCP servers allow MLSs to:

  • Act as the authoritative broker of context, not just data
  • Restrict access to participants and subscribers through existing login protocols
  • Enforce stateless, non-training execution by design
  • Maintain data residency and license compliance
  • Standardize how AI tools safely interact with MLS systems
  • Enable innovation without surrendering sovereignty

In this model, the MLS defines the rules of AI engagement.

The Architectural Moment MLSs Cannot Miss

The approach demonstrated by Homes.com shows what is possible when AI is engineered deliberately. Private infrastructure, stateless execution, zero-training guarantees, and strict license compliance are not obstacles to innovation. They are prerequisites for trusting that the data brokers contribute to the MLS benefits the cooperative.

MLSs now face a similar architectural moment.

Either the MLS becomes the secure, compliant gateway through which AI interacts with listing data, or that role will be filled by dozens of downstream implementations, each with no supervision, uneven controls, and collective risk of exposing data outside of the control of data license agreements.

The question is no longer whether AI will touch MLS data. It already is.

The real question is whether MLSs will lead that connection through thoughtful new AI usage rules and MCP servers, or whether they will be left trying to contain the consequences after the fact.

Stewardship, speed, and architectural intent now matter more than ever. Reach out below if you’re interested in getting started.

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post MLS Data, AI, and the Line Between Innovation and Risk appeared first on WAV Group Consulting.

]]>
AI in the MLS: How Paragon is Building the Future of Real Estate Data Infrastructure https://www.wavgroup.com/2025/11/03/ai-in-the-mls-how-paragon-is-building-the-future-of-real-estate-data-infrastructure/?utm_source=rss&utm_medium=rss&utm_campaign=ai-in-the-mls-how-paragon-is-building-the-future-of-real-estate-data-infrastructure Mon, 03 Nov 2025 16:00:39 +0000 https://www.wavgroup.com/?p=53011 A conversation with ICE's Lucie Fortier and Jeff Fullbright reveals how AI is reshaping MLS platforms and what it means for data governance, agent adoption, and competitive positioning.

The post AI in the MLS: How Paragon is Building the Future of Real Estate Data Infrastructure appeared first on WAV Group Consulting.

]]>
The conversation around artificial intelligence in real estate has moved beyond speculation. MLSs are now facing practical questions: Which AI features should we deploy? How do we protect our data? And what infrastructure decisions will define our role in an AI-driven market?

To explore these questions, we sat down with Lucie Fortier and Jeff Fullbright from Intercontinental Exchange (ICE), the company behind Paragon® Connect MLS, a platform serving MLSs across the country. Our discussion covered everything from agent-facing AI tools to emerging data access models like MCP servers, and what “AI sovereignty” really means for MLSs.

Why This Matters Now

Paragon isn’t new to AI integration—they’ve worked with partners like Restb.ai for years. But the current moment represents something different. As Fortier and Fullbright explain in our conversation, MLSs are moving from curiosity to deployment, and the questions they’re asking have evolved from “Should we use AI?” to “How do we govern it?”

The interview explores:

  • Agent use cases: Which AI-powered features are being prioritized, listing input automation, CMA creation, client assistance and how they’re being integrated into daily workflows
  • MLS collaboration models: How Paragon is designing AI capabilities that can work for anyone; from small MLSs with limited tech teams to some of the largest MLSs in the country
  • Data governance and control: What protections are in place when AI systems access listing data, and how MLSs maintain control over their data assets
  • Emerging infrastructure: The potential role of MCP servers as a new data access layer designed specifically for AI systems, distinct from traditional IDX feeds and WebAPIs
  • Strategic positioning: How AI features can drive agent adoption and engagement, not just serve power users

The Data Access Question

One of the most compelling parts of our conversation centers on a challenge many MLS executives are facing right now: AI vendors are approaching MLSs asking for data access, but the traditional IDX framework wasn’t built for AI systems. Fortier and Fullbright discuss whether MCP servers could offer MLSs a way to provide secure access with usage tracking, licensing terms, and granular controls.

This isn’t just a technical question. It’s a strategic one about how MLSs position themselves as AI transforms real estate workflows.

Who Should Listen

This recording is essential for:

  • MLS executives and board members evaluating AI strategy and vendor partnerships
  • Technology directors responsible for platform decisions and data governance
  • MLS leaders thinking about competitive positioning and agent value propositions
  • Anyone involved in MLS data policy who needs to understand how AI is changing data access requirements

The tone is conversational and practical, not a vendor pitch, but a substantive discussion about the real decisions MLSs are facing as AI capabilities mature.

Ready to dig deeper? The full interview offers specific insights into Paragon’s roadmap, lessons from pilot partners, and Fortier and Fullbright’s perspective on what every MLS should be thinking about today.

Questions or want to continue the conversation? This interview is part of an ongoing series exploring how AI is reshaping MLS infrastructure and strategy. We welcome your thoughts and feedback.

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post AI in the MLS: How Paragon is Building the Future of Real Estate Data Infrastructure appeared first on WAV Group Consulting.

]]>
A vision for AI in real estate: why MLS MCP servers matter more than ever https://www.wavgroup.com/2025/10/22/a-vision-for-ai-in-real-estate-why-mls-mcp-servers-matter-more-than-ever/?utm_source=rss&utm_medium=rss&utm_campaign=a-vision-for-ai-in-real-estate-why-mls-mcp-servers-matter-more-than-ever Wed, 22 Oct 2025 18:40:52 +0000 https://www.wavgroup.com/?p=52939 Every wave of technology gives us a chance to rethink how we work. MCP is that wave.

The post A vision for AI in real estate: why MLS MCP servers matter more than ever appeared first on WAV Group Consulting.

]]>
There’s a lot to be excited about in real estate tech right now. We’re seeing an impressive wave of generative AI integrations across platforms. Some are cosmetic. Others are foundational.

Dive deeper into how these innovations are shaping MLS governance in our new white paper, “Sovereign AI and the Future of MLS Governance.

Recent History

Howard Hanna was one of the first brokerages to embed conversational AI into home search. Cribio.com, the new portal for the Broker Public Portal, is pushing innovation forward with AI-powered discovery and real-time listing intelligence. Homes.com and Realtor.com have recently added generative AI tools to enhance search experiences. 

Associations and MLSs are partnering with companies to deliver 24/7 customer support and insights from tools like Ardi by Voiceflip and voice-enabled search by Lundy.

Broker tech vendors are also racing to modernize. Delta Media, Inside Real Estate (through BoldTrail), Real Estate Webmasters, Rechat, and others are building AI features into their stacks. But what’s happening behind the scenes is even more important: many of these companies are actively developing Model Context Protocol (MCP) servers to manage data downloaded by the MLS.

All of this momentum is worth celebrating, but it’s just the tip of the iceberg. The future of MLSs depends on taking on a much more active role in re-inventing the ways MLS data, insights and guidance are delivered to subscribers. 

What’s missing: production-grade MCP servers at the MLS

Right now, no MLS has a production-ready MCP server live in the field. UtahRealEstate.com and NorthstarMLS are close to completion. FBS has an MCP server in Beta. Others are exploring. Most are still on the sidelines.

That vacuum creates real risk.

AI is not waiting. While the MLS industry debates policy and control, technologists are moving forward. And without a clear, industry-led channel for AI access to listing data, the most likely outcome is direct publication. Broker to large language model.

If that happens, the foundation of the MLS cracks. At a time when MLSs should be expanding the excellence of listing input and quality to new arenas like rental listings and commercial listings, it could all fall apart. 

The stakes: data ownership, cooperation, and control

The moment real estate listings are published directly to ChatGPT, Perplexity, Claude, Gemini, and/or CoPilot without going through an MCP server managed by the MLS, the structure that governs listing data begins to collapse.

  • Data ownership disappears. Raw listings inside an LLM become a public asset, not a broker’s.
  • Brokerage cooperation breaks. If LLMs surface every listing, there’s no reason for agents to collaborate.
  • The MLS gets bypassed. When brokers can reach consumers directly through AI, the MLS is no longer essential.

This is not a theory. It is already happening.

No other country has the level of cooperation that brokers have built in America. And that cooperation is based on trust. Access is granted by one shared condition: a state-issued real estate license. That simplicity has fueled competition, protected consumers, and invited innovation.

We should not give that away.

Team CultureThe inflection point: unify or fracture

The real estate industry is consolidating. Brokerages are national. MLSs are regional. Now is the time to act like we’re on the same team.

One of the most consistent points of tension among brokers is the IDX program. It may be time to reset the terms.

Picture this: brokers only advertise their own listings. When a consumer becomes a client through a signed buyer agency agreement, they are invited into a private portal. The public side of listings is handled only by Fair Display Guideline-compliant websites like HAR.com, Cribio.com, and UtahRealEstate.com. These platforms are governed by rules that prioritize broker control and consumer protection.

A call to MLS leaders: don’t wait

If your MLS is not actively developing an MCP server, now is the time to start. If your vendor does not offer one, press them for a roadmap. Because if the MLS community doesn’t step up, others will fill the void. And the role of the MLS will shrink along with it.

Every wave of technology gives us a chance to rethink how we work. MCP is that wave.

MLSs need to lead. Brokerages need to act. And the industry must protect the very thing that makes it work. Broker cooperation is rare. It is valuable. It is worth defending.

Let’s not lose it to an API endpoint.

To explore a strategic framework for MLS-led AI innovation, download our white paper, “Sovereign AI and the Future of MLS Governance.

We’re helping organizations across real estate design practical AI frameworks and MCP strategies. Reach out to us below to start your journey.

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post A vision for AI in real estate: why MLS MCP servers matter more than ever appeared first on WAV Group Consulting.

]]>
UtahRealEstate.com to Launch an MCP Server https://www.wavgroup.com/2025/09/16/utahrealestate-com-to-launch-an-mcp-server/?utm_source=rss&utm_medium=rss&utm_campaign=utahrealestate-com-to-launch-an-mcp-server Tue, 16 Sep 2025 18:11:41 +0000 https://www.wavgroup.com/?p=52619 As AI reshapes our industry, this much is clear: the MLS that provides the MCP server becomes the source of truth.

The post UtahRealEstate.com to Launch an MCP Server appeared first on WAV Group Consulting.

]]>

At WAV Group, we’ve been saying for a while that the MLS will play a decisive role in how AI evolves in real estate. UtahRealEstate.com, led by CEO Brad Bjelke, just gave us the clearest proof yet. To our knowledge, they are the first MLS in the nation to launch a Model Context Protocol (MCP) server.

This is a big deal. The MCP server is what allows brokers, agents, and their technology partners to securely ingest MLS records into their own AI workflows. Without it, we end up with a dangerous patchwork of duplicated databases, scraped data, and uncontrolled integrations. That’s not only inefficient,  it puts data accuracy, compliance, and consumer trust at risk.

UtahRealEstate.com’s decision is grounded in a philosophy they’ve held for years: technology sovereignty. They’ve always believed that MLSs should own and control the technology that sits on top of broker-contributed listings. Rather than outsource everything, Utah has built much of its own stack. That sovereignty is what made it possible for them to lead the way with MCP.

Consumers in Utah already know UtahRealEstate.com as a trusted place to search for homes. That site became the launchpad for the MLS’s first generative AI initiative. Now the focus is shifting inward, embedding AI natively into the solutions brokers and agents use every day. This isn’t about bolting on a chatbot. It’s about weaving intelligence into the core of the workflow.

As AI reshapes our industry, this much is clear: the MLS that provides the MCP server becomes the source of truth. That’s how brokers and agents will access their data in the age of AI — directly, securely, and with the right guardrails in place. UtahRealEstate.com is showing all of us what that future looks like.

Watch our conversation with Brad Bjelke below!

Get in touch below to explore how MCP servers support data accuracy and compliance.

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post UtahRealEstate.com to Launch an MCP Server appeared first on WAV Group Consulting.

]]>
The Hidden Risk in MCP Servers That Could Expose Your Business https://www.wavgroup.com/2025/09/15/the-hidden-risk-in-mcp-servers-that-could-expose-your-business/?utm_source=rss&utm_medium=rss&utm_campaign=the-hidden-risk-in-mcp-servers-that-could-expose-your-business Mon, 15 Sep 2025 13:00:58 +0000 https://www.wavgroup.com/?p=52608 If your team is deploying AI agents using the Model Context Protocol (MCP) without proper security, you're essentially leaving your business wide open to attack. A recent security assessment found that 43% of popular MCP implementations contain command injection flaws, 30% allow network infiltration, and 22% expose sensitive file vulnerabilities. With real-world incidents already occurring the solution isn't hoping for the best, it's implementing an MCP gateway before your next deployment.

The post The Hidden Risk in MCP Servers That Could Expose Your Business appeared first on WAV Group Consulting.

]]>
The hidden dangers of MCP Servers in the AI world.

I don’t like writing scare pieces. But this one? It needs to be written.

Because if your team is deploying AI agents or leveraging AI desktop tools using the Model Context Protocol (MCP) and you’re not securing them with a gateway, you’re basically leaving the doors and windows open and walking away.

So, what is MCP and why should I care?

The Model Context Protocol (MCP) is like the glue that connects AI agents to outside tools and information. It lets an AI model talk to your CRM, hit your internal APIs, or fetch files on your system.

Sounds useful, right?

It is. That’s why so many teams, from startups to massive enterprises, are adopting it. MCP makes AI agents way more capable. It turns them into doers and not just talkers.

But there’s a catch.

MCP servers require security considerations

A recent security assessment by Equixly looked at dozens of popular MCP implementations. The results weren’t promising:

  • 43% had command injection flaws
  • 30% allowed Server-Side Request Forgery (SSRF is basically letting attackers poke around your internal network)
  • 22% exposed arbitrary file read vulnerabilities
  • Only 30% of vendors even patched the issues when they were told

Worse? Some vendors claimed these risks were “theoretical” or “acceptable.” That’s like a car company saying exploding airbags are “edge cases”, and only happen when there’s an accident.

These are not theoretical. They’re real. And they’ve already caused real-world incidents.

The hacks are creative and terrifying

Let’s break down what’s happening out there:

  • Prompt Injection: Attackers can sneak commands like “IGNORE ALL PREVIOUS INSTRUCTIONS” into API responses. Your AI agent happily obeys.
  • SQL Injection: Old-school attack, new playground. Some MCP servers let you drop malicious SQL into prompts and exfiltrate data.
  • Cross server shadowing: MCP metadata or responses change how the AI interacts with other servers.
  • Server Spoofing/Tool Mimicry: MCPs trick the AI into using the wrong servers & tools.
  • Authentication Bypass: Some servers don’t verify who’s calling. Others let you register rogue MCP endpoints and impersonate trusted tools.
  • Tool Poisoning: A tool looks safe at install. Then one day, it updates silently and starts stealing data.
  • Rug Pulls: Third-party MCP packages switch behavior after getting adopted widely—just like malicious npm packages have done for years.

This isn’t speculation. It’s already happened as detailed in security investigations from Composio and Equixly:

  • One attack chain exposed Asana data via unsecured MCP endpoints
  • Another let attackers run remote commands on public-facing servers
  • One even granted access to private GitHub repos through a compromised MCP tool

Here’s what actually works: The MCP Gateway

Gateways act like bodyguards for your AI agents.

They sit between the AI client and the MCP server. Every request goes through the gateway. Every response does too.

The idea is simple: Centralize security. Remove trust from the server layer. Lock everything down.

Here’s how they help.

  1. They handle identity properly
  • Full OAuth 2.0/2.1 support
  • Short-lived tokens (so even if someone grabs one, it’s useless soon)
  • Role-based access control
  • Integration with enterprise identity systems like Okta, Azure AD

Your AI agents don’t manage auth. The gateway does. That’s safer and way easier to manage.

  1. They validate and sanitize everything

This is the magic. The gateway checks:

  • Are prompts malicious?
  • Is someone trying to inject SQL or shell commands?
  • Are any tool descriptions poisoned?

It also strips out anything sketchy. Think of it like a metal detector for every request.

Some even use machine learning to detect suspicious prompts.

  1. They audit, monitor, and alert

Every request. Every response. Logged.

You can get real-time alerts when something fishy happens. You can plug into your SIEM. You can see what tools were called, by whom, when, and how.

This isn’t optional anymore. It’s table stakes for enterprise deployment.

  1. They lock down the tool supply chain

Before a tool is allowed through the gateway, it’s scanned:

  • What’s the source?
  • How popular is it?
  • Has it ever been flagged?
  • Is the repo still active?

Tools that fail checks can be blocked automatically.

If you’re not scanning tools, you’re just waiting to be breached.

So who’s building these gateways?

There are a number of gateway solutions now available, offering different levels of security, specialization, and enterprise readiness. Below are several strong options:

Enkrypt AI Secure MCP Gateway

Offers dynamic tool discovery, built-in prompt sanitization, and enterprise-grade authentication for secure MCP deployments.

  • Built‑in security scans
  • Dynamic tool discovery
  • Works with enterprise authentication
  • Performance‑optimized

Lasso Security MCP Gateway

Focuses on threat prevention with:

  • Plugin architecture
  • Server and tool risk scoring
  • Automated blocking of high‑risk components

WAV Group Gateway Template (Real Estate Focus)

WAV Group offers a Gateway Template designed for real estate brokerages and MLSs. Key features:

  • Prompt sanitization tailored for real estate contexts
  • Guardrails for private client/buyer/seller data
  • Role‑Based Access Control (RBAC) at agent/user levels
  • Audit logging specific to real estate workflows
  • MLS API integration controls and PII masking for real estate data
  • Designed as a template clients can adopt to deploy secure, compliant AI agents in real estate environments

Obot MCP Gateway

Obot is an open‑source gateway focused on enterprise requirements. Some of the features:

  • Admin control plane: IT can onboard MCP servers, define access policies, manage users/groups, monitor usage. 
  • Catalog / discovery: A searchable directory of approved MCP servers, documentation, trust/reputation information. 
  • Proxying & hosting: Support for both local and remote MCP servers; ability to proxy third‑party ones with audit and routing control. 
  • Access control + logging: Role‑based access, enterprise auth integration (Okta etc.), audit logs for MCP‑client/server interactions. 

Kong Konnect / Kong AI Gateway

Kong is more known as an API gateway, but it’s also building out MCP support and gateway‑style features. Key capabilities:

  • Kong Konnect MCP Server: Enables MCP clients (e.g. Claude) to query APIs, configuration, analytics via Kong’s control plane. 
  • Securing & governing MCP traffic: Kong’s AI Gateway offers plugins and policies for authentication (OIDC / Key Auth), rate limiting, prompt filtering (guardrails) etc. 
  • Observability: Metrics, logging, tracing for MCP traffic. 

What should your team do right now?

If you’re deploying MCP servers, or building on top of them, here’s a basic security checklist:

  • Set up a gateway (before anything goes live)

This is non-negotiable. Even for internal tools.

  • Use proper auth

Hook into OAuth. Integrate with your identity provider. Don’t hand-roll this.

  • Validate inputs and outputs

Use JSON schemas. Sanitize tool responses. Strip out embedded commands.

  • Lock down your network

Log everything. Store audit trails. Send alerts when strange stuff happens.

  • Don’t trust tools blindly.

Scan them. Review their source. Watch for updates. Use a reputation system.

The future isn’t secure by default

MCP is a powerful idea. But it’s dangerously naive out of the box and can expose your most valuable asset, your data.

Vendors are moving fast. Too fast. And when 43% of servers have command injection flaws, you don’t get to say “well, we trust our stack.”

You lock it down. You build defensively. You audit, scan, and restrict.

This isn’t optional if you’re serious about deploying AI in production.

And finally: stop hoping and start securing

Hope is not a security strategy. “No one would ever target us” is how breaches happen. “It’s just a proof of concept” becomes a Common Vulnerabilities and Events (CVE).

The MCP ecosystem is still young. That means you get to choose your architecture now before someone else chooses it for you via an incident report.

So choose wisely.

Start with a gateway.

The post The Hidden Risk in MCP Servers That Could Expose Your Business appeared first on WAV Group Consulting.

]]>
ATTOM Data’s Vision for AI and MCP Servers: An Interview with Todd Teta https://www.wavgroup.com/2025/09/03/attom-datas-vision-for-ai-and-mcp-servers-an-interview-with-todd-teta/?utm_source=rss&utm_medium=rss&utm_campaign=attom-datas-vision-for-ai-and-mcp-servers-an-interview-with-todd-teta Wed, 03 Sep 2025 17:00:22 +0000 https://www.wavgroup.com/?p=52517 WAV Group recently had the opportunity to sit down with Todd Teta, Chief Product and Technology Officer at ATTOM Data. Todd has spent his career at the intersection of product and technology, with deep roots in real estate, mortgage, and property data. Since joining ATTOM in 2016, he has guided the company’s transformation from RealtyTrac, a foreclosure portal, into one of the industry’s leading pure-play data licensing firms.

The post ATTOM Data’s Vision for AI and MCP Servers: An Interview with Todd Teta appeared first on WAV Group Consulting.

]]>

 

WAV Group recently had the opportunity to sit down with Todd Teta, Chief Product and Technology Officer at ATTOM Data. Todd has spent his career at the intersection of product and technology, with deep roots in real estate, mortgage, and property data. Since joining ATTOM in 2016, he has guided the company’s transformation from RealtyTrac, a foreclosure portal, into one of the industry’s leading pure-play data licensing firms.

Data as the foundation of ai

Todd is clear on one central point: high-quality data is the bedrock of successful AI in real estate. While some companies experiment with scraped or unverified datasets, he believes this shortcut undermines accuracy and trust. Instead, ATTOM has invested heavily in building parcel-centric data through its ATTOM ID system, applying multi-stage quality checks and leveraging machine learning to identify anomalies. This commitment ensures that brokers, insurers, and lenders can build AI models on a stable, authoritative foundation rather than on incomplete or inconsistent information.

MCP servers and the next wave of integration

One of the most forward-looking parts of our conversation was Todd’s discussion of MCP servers. ATTOM is currently developing its own MCP server, which Todd refers to as a pathway for “agent-ready data.” By enabling AI agents and SaaS applications to connect directly to normalized, parcel-centric datasets, ATTOM is positioning itself as a bridge between legacy data delivery and the emerging agentic economy. He also highlighted Google’s Agent-to-Agent (A2A) standard as another important development. Together, MCP and A2A could dramatically simplify how brokerages, MLSs, and technology providers consume and integrate property data, eliminating the need for integration with endless bespoke APIs.

The pace of technology shifts

When asked about the pace of change, Todd noted that AI is accelerating faster than any previous tech cycle he has seen, from web to mobile to cloud. What once took seven years for mainstream adoption of cloud may now take only two for AI and MCP. While he acknowledged that today’s real estate downturn is slowing some investment, he expects agentic features to become standard across SaaS platforms in the near term. Brokerages that prepare now by consolidating their data and aligning with MCP-driven access models will be best positioned to lead.

Conclusion

ATTOM’s commitment to transparency, data quality, and innovation reflects Todd Teta’s broader vision: that the future of real estate will be defined not just by who has the data, but by who delivers it in a way that empowers AI and protects digital sovereignty. As MCP servers roll out and the industry experiments with A2A integrations, ATTOM intends to be at the center of this transformation—making it easier for brokers, MLSs, and technology providers to trust the data that powers their next generation of tools.

WAV Group is a leader in helping companies license and leverage data effectively, and ATTOM already works with numerous WAV Group clients today. As WAV Group’s AI division, Fluente, continues to accelerate AI development for brokers and MLSs, we see integration with ATTOM’s MCP server as a ripe opportunity by year end.



The post ATTOM Data’s Vision for AI and MCP Servers: An Interview with Todd Teta appeared first on WAV Group Consulting.

]]>
The Compass AI Demo and Fluente’s Path Forward For Brokers https://www.wavgroup.com/2025/08/28/the-compass-ai-demo-and-fluentes-path-forward-for-brokers/?utm_source=rss&utm_medium=rss&utm_campaign=the-compass-ai-demo-and-fluentes-path-forward-for-brokers Thu, 28 Aug 2025 19:00:27 +0000 https://www.wavgroup.com/?p=52479 The lesson for brokerages is clear: owning the AI orchestration layer is how you future-proof your business and create experiences that Compass is previewing here. The stakes are high.

The post The Compass AI Demo and Fluente’s Path Forward For Brokers appeared first on WAV Group Consulting.

]]>
Compass has been showcasing a conversational AI demo that turns everyday agent tasks into a seamless, voice-driven workflow. Every broker should aim to stay ahead or keep up with Compass. This means owning your AI platform, and owning your data. We call this AI Soverenty and it’s the keynote of how WAV Group’s AI Division – Fluente – is coaching and building AI for brokers. When you work with us, you own your AI, not rent it. 

At first glance, it looks like magic: the agent asks a question, the AI responds, and business moves forward. But behind the scenes, what’s really happening is a carefully orchestrated sequence of MCP server calls, CRM API integrations, and MLS property data connections. We build all of this stuff, your technology team can too. Sometimes brokers and their tech team need a coach, or someone to jump in and get it done. Fluente does both.

Let’s break down the Compass demo step by step — and discuss how brokerages can match (or surpass) this type of experience.

compass ai

Step 1: Daily Digest of Tasks

Agent asks: “Can you tell me what I should be aware of today and what I have going on?”

This request requires the AI to read the agent’s calendar and task list. Compass has the advantage of integrating with their CRM – Contactually CRM (acquired in 2019), which already integrates with Google Workspace and Office 365. The MCP server simply queries the calendar endpoints and formats the results into a conversational response.


What to learn:

  • Brokerages with Office 365 or Google plugins could replicate this with integrations between the broker’s AI experience and a connection to calendar.
  • The voice interaction here is excellent. 

Step 2: Birthday Reminders & CRM Notes

Agent asks: “Do I have any clients with birthdays this month? If so, put a note in my CRM to reach out and tag my assistant Jed.”

Here the AI is searching customer records for birthdays, creating reminders, and assigning tasks. Compass again leans on Contactually for CRM API integration. But this is where third-party data enhancement services like Aidentified can play an even bigger role. These platforms can automatically match the customer records and append missing birthdays, anniversaries, or even life events to contact records.

What to learn:

  • AI shouldn’t create more work for agents — it should do the work. That means not just logging tasks, but also triggering the birthday text, card, or gift order.
  • Brokerages can differentiate by integrating AI with gifting platforms (e.g. Evabot), so an agent can say, “Send Michael a gift for his birthday,” and it just happens.

Step 3: Preparing for a Client Meeting

Agent asks: “I’m meeting with Jacob Wells. Can you remind me of his preferences and pull some market stats for the West Village? Draft me an email I can send him.”

This CRM task is the most complex part of the demo. To deliver, the AI must:

  1. Retrieve Jacob’s saved searches or property cart.
  2. Recall his buying criteria (2-bedroom, 2-bath, price range, neighborhood).
  3. Query MLS or brokerage search tools for updated listings.
  4. Generate market stats (price trends, days on market, absorption rates).
  5. Design the report
  6. Draft a client-ready email.

Compass positions its CRM as the hub for saved searches, but brokerages have multiple options here.  MLS client portals like OneHome (Cotality), point solutions like CloudCMA, or native brokerage tools. The real magic is in the orchestration: the MCP server routes the queries to each data source and brings the results back to the AI conversation.

What to learn:

  • Don’t stop at generating a “market update email.” AI should guide the agent: “Jacob currently owns a condo at 123 Main Street. Do you want me to draft a CMA for that property as well, in case he needs to sell before buying?”
  • AI connectors for CMA and Market Stats are essential. If MLSs and vendors don’t expose APIs for these workflows, their tools risk being sidelined. In the case of CloudCMA – the job can be sent from the MCP server to CloudCMA via API, email, or text message and the report is generated and returned. This was a key selling point of CloudCMA when this functionality was released in 2014.

The Bigger Picture: Why MCP Servers Matter

Compass’ demo makes one thing clear: the future of brokerage AI depends on seamless connections between CRMs, MLS systems, productivity suites, and third-party data services. That orchestration happens through MCP servers, a middleware orchestrator that translates a conversational request into discrete jobs routed to the right endpoints and specialty AI agents.

If MLSs and brokerages don’t provide MCP access, agents and vendors will start replicating data to power these workflows themselves. That’s a slippery slope, leading to loss of data oversight and poor governance. The smarter move is for brokerages and MLSs to own the orchestration layer and maintain sovereignty over their AI strategy, data assets, and agent access.

Final Takeaway

The Compass AI demo is impressive, but it’s not unattainable. Brokerages can match or exceed it today by:

  • Integrating calendars, CRMs, and MLS portals through MCP servers.
  • Using data enrichment tools like Aidentified to keep customer records fresh.
  • Embedding AI connectors into core MLS and vendor tools (CMA, market stats, gifting).
  • Focusing on work completed, not work delegated — ensuring AI takes tasks off the agent’s plate instead of just creating more reminders.

The timing is important too. Compass anticipates rollout of this functionality this fall to a limited group. Top producing agents and teams are hungry for this automation. WAV Group anticipates that Compass will have a compelling technology narrative for recruiting and retention when this AI platform releases. It may be somewhat limited at first, and that is important too. You cannot throw too much tech at real estate agents at once, or the adoption task becomes overwhelming. 

Get in the game with a few AI features that have an impact for agents. Monitor the market for features that are attracting agent buzz. Reinforce that your AI strategy is solid. The lesson for brokerages is clear: owning the AI orchestration layer is how you future-proof your business and create experiences that Compass is previewing here. The stakes are high.

WAV Group is not just helping with AI education and strategy for brokerages and MLSs, we are building it through Fluente, our wholly owned AI subsidiary. Fill out the contact form below to discuss ways to collaborate on your AI solution. 

More WAV Group Reading  

WAV Group white papers, articles, and videos on the evolution of AI in real estate. Reach out below if you would like to learn more about our Tech and AI services.

https://www.wavgroup.com/2025/08/22/fbs-and-the-future-of-mls-infrastructure-why-sparkapi-could-be-the-blueprint-for-model-context-protocol-servers/

https://www.wavgroup.com/2025/08/21/if-your-broker-marketing-is-spectacular-put-it-to-work-recruiting/

https://www.wavgroup.com/2025/07/31/mls-mcp-servers-are-here-see-the-one-david-gumpper-built-for-real-estate-ai/

https://www.wavgroup.com/2025/07/14/why-every-mls-needs-to-understand-mcp-servers-before-someone-builds-one-without-you/

https://www.wavgroup.com/2025/07/14/utahs-largest-keller-williams-brokerage-just-shared-how-theyre-using-ai-and-its-working/

https://www.wavgroup.com/2025/07/09/ai-is-rewriting-the-rules-of-brokerage-are-you-playing-in-the-right-tier/

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post The Compass AI Demo and Fluente’s Path Forward For Brokers appeared first on WAV Group Consulting.

]]>
FBS and the Future of MLS Infrastructure: Why SparkAPI Could Be the Blueprint for Model Context Protocol Servers https://www.wavgroup.com/2025/08/22/fbs-and-the-future-of-mls-infrastructure-why-sparkapi-could-be-the-blueprint-for-model-context-protocol-servers/?utm_source=rss&utm_medium=rss&utm_campaign=fbs-and-the-future-of-mls-infrastructure-why-sparkapi-could-be-the-blueprint-for-model-context-protocol-servers Fri, 22 Aug 2025 16:27:16 +0000 https://www.wavgroup.com/?p=52421 Brokers are adopting AI. If the MLS doesn’t provide compliant access to the data they need, they’ll find workarounds.

The post FBS and the Future of MLS Infrastructure: Why SparkAPI Could Be the Blueprint for Model Context Protocol Servers appeared first on WAV Group Consulting.

]]>
AI artificial intelligence concept, Close up of microprocessor on mainboard electronic computer background, Futuristic innovative technologies.In this recorded interview, we sit down with Michael Wurzer, President and CEO of FBS (the company behind Flexmls and SparkAPI) to talk about the future of MLS infrastructure in an AI-powered industry. As brokerages begin deploying generative AI to power intelligent workflows and consumer experiences, MLSs face a critical inflection point: either lead by enabling secure, standards-based AI access through MCP servers – or risk fragmentation as brokers and vendors replicate MLS data in unmanaged, non-compliant environments.

SparkAPI: More Than Listings

FBS’s SparkAPI is a mature and feature-rich implementation of the RESO Web API standard. More than a listing API, SparkAPI provides access to key resources like Roster, Office, Media, Open Houses, and more. It supports advanced authentication, field-level access controls, and developer permissions, making it a natural foundation for AI-native interfaces. These capabilities are exactly what MCP (Model Context Protocol) builds on to deliver AI-native, policy-aware interactions between MLS data and generative models for brokers.

The Risk of Doing Nothing

As brokers explore AI for tasks like CMA generation, compliance workflows, or intelligent search, they need structured access to MLS data across multiple sources. If the MLS doesn’t offer a secure, AI-friendly API like MCP, brokers will be forced to replicate the data themselves or work with vendors who do. This creates fragmented data environments with little MLS oversight and no policy enforcement.

We’ve already seen this happen. When MLS data is ingested into general-purpose AI systems without proper controls, it’s exposed to compliance risks, copyright infringement, and misuse. MCP servers prevent this by keeping MLS data inside a secure, policy-governed walled garden where every AI interaction is monitored, attributed, and compliant.

Fluente: Your Partner for MCP Enablement

This is exactly why Fluente exists. It is a wholly owned AI division of WAV Group. We help brokers deploy private, standards-based MCP servers that allow authorized brokers, staff, and vendors to build AI integrations without needing to replicate MLS data externally. Fluente MCP servers developed for brokers are designed to integrate directly with platforms like the Flex MCP server so MLSs can extend the infrastructure they already have rather than reinvent the wheel.

FBS as a Beta MCP Server

While SparkAPI isn’t a full MCP implementation yet, it shows how RESO-aligned developers like FBS are well positioned to lead. It is WAV Group’s opinion that every MLS must offer an MCP server today. In the podcast, Michael Wurzer discusses the architectural alignment between SparkAPI and MCP, and how FBS is exploring this evolution to support innovation without sacrificing data governance or MLS control.

What This Means for MLSs and Brokers

This transition isn’t optional. It’s already underway. Brokers are adopting AI. If the MLS doesn’t provide compliant access to the data they need, they’ll find workarounds. But with partners like FBS and Fluente, MLSs can move proactively and stay in control. MCP servers are the gateway to a future where AI and MLS policy coexist—securely, responsibly, and transparently.

We hope you enjoy the conversation. 

Lastly, to gain further insight into this content, download our whitepapers: The Three Tiers of AI That Every Broker Shoud Know and Why MLSs Need MCP Servers.

Reach out below if you would like to learn more about the future of MLS infrastructure and Model Context Protocol Servers.

Hire WAV Group

  • Please select a service.
  • How can we help you?

The post FBS and the Future of MLS Infrastructure: Why SparkAPI Could Be the Blueprint for Model Context Protocol Servers appeared first on WAV Group Consulting.

]]>