August 13, 2025

AI
Development

Feeding AI directly with your content

MCP servers are a way for AI to produce results based on your data

What is an MCP server?

An MCP server is a new layer in server infrastructure that allows Large Language Models (LLMs) to access private or public data sources, meaning we can let LLMs pull data from databases and file storage. It is a very open technology that will evolve, but the first application that came to my mind as a clear use case is a product search bar for an ecommerce website, which I will touch on later in this post.

In the technology’s current state, developers use MCP servers to connect data sources to an LLM. An example of this would be connecting a WooCommerce database to an LLM instance, then asking it how many previous orders for a particular SKU have gone to any given target area.

Without AI, developers need to write custom reporting queries against the whole orders database to find how many of these SKUs were delivered within the target area.

Supplying the data to the AI via an MCP server and simply asking the question would dramatically reduce the time taken to get that answer, bypassing the need for a developer to spend time writing the report.

Many reporting questions can be asked and answered quickly in this way, delivering better insight into ecommerce sales without conforming to a rigid reporting structure.
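As a rough sketch, the kind of function an MCP server might expose as a tool for the SKU question above could look like the following. The `orders` schema, the sample data and the `count_orders_for_sku` name are all invented for illustration; a real WooCommerce integration would query the live database rather than this in-memory stand-in.

```python
import sqlite3

# Hypothetical orders table standing in for a WooCommerce database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (sku TEXT, postcode_area TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("SPACESUIT-01", "BS1"), ("SPACESUIT-01", "BS2"), ("CAPE-07", "BS1")],
)

def count_orders_for_sku(sku: str, postcode_area: str) -> int:
    """A tool an MCP server could expose: count orders for a SKU
    delivered within a target postcode area."""
    row = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE sku = ? AND postcode_area LIKE ?",
        (sku, postcode_area + "%"),
    ).fetchone()
    return row[0]

print(count_orders_for_sku("SPACESUIT-01", "BS"))  # matches BS1 and BS2
```

The point is that the LLM, not the developer, decides when to call this tool and with which arguments, so each new reporting question does not need a new hand-written report.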

Although this would be an extremely powerful tool for processing data, you should always consider GDPR compliance and carefully control which data you allow LLMs to read and process, as client/user data could potentially be fed into the training of an LLM outside your jurisdiction, which may breach GDPR.

Where could this technology go?

This technology could be integrated into any API system to give much better access to the data the API serves. It does, however, raise concerns about energy use: traditional, non-AI APIs are designed to be very efficient, and LLM prompts are the opposite. Because of this, I do not believe MCP will wipe out the current methodologies; it may instead become a premium add-on in the near future, though that could change as the technologies develop.

Imagine you are looking for a very particular costume for Halloween and have a lot of specifics which need to be met. In this example I will be looking to buy an American spacesuit costume which fits an average adult male and includes the helmet, gloves and boots. Some of these specifications may be covered by the filters available on the big costume websites, but others won't be, which is the perfect case for an MCP server's search capability.

This advanced search will take much longer than a standard search bar plus filters, as it needs to process the LLM prompt, gather the data and present it back to the LLM. For this to work, the technology will need to reach a state where a large number of ecommerce stores have an MCP advanced search engine integrated into their stock management APIs.
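To make the idea concrete, here is a minimal sketch of the kind of search tool such a store could expose. The catalogue, the attribute fields and the `search_products` name are all hypothetical; in practice the LLM would extract the requirements (size, included items) from the shopper's prompt and call a tool like this over MCP.

```python
# Hypothetical product catalogue with attributes that the shop's
# fixed filter UI does not cover.
PRODUCTS = [
    {"name": "NASA spacesuit costume", "size": "adult M",
     "includes": {"helmet", "gloves", "boots"}},
    {"name": "Soviet cosmonaut costume", "size": "adult M",
     "includes": {"helmet"}},
    {"name": "Kids astronaut costume", "size": "child",
     "includes": {"helmet", "gloves", "boots"}},
]

def search_products(size: str, required_items: set[str]) -> list[str]:
    """A tool an MCP server could expose: return products matching the
    size whose contents include every required item."""
    return [
        p["name"] for p in PRODUCTS
        if p["size"] == size and required_items <= p["includes"]
    ]

print(search_products("adult M", {"helmet", "gloves", "boots"}))
```

Only the adult-sized suit that ships with all three accessories survives the filter, which is exactly the kind of multi-attribute match that free-text search bars struggle with today.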

So this leads me to the question – do users know what they want, and can they write it down with enough specificity that an AI will be able to return a list of suitable products? At the moment we use user journeys and images to learn what the user is looking for; an MCP-driven search may need to do the same by asking the user follow-up questions to gather more information.

Conclusion

MCP servers are going to be a massive part of how data is accessed in the future, allowing extremely custom parameters to be set when searching the web for the information or product you are looking for. The technology has endless possibilities in how it could reach the mainstream, and at Fanatic we will be keeping a close eye on how it progresses and thinking about the different ways our clients could benefit from MCP integration.

References

https://modelcontextprotocol.io/docs/getting-started/intro