freedomben a day ago

Interesting thoughts regarding MCPs being the future App Store/Platform. I don't know that I agree but I don't necessarily disagree either. Time will certainly tell.

To me, MCP feels more like an implementation detail, not something that most people would ever use directly. I would expect that the future would be some app distributed through existing channels, which bundles the MCP client into it, then uses a server-side component (run by the vendor of course) to get the real work done. As much as I like the idea of people installing the servers locally, that future seems like a Linux nerd/self-hosted type of activity. I just can't imagine a typical Mac or Windows non-power-user installing one directly. Just the idea that they would need to install "two apps" is enough to confuse them immensely. It's possible some might bundle the server too and run it locally as needed, but even in that case I think MCP is completely invisible to the user.

  • daxfohl a day ago

    I'd expect "local MCP servers" will be generally installed as part of something else. Photoshop, or Outlook, or whatever could come with a local MCP server to allow chat clients to automate them. Maybe printer drivers or other hardware would do similar. I don't think there's much reason to install a cloud service MCP server to run locally; you'd just use the one provided in the cloud.

    • Garlef 14 hours ago

      Interesting thought.

      But maybe the companies would actually like to at least pipe the communication through the cloud to get all the usage data. Here's one possible architecture (with a rough sketch of the middle piece after the outline):

      local chat client

        - talks to cloud LLM
        - talks to local MCP servers
      
      local MCP server provided by company

        - connects to company cloud (this lets the company collect usage data)
        - forwards tasks to the cloud
      
      local tool (for example photoshop)

        - connects to company cloud to get a user's tasks
        - executes the tasks (this lets the company use the user's hardware, saving cloud costs)
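
      A rough sketch of that middle piece, using the Python MCP SDK's FastMCP helper. This is only an illustration of the forwarding pattern: the Photoshop-style tool, cloud endpoint, and payload shape are all invented.

        # Hypothetical local MCP server that forwards tasks through the vendor's cloud;
        # endpoint, auth, and payload shape are invented for illustration.
        import requests
        from mcp.server.fastmcp import FastMCP

        mcp = FastMCP("photoshop-bridge")

        CLOUD_TASKS_URL = "https://cloud.example-vendor.com/v1/tasks"  # hypothetical

        @mcp.tool()
        def apply_filter(document: str, filter_name: str) -> str:
            """Queue a filter job; the local desktop app polls the cloud and executes it."""
            resp = requests.post(
                CLOUD_TASKS_URL,
                json={"document": document, "filter": filter_name},  # usage data lands server-side
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json().get("status", "queued")

        if __name__ == "__main__":
            mcp.run()  # stdio transport by default, so a local chat client can attach
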
      • daxfohl 11 hours ago

        Hmm, in that example the MCP server is just a thin API wrapper though, so it wouldn't change anything by running locally, right? Like I could see where maybe a TikTok MCP server would benefit from running locally since that would allow it to expose a camera API, but I can't think of anything you could do with a local Airbnb MCP server that you couldn't do with a cloud one.

        Nefariously, I guess, since these things would be running in the background continuously, that would provide another avenue for companies to spy on you, so that may be a reason companies create local MCPs even if there's no other reason to.

        • daxfohl 11 hours ago

          Well maybe a local Airbnb MCP could have access to your phone to call the host. Or to your wallet to pay.

          That may make more sense than having a separate "wallet MCP server" running locally and having the LLM coordinate the transaction. While the premise of MCP is to allow LLMs to do such things, idk if I want an LLM to be hallucinating with my credit card.

  • grahac a day ago

    Agree that for mainstream use it needs to be and will be hidden from the user entirely.

    Will be much more like an app store where you can see a catalog of "LLM Apps" and click to enable the "Gmail" plugin or "Shopping.com" plugin. The MCP protocol makes this easier and lets server authors write it once and have it appear in multiple clients (with some caveats I'm sure).

    • kitd 9 hours ago

      They feel quite similar to Alexa skills, packaged in a standard form. The app store analogy allows them to be searched by the end user.

      TBH, it's quite surprising (and reassuring) that they have standardised as MCPs so soon. It normally takes a decade of walled gardens and proprietary formats before any open standards emerge.

  • masterj a day ago

    MCP has a remote protocol. You don't need to install anything to add an MCP server, or rather, you won't once client support catches up to the spec. It will be a single click in whatever chat interface you use.

  • mirekrusin a day ago

    More like npm, not app store.

  • dist-epoch a day ago

    MCP servers will be run by the service providers, and you'll have the ability to "link" them, just like today you can link a Google account to give access to Calendar, GDrive, ... in the future you'll be able to give a model access to the Google MCP server for your account.

    • lgiordano_notte a day ago

      I wonder how granular the permissions will get, though. Giving model-level access to something like Gmail sounds powerful, but also like a privacy minefield if not done carefully. Curious to see how trust and isolation get handled.

      • vendiddy 16 hours ago

        Those OAuth permission dialogs come to mind, but they may need to improve significantly to become useful.

        I don't think most of those are granular enough.

guideamigo_com1 a day ago

MCP might be one of the few technology pieces where more articles have been written about it than the actual use-cases being built.

It is like the ERC20 era all over again.

  • klik99 a day ago

    This particular way of seeing MCP that the article describes came up a lot during the early voice assistant days - and I guess Amazon did kind of attempt an app store approach to giving Alexa more capabilities. In theory I like it! But in practice most people won't be using any one integration enough to buy it - like why go through the hoops to buy a "plane ticket purchasing app" when you do it maybe 4 times a year. I just don't see it playing out the way the author describes.

  • spudlyo a day ago

    Remember “push technology”?

    • empath75 9 hours ago

      Do you mean "notifications", ie a core feature of every computer and phone?

  • atonse a day ago

    I don't feel that way. Maybe the first examples have all been related to what software people do, but I think an MCP for a travel site would be a game changer.

    There are so many things I want to tell a travel site that just don't fit into filters, so I end up spending more time searching all kinds of permutations.

    These could be done with an MCP-augmented agent.

    • esafak a day ago

      There's no reason to think they will expose more functionality through the MCP API than through their website. I imagine the API will be more limited.

      • atonse a day ago

        No, but let me be more specific.

        For example, when I search for flights, there might be situational things (like, "can you please find me a flight that has at least a 2 hour layover at <X> airport because last time I had a hard time finding the new terminal", etc.).

        Or an agent that will actually even READ that information from the airport website to see notices like "expect long delays in the next 3 months as we renovate Terminal 3"

        Right?

        The agent could have this information, and then actually look at the flight arrival/departure times and actually filter them through.

        Other things like, "I can do a Tuesday if cheaper, or look through my calendar to see if I have any important meetings that day and then decide if I can pick that day to save $400".

        These are all things that synthesize multiple pieces of data to ultimately arrive at something as simple as a date filter.

        • leo-notte a day ago

          That kind of synthesis is where current search interfaces fall short. The pieces exist in isolation: flight data, personal calendars, airport notices. But nothing ties them together in a way that's actually useful. An agent using MCP could help connect those dots if the APIs are deep enough and the UX avoids feeling like a black box. The real challenge might not be the tech but getting providers to share enough useful data and trust whatever sits between them and the user.

        • troupo a day ago

          So, Yahoo! Pipes, but with magic and wishful thinking

          • itomato 10 hours ago

            Also RAG. Pipes just consumed APIs, IIRC. SOAP at that.

            But definitely web mashups all over again.

    • dkersten a day ago

      People said similar things about smart contracts, yet here we are, with them being rather niche. I do agree that once the Alexas and Siris are LLM-powered with MCP (or similar) support, these kinds of use cases will become more valuable, and I do feel it will happen and gain widespread use eventually. I just wonder how much other software it will actually replace in reality vs how much of it is hype.

  • __loam a day ago

    It's very funny to see people talking about an extremely thin protocol like this.

    • soulofmischief a day ago

      It's a matter of organizing developer effort around a set of standards. Good architecture makes it easy to contribute to the ecosystem, and currently agentic tooling is the wild west, with little in terms of standardization. Now we are seeing more developer momentum around making our everyday tools accessible to agents.

      • klik99 a day ago

        Yeah, it's a good thing to be talking about pie in the sky ideas, most of which won't really work. The few good ideas that survive internet critics picking apart the smallest details could be interesting

  • 3np a day ago

    ERC20 stood the test of time and is ubiquitous today.

    Who knows what MCP looks like in a decade?

3np a day ago

> Think of MCPs as standardized APIs—connectors between external data sources or applications and large language models (LLMs) like ChatGPT or Claude.

This is incorrect.

MCP is Model Context Protocol.

You didn't "build an MCP", you implemented an MCP server. Lighttpd is not "an HTTP", it's an HTTP server. wget is also not "an HTTP", it's an HTTP client. Lighttpd and wget are different enough that it's useful to make that distinction clear when labeling them.

dnsmasq is not "a DHCP", it's a DHCP server.

This distinction also matters because it is certain that we will see further protocol iterations so we will indeed have multiple different MCPs that may or may not be compatible.

  • happyopossum a day ago

    > You didn't "build an MCP"

    The author explicitly states he built 2 MCP servers, not 2 MCPs, so I don’t know where your beef is coming from

    • quantadev 19 hours ago

      I had the exact same reaction to the plural "MCPs". That's silly wording. There are no multiple MCPs. It's a single protocol. It's hilariously awkward wording to say you built "an MCP". It's like saying you built "an FTP", or "an HTTP". I guess every Web App is really just "an HTTP". We've been talking wrong all these years. lol.

      • szvsw 15 hours ago

        On the other hand, IP addresses have crossed into the popular lexicon in exactly this manner… it’s common enough to hear people say “what’s my IP?” or “are there any free IPs?” or “what are the IPs for x/y/z?”

        I agree that it sounds stupid and incorrect, but that doesn’t necessarily rule out using MCP as a metonym for MCP server.

        • falcor84 10 hours ago

          Good point. Other examples are Wi-Fi (e.g. "What's your Wi-Fi?"), DNS (e.g. "You should change your DNS") and USB (e.g. "I only have 2 USBs on my laptop"). So who knows, maybe "MCPs" will catch on.

          • quantadev 8 hours ago

            I always say "The Google" though, so maybe I'm guilty as well of playing fast and loose with the Engrish Rangurage.

  • cle 9 hours ago

    Purists perpetually decry the zeitgeist's sloppy terminology.

    Words that climb the Zipf curve get squeezed for maximum compression, even at the cost of technical correctness. Entropy > pedantry. Resisting it only Streisands the shorthand.

  • helloooooooo a day ago

    I’d just like to interject for a moment. What you’re refering to as Linux, is in fact, GNU/Linux, or as I’ve recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX.

    Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project.

    There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine’s resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!

    • esseph a day ago

      Hey look I found the individual willing to die on the "ATM Machine" / "NIC card" hill!

      • didgeoridoo 9 hours ago

        I prefer “AT Machine”, then nobody can tell whether you’re talking about Automated Teller, All-Terrain, or Anti-Tank.

        And that’s how you get them.

    • lizardking 20 hours ago

      Honestly can't tell if this is very dry sarcasm or not

      • 3np 20 hours ago

        It's tired copypasta. Typically interpreted as "parent is a silly nitpicking neckbeard keyboard warrior".

  • hinkley a day ago

    “The map is not the territory.”

    • falcor84 10 hours ago

      I've actually been thinking about this recently in the context of video games and virtual worlds in general, where when we speak about "the map", we are literally referring to the (virtual) territory. The more we digitize things, the more this distinction breaks down.

baalimago 17 hours ago

If LLMs are so smart, why do they need a custom "MCP" format on top of what's commonly known as a normal API? Why can't they just call normal APIs?

Extending this thought: why would there be any difference between offering data behind an API and offering data behind an "MCP API"? At the end of the day, the underlying data will be the same (weather, stock market info, logs, whatever); it seems LLMs just need this to be "standardized", otherwise they don't get it (?).

Furthermore..! LLMs can already crawl web pages just fine using (true) RESTful technologies. So why would there be a need for other, new, special APIs when it's enough to expose the same data on a normal website?

I don't get it.

  • suninsight 9 hours ago

    I also did not get it, but now I get it a bit, I think.

    Look at it this way. You have to get some work done - maybe book a flight ticket. So you go to two sites - first you go to a flight fare comparison site, then you book the ticket on the airline website. And you have to do it in code.

    There are two ways you can do it.

    First way:

      1. Understand the API of the flight comparison portal.
      2. Understand the API of the airline website.
      3. Write code which combines both of these APIs and does the task.

    Second way:

      1. Message a coder friend who knows the API of the flight comparison portal and ask him to write code to get the cheapest flight.
      2. Message another coder friend who knows the API of the airline portal and ask him to book a flight.

    Both ways are possible, but which one do you think is Less Work? Which one is 'cognitively' easier? Which one can you do while driving a car with one hand?

    It should be clear that the second way is easier. Not only is the second way easier, but if the task requires multiple providers and a lot of context, it might be the only way possible.

    The first way is analogous to LLMs making raw API calls. The second way is analogous to LLMs using MCP servers. MCP servers reduce the cognitive cost of a task for the LLM, which dramatically increases their power.
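
    For what it's worth, the "coder friend" in the second way is roughly what an MCP server is in code: a named, documented tool the LLM can ask for by name. A minimal sketch with the official Python SDK (the fare lookup itself is just a stub):

      # Minimal MCP server exposing one tool; the model only sees the tool's name,
      # description, and typed arguments, not the underlying API details.
      from mcp.server.fastmcp import FastMCP

      mcp = FastMCP("flight-fares")

      @mcp.tool()
      def cheapest_fare(origin: str, destination: str, date: str) -> str:
          """Return the cheapest fare for a route on a given date (YYYY-MM-DD)."""
          # Stub: a real server would call the fare-comparison API here.
          return f"Cheapest {origin} -> {destination} on {date}: $123 (stub data)"

      if __name__ == "__main__":
          mcp.run()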

  • lysecret 15 hours ago

    Yeah, it's also funny to me. On one side people are saying: look, we have computer use, browser use, etc., so we don't need an API! And on the other side: look, APIs are way too complicated, we need our own protocol!

  • johntash 13 hours ago

    I don't understand why just using something like OpenAPI specs didn't become the "normal" thing to do. We already have APIs for pretty much everything, so why do we need a new protocol that wraps around an existing API?

  • empath75 9 hours ago

    > If LLMs are so smart, why do they need a custom "MCP" format to what's commonly known as a normal API? Why can't they just call normal APIs?

    LLMs _can't_ just call APIs, because all they can do is generate text. The LLM can _ask_ you to run some code, but it has no ability to run code directly. MCPs are basically a way for LLMs to signal intent to make an API call, along with a list of whitelisted APIs, documentation for using them, and preloaded credentials with whatever permissions you want to give them.
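
    Concretely, the wire format is JSON-RPC: the client asks the server which tools exist, and a "tool call" is just a message the host executes on the model's behalf with its own credentials. Roughly, shown as Python dicts and simplified from the spec (the weather tool and arguments are hypothetical):

      # Roughly what flows between an MCP client and server (JSON-RPC 2.0),
      # simplified; tool name and arguments here are hypothetical.

      # 1. The client asks the server which tools exist. The response doubles as
      #    the whitelist plus documentation that gets put in front of the model.
      list_tools = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

      # 2. When the model wants an API call, it emits a tool-call intent; the host
      #    turns it into a request like this and runs it with its own credentials.
      call_tool = {
          "jsonrpc": "2.0",
          "id": 2,
          "method": "tools/call",
          "params": {"name": "get_weather", "arguments": {"city": "Berlin"}},
      }

      # 3. The result comes back as content the model reads as plain text.
      call_result = {
          "jsonrpc": "2.0",
          "id": 2,
          "result": {"content": [{"type": "text", "text": "12C, light rain"}]},
      }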

  • mindwok 11 hours ago

    They actually can. I have found myself getting more use out of a terminal MCP and providing OpenAPI specs than bespoke MCP servers.

    Bespoke MCPs right now are a convenience.

  • manojlds 12 hours ago

    Well, LLMs are not so smart for starters.

brap a day ago

My prediction: there will be no standard protocol, clients will do whatever works for them, and devs will do whatever it takes to be installable on those clients. Just like mobile.

  • precompute 8 hours ago

    Agreed. MCP is merely the first player.

gadders a day ago

>>MCP Affiliate Shopping Engines

As someone else once said, I want a Grocery Shopping Engine. "Here's my shopping list, taking into consideration delivery times and costs, please buy this for the lowest cost from any combination of supermarkets and deliver by day after tomorrow at the latest."

If MCPs gave the LLMs a window into all the major supermarkets' home shopping sites, that looks like a step closer.

  • lou1306 13 hours ago

    This requires arithmetic and constraint-solving skills that are wildly out of reach for any pure-LLM platform. At the very least you would need interfacing with a real SMT or LP solver to get something that fits the bill.
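
    For example, even a toy version of the "split my basket across supermarkets" problem is already an integer program. A sketch with PuLP, with made-up prices and delivery fees, of what handing this to a real LP/MILP solver could look like:

      # Toy MILP for splitting a shopping list across stores to minimise total cost;
      # prices and delivery fees are invented. PuLP hands this to a real solver (CBC).
      from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

      items = ["milk", "bread", "eggs"]
      stores = ["StoreA", "StoreB"]
      price = {("milk", "StoreA"): 1.2, ("milk", "StoreB"): 1.0,
               ("bread", "StoreA"): 0.9, ("bread", "StoreB"): 1.1,
               ("eggs", "StoreA"): 2.5, ("eggs", "StoreB"): 2.4}
      delivery = {"StoreA": 3.0, "StoreB": 4.5}

      prob = LpProblem("grocery_split", LpMinimize)
      buy = {(i, s): LpVariable(f"buy_{i}_{s}", cat="Binary") for i in items for s in stores}
      use = {s: LpVariable(f"use_{s}", cat="Binary") for s in stores}

      # Objective: item prices plus a delivery fee for every store actually used.
      prob += (lpSum(price[i, s] * buy[i, s] for i in items for s in stores)
               + lpSum(delivery[s] * use[s] for s in stores))

      for i in items:
          prob += lpSum(buy[i, s] for s in stores) == 1   # buy each item exactly once
          for s in stores:
              prob += buy[i, s] <= use[s]                 # no buying from an unused store

      prob.solve()
      for (i, s), var in buy.items():
          if value(var) > 0.5:
              print(f"{i} from {s}")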

    • gadders 12 hours ago

      Looks like we're stuck with coding assistants and Studio Ghibli pics then :-)

  • troupo a day ago

    > If MCPs gave the LLMs a window into all the major supermarkets' home shopping sites, that looks like a step closer.

    And how exactly will they do that?

    • selcuka 20 hours ago

      There are existing comparison services that keep track of prices and locations of grocery items. MCP is just the glue code.

    • fullstackchris a day ago

      Not OP, but perhaps the following MCP tools: Google Maps API, nearest supermarkets, Puppeteer for their product listing pages?

      Though to be honest I'm not sure why you would need so much info - if I need lettuce or tomatoes, for example, I know they're gonna be at essentially every supermarket in my area....

      • gadders 15 hours ago

        In the UK, most supermarkets offer their own home delivery services.

        I was hoping that an LLM would do the heavy lifting of working out who can get your shopping cheapest from various supermarkets.

        Probably not worth it if you're buying lettuce and tomatoes, but if you have a large family with a £100+ grocery bill per week it might be worth it?

neuroelectron 20 hours ago

MCP is perhaps the biggest attack vector I've seen people willingly adopt simply for FOMO. Nothing about implementing it is defined or tractable. Even logging its use is extremely complicated.

bravetraveler a day ago

MCP in this context means "Model Context Protocol"

I thought it might be "managed cloud providers", but perhaps I'm too optimistic for a change

ilaksh 17 hours ago

Anyone else explored a plugin approach to bundle tools with the client? I started down that path long before MCP was a thing. I do plan to add MCP and A2A support at some point. But for my program it's generally not necessary. You just go to the admin and install plugins with the tools. https://github.com/runvnc/mindroot

How does discovery work with MCP? Is there a way to make it fairly seamless?

kaycebasques a day ago

If you're sold on MCP, what was your "wow" moment? I've read the docs and tinkered a bit but it was a decidedly "meh" experience personally. It seems very similar to ChatGPT Plugins, and that was a flop. I don't really like the fuzzy nature of the architecture, where I never know what server will be invoked. And I have to manually opt-in to each server I want to use? To be unexpectedly useful, it seems like I would have to opt-in to tens or hundreds of servers. Yet I've heard that clients start to struggle once you have more than 20 servers plugged in...? Please excuse any fundamental errors I've repeated here, if any...

  • jaapbadlands a day ago

    The first use case I found relevant and useful was the Supabase MCP server, allowing Cursor's agent to query my Supabase project. It meant no longer describing my database to Cursor, it could simply go and get the information it needed, as needed.

  • mwigdahl a day ago

    My "wow" moment was when I wrote an internal MCP server so that Claude Code could access our test databases. It was a tiny amount of code, simple to connect up, and immediately gave Claude Code a way to directly validate queries. It's been useful in numerous scenarios since then and got me thinking about additional MCP-based tools it might be nice to have.

  • bluedevil2k a day ago

    Writing an internal MCP server to link our API layer to Augment/VSCode so that our frontend developers can ask in plain language about API details. With over 1000 endpoints, it lets the devs find the endpoint, and more importantly the GQL fields, quickly. After some dogfooding we plan to open it up to our clients as well.

  • cruffle_duffle a day ago

    Wrote an MCP server to hook into my logging so I could get Claude + Cursor to quickly answer "hey, why did request 20394 from yesterday evening fuck up?". It goes into the logs, finds the exception, hunts down the line and then tells me what's up. Of course, left unchecked it tries to fix the problem too, but I've spent countless lines of prompt engineering to have it never attempt to "just start writing code".

  • empath75 9 hours ago

    We have a software templating tool we use internally to start new projects. I wired an MCP server into it and now I can just ask Cursor to start a new project for me, and it'll go through our list of templates, find the most useful one, create a new project, read the documentation for it, and then be ready to build it and add new features right away.

  • fullstackchris a day ago

    For me it was implementing a simple `execute_terminal_command` tool along with hooking up to my company's Jira and GitLab (don't worry, security gurus: for the command line, I have a hardcoded list of allowed read-only commands that the LLM can execute, and both the Jira and GitLab servers likewise have read-only options.)
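
    The allowlist pattern is small enough to sketch (the command list, timeout, and server name here are illustrative, not the parent's actual setup):

      # Sketch of an execute_terminal_command tool gated by a hardcoded allowlist;
      # the command list and limits here are illustrative.
      import shlex
      import subprocess
      from mcp.server.fastmcp import FastMCP

      mcp = FastMCP("safe-shell")
      ALLOWED = {"ls", "cat", "grep", "head", "git"}  # read-only-ish commands only

      @mcp.tool()
      def execute_terminal_command(command: str) -> str:
          """Run an allowlisted read-only shell command and return its output."""
          argv = shlex.split(command)
          if not argv or argv[0] not in ALLOWED:
              return f"Refusing to run '{command}': not in the allowlist."
          result = subprocess.run(argv, capture_output=True, text=True, timeout=20)
          return result.stdout or result.stderr

      if __name__ == "__main__":
          mcp.run()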

    What I will say is I agree there should be an option to get rid of the chat confirmations of every single new tool call in a chat, as well as a way to build a set of "profiles" of different tools depending on what I'm working on. I also strongly agree there needs to be an internal prompt possibility to explicitly tell the LLM what tool(s) to favor and how to use them (in addition to the descriptions/schemas of the tools themselves). I opened an issue on the Anthropic repo exactly about this: https://github.com/modelcontextprotocol/typescript-sdk/issue...

    • ChromaticPanic 21 hours ago

      Open WebUI lets you do all that. You set up single-model agents and assign specific tools. You can also beef it up with system prompts.

      On that note, the various agent libraries will let you create that same setup.

  • grahac a day ago

    Yep. It is currently a meh experience, as said in the OP, because the UX sucks. The idea is to take a step back and imagine what it could become if those issues are fixed.

    Btw, one of my favorite MCPs is a Whois MCP so I can ask Claude Desktop to brainstorm domain names and then immediately check if they are available :).

    It’s clunky but I am still using it :)

helsinki a day ago

Are there any open-source MCP 'app stores' currently available? I am considering building one for my employer. A place to host internally built MCPs. It would be useful to know if there is something already out there.

  • tuananh a day ago

    I just use an OCI registry to host all my MCP modules (the way I chose to extend my MCP server's capabilities) - WASM plugins.

    An OCI registry is available everywhere and is probably already present in your infrastructure. You get to use all the OCI tools/security controls you already have with it.

    To add new tools, you just have to update the config file to include new plugins and restart.

    https://github.com/tuananh/hyper-mcp

  • sirius87 a day ago

    This is a registry I know of: https://smithery.ai but it's just a listing

    But any self-hosted npm registry backend (e.g. the GitHub npm registry) should serve as a private MCP server registry?

    • fullstackchris a day ago

      Seen a lot with uvx as well (apparently a package manager for Python, but I try to stay as far away as possible from Python).

wkat4242 a day ago

The Master Control Program has no future!

taytus 9 hours ago

The future? MCP is fighting even to have a present.

  • nilslice 4 hours ago

    ah, the classic "guy who depends on people viewing webpages doesn't like tech that obviates the need to view webpages" response!

relistan 14 hours ago

My experience so far with MCP is also that this is not ready yet. There are a few working use cases, but generally this stuff is pretty raw.

kristjank a day ago

MCP tries too hard to signal "XHR for AI", but nobody wants to earnestly deal with the consequences of AI interfacing in a wider context of mis-/disinformation, hallucination and generally letting it talk to stuff in a semi-unprompted manner.

nilslice a day ago

we’ve been building most of what OP has written about with https://mcp.run

We started doing this the day Anthropic released MCP in November last year. Our company has always been devoted to secure plug-in system technology, having built Extism, a WebAssembly plugin framework.

We immediately saw MCP as the plugin system for AI and knew it would be significant, but were concerned about the security implications of running MCP servers from untrusted parties and using the STDIO transport which makes user systems vulnerable in ways we weren’t ok with.

So we built mcp.run which is a secure implementation of the protocol, running servers in fully isolated & portable wasm modules. They must be allow-listed to access files & network hosts, and cannot access any part of your system without your explicit permission.

They also run everywhere. Each server (we call them servlets) on mcp.run is automatically available via SSE (soon HTTP streaming) as well as STDIO, but can also be embedded directly into your AI apps, no transport needed, and can run natively on mobile!

We are excited about MCP and glad so many are too - but we really need more security-oriented implementations before it’s too late and someone suffers a seriously bad exploit - which could tarnish the ecosystem for everyone.

  • pbronez a day ago

    Cool platform. I got some errors while exploring your website. Searching for tools to use works on mobile but not on desktop for some reason.

    • nilslice a day ago

      we just had a minor outage — sorry about that. It should be fully back online.