This is an open‑source Model Context Protocol (MCP) server that gives any LLM a sense of the passage of time.
Most MCP demos wire LLMs to external data stores. That’s useful, but MCP is also a chance to give models perception — extra senses beyond the prompt text.
Six functions (`current_datetime`, `time_difference`, `timestamp_context`, etc.) give Claude/GPT real temporal awareness: the model can spot pauses, reason about rhythms, and even label a chat’s “three‑act structure”. It runs locally in under 60 seconds (Python) or via a hosted demo.
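To make the pattern concrete, here is a rough sketch of two of those tools written against the MCP Python SDK’s FastMCP helper. The tool names come from the post above; the bodies are my own illustrative guesses, not the project’s actual implementation.

```python
# Hedged sketch: two of the six tools, registered with the MCP Python SDK's
# FastMCP helper. Tool names come from the post; bodies are illustrative only.
from datetime import datetime, timezone

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("time-awareness")


@mcp.tool()
def current_datetime() -> str:
    """Return the current date and time in ISO 8601 (UTC)."""
    return datetime.now(timezone.utc).isoformat()


@mcp.tool()
def time_difference(start: str, end: str) -> str:
    """Return the gap between two ISO 8601 timestamps in a human-readable
    form, e.g. so the model can notice a long pause between messages."""
    delta = datetime.fromisoformat(end) - datetime.fromisoformat(start)
    minutes, seconds = divmod(int(delta.total_seconds()), 60)
    hours, minutes = divmod(minutes, 60)
    return f"{hours}h {minutes}m {seconds}s"


if __name__ == "__main__":
    mcp.run()  # serves the tools over stdio by default
```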
If time works, what else could we surface?

- Location / movement (GPS, speed, “I’m on a train”)
- Weather (rainy evening vs clear morning)
- Device state (battery low, poor bandwidth)
- Ambient modality (user is dictating on mobile vs typing at desk)
- Calendar context (meeting starts in 5 min)
- Biometric cues (heart‑rate spikes while coding)
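As a purely hypothetical example of how one of these might be surfaced, a “device state” sense could be exposed as just another tool on the same kind of FastMCP server. Nothing like this exists in the repo; the `psutil` call is only one way to source the signal on a laptop.

```python
# Hypothetical: a "device state" sense exposed as an MCP tool. Not part of
# the project; psutil is a third-party package (pip install psutil).
import psutil

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("device-sense")


@mcp.tool()
def device_state() -> str:
    """Report coarse device context (battery level, power source) so the
    model can adapt, e.g. keep answers short when the battery is low."""
    battery = psutil.sensors_battery()
    if battery is None:
        return "battery: unknown (no battery sensor)"
    source = "AC power" if battery.power_plugged else "battery"
    return f"battery: {battery.percent:.0f}%, running on {source}"


if __name__ == "__main__":
    mcp.run()
```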
Curious what other signals people think would unlock better collaboration.
Full back story: https://medium.com/@jeremie.lumbroso/teaching-ai-the-signifi...
Happy to discuss MCP patterns, tool discovery, or future “senses”. Feedback and PRs welcome!
The submitter made a basic MCP function that returns the current time, so... Claude knows the current time. There is nothing about sundials and Claude didn't somehow build a calendar in any shape or form.
I thought this was something original or otherwise novel, but it's not: it's not complex or even moderately challenging code, and it didn't result in anything surprising. It's just a clickbaity title.