Boston's Unruly Streets and What They Teach Us
Driving through Boston feels like trying to solve a maze written in a foreign language. The streets twist around corners, bend at odd angles, and converge at traffic circles that seem invented for the city's own amusement. Most of us have spent a few moments wondering why anyone would design such a confusing network. A popular explanation attributes the layout to old cow paths that once crossed what is now the Boston Common. The idea that a city could end up with a street network that defies navigation seems absurd, yet the pattern persists, and the story has a strange resonance for anyone who has worked on a complex project that ends up looking like an antiquated relic.
When I first heard the tale of Boston's winding roads, I laughed at the absurdity of a city built to trip up its drivers. But a second look - at the way streets meet at unexpected angles, at how they loop back onto themselves - revealed something deeper. A city is not merely a collection of asphalt; it is a living organism shaped by history, geography, and the people who build it. Boston’s cobblestones and hidden roundabouts are relics of earlier times, simply carried forward as the city grew. In a sense, the streets became a kind of time capsule, preserving the irregularities of the past even as the present demanded more order.
The same phenomenon shows up in technology projects. In the early days of PCs, a word processor called MultiMate was released that deliberately mimicked the look and feel of an older Wang word processor. The developers weren't trying to create something novel; they wanted to give typists who already knew Wang’s interface a smooth transition to a cheaper PC environment. The result was a tool that looked like the old one but worked on new hardware. The design choice made sense at first glance, but it also locked the software into a visual pattern that soon fell behind evolving user expectations. In many ways, MultiMate became a modern incarnation of Boston’s confusing streets: familiar, but not intuitive.
Just as Boston’s roads force drivers to adjust, systems that borrow heavily from legacy designs force users to set aside the workflows they expect from modern tools. This is not a matter of nostalgia; it is about how design choices create friction. A city that forces drivers to navigate a maze does not encourage exploration; it discourages efficient movement. Similarly, a system that feels like a relic pushes users to fight the software rather than embrace it. The underlying principle is that the past should inform, not dictate, present design. A legacy mindset can become a bottleneck, just as a historic street layout can become an obstacle to modern traffic.
There are a few things we can learn from Boston’s streets. First, the design of a system - or a city - should prioritize ease of use for the current population. Second, historical influences must be carefully weighed against future growth. Finally, the presence of a legacy system does not mean the new system must copy it. In fact, setting that history aside and starting from a clean slate can often save time and resources in the long run. The story of Boston’s winding roads serves as a reminder that what is inherited may be more hindrance than help.
When engineers look back at a project that looks like an old template, they often ask: Why did we design it that way? The answer usually circles back to the same reasons that made Boston’s streets confusing: a desire to keep the old familiar, a need to preserve backward compatibility, and a hope to minimize user retraining. Recognizing these motivations early can prevent the same pitfalls from emerging in modern software. If we learn from Boston’s past and let it inform our present, we can create cities - and systems - that feel fresh, intuitive, and truly fit the needs of the people who use them.
When New Systems Wear Old Skin: Lessons for IT Projects
In the world of software development, vestiges of older systems are a common sight. Developers sometimes build new applications that look and feel like their predecessors, especially when the new platform has to support legacy standards. This approach often seems practical, but it can create unnecessary complications for users and maintainers alike. The core of the issue lies in three intertwined motivations: backward compatibility, user education, and the way requirements are gathered.
Backward compatibility is a powerful driver. When a new system has to communicate with an older one, developers might decide to keep the same interfaces, naming conventions, and even visual cues. The advantage is immediate: existing users can plug the new tool into their workflows without a relearning curve. However, the cost is the entanglement of old and new paradigms. Maintaining support for outdated formats adds complexity to code, increases the risk of bugs, and often forces future upgrades to cling to legacy constraints. In many cases, a clean slate that embraces modern standards - such as RESTful APIs instead of SOAP - provides greater flexibility and scalability, even if it requires an initial learning effort from users.
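One common middle path between clinging to a legacy interface and breaking it outright is to confine the old conventions to a single boundary layer, so the rest of the new system speaks a modern vocabulary. The sketch below illustrates that idea with an adapter; every class, method, and field name in it is hypothetical, invented purely for illustration.

```python
# A minimal sketch of isolating a legacy interface behind an adapter.
# The legacy field names cross the boundary in exactly one place,
# instead of leaking into every caller. All names here are hypothetical.

class LegacySoapClient:
    """Stand-in for an old client with rigid, legacy-style conventions."""
    def get_customer_record(self, customer_id):
        # Imagine this marshaled an XML envelope to an aging endpoint.
        return {"CustID": customer_id, "CustName": "Acme Corp"}

class CustomerAPI:
    """Modern-facing adapter: clean names, plain data structures."""
    def __init__(self, backend):
        self._backend = backend

    def customer(self, customer_id):
        raw = self._backend.get_customer_record(customer_id)
        # Translate legacy field names at the boundary, once.
        return {"id": raw["CustID"], "name": raw["CustName"]}

api = CustomerAPI(LegacySoapClient())
print(api.customer(42))  # {'id': 42, 'name': 'Acme Corp'}
```

The payoff is that when the legacy endpoint is finally retired, only the adapter's internals change; new code written against the clean interface is untouched.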
Minimizing user education is another common justification. The logic is simple: if users are already comfortable with a familiar interface, they will transition smoothly to the new software. MultiMate is a perfect example; it borrowed the Wang look to attract typists who had spent years mastering the older system. The tradeoff, though, was that the new platform inherited an outdated UI that did not match modern expectations. Users had to accept a clunky experience, which could lead to frustration and decreased productivity. When designers weigh this factor, they should ask whether a short training period is truly cheaper than designing an intuitive interface from the start. Often, the answer is that a well-designed modern UI pays off more in the long run.
The third driver is the way requirements are gathered. Some teams spend months collecting statements from users, hoping to capture every need. Unfortunately, most users cannot articulate what they truly require - they know what they do, not what they want to achieve. The result is a flood of “requirements” that are inconsistent or contradictory. A more effective approach is to treat early project phases as a consensus-building exercise. By involving users in design decisions and prototyping, teams can surface real priorities and weed out unnecessary features that merely echo the past. This collaborative process turns requirements from a static list into a dynamic dialogue that evolves with the project.
Vestiges are not always harmful. When a new system genuinely builds upon proven functionality - such as reusing a secure authentication framework - it makes sense to retain that core. The key is intentionality: every legacy element must be evaluated against the present goals of the project. Ask whether it serves a critical purpose, or if it merely adds baggage. If the answer is the latter, consider refactoring or replacing it. This mindset prevents the accidental inheritance of old roadmaps, ensuring that the final product feels fresh and purposeful.
When planning a new system, it helps to remember that the most valuable feature is often the absence of unnecessary complexity. A well-thought-out design can eliminate the need for users to relearn or to navigate confusing structures. If the goal is to serve clients and end-users, the best strategy is to start with a blank canvas that reflects current needs, not a dusty copy of a past solution. In doing so, developers avoid laying out a digital “Boston” that will frustrate the very people they aim to help.
Paul Glen is an IT management consultant and the author of the award‑winning book “Leading Geeks: How to Manage and Lead People Who Deliver Technology” (Jossey‑Bass Pfeiffer, 2003). He frequently speaks at corporations and national associations across North America. For more information, visit Paul Glen's website or email him at