Stirring the Pot with my First Principles

Today I thought I'd stir up a little controversy by comparing my First Principles (things I know to be true) about building high-performance, scalable systems (software, compute, storage, and networking) with current conventional wisdom.

Conventional wisdom among most architects today states that, if you want a high-performance, scalable system, your architecture must follow a microservices pattern, use containers like Docker, capture data in an event-streaming platform like Kafka, persist data in a NoSQL database, be orchestrated by something like Kubernetes or Mesosphere, and run on Linux.

So what is something I know to be true from the dot com era?

Back then I was fortunate enough to be part of a team that built an energy trading platform that allowed multiple counterparties to buy and sell financial instruments (think NASDAQ). This platform was built on 32-bit Windows 2000 Server with some kind of Intel Pentium CPU. Classic Active Server Pages would send and receive XML fragments over HTTPS to and from brokers at every major energy company, plus NYMEX and the Intercontinental Exchange (ICE), following a RESTful pattern. Incoming data was queued in MSMQ, and free-threaded objects pooled inside Microsoft Transaction Server (MTS) moved that data in and out of 32-bit SQL Server 2000. The databases were clustered; Windows got 1 GB of RAM while SQL Server got most of the remaining 3 GB. The Internet Information Services (IIS) servers running ASP 3.0 used the built-in Network Load Balancing (NLB) service. When we needed to update the software, we just took the servers out of the cluster one by one to make the update. No biggie.
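The core pattern here, accept a request at the web tier, queue it durably, and let a pool of workers drain the queue into the database, is simple enough to sketch in a few lines. Here is a minimal, illustrative Python version, with `queue.Queue` and `sqlite3` standing in for MSMQ and clustered SQL Server 2000; all names and the message shape are my own inventions, not from the original system:

```python
import queue
import sqlite3
import threading

# Stand-in for MSMQ: a thread-safe in-process queue.
inbox = queue.Queue()

# Stand-in for SQL Server 2000: an in-memory SQLite database.
db = sqlite3.connect(":memory:", check_same_thread=False)
db_lock = threading.Lock()
db.execute("CREATE TABLE trades (counterparty TEXT, instrument TEXT, qty INTEGER)")

def receive(msg):
    """Web tier (the ASP role): accept the parsed message, enqueue it,
    and return immediately -- persistence happens asynchronously."""
    inbox.put(msg)

def worker():
    """Pooled component (the MTS role): drain the queue into the database."""
    while True:
        msg = inbox.get()
        if msg is None:          # sentinel -> shut down
            break
        with db_lock:
            db.execute("INSERT INTO trades VALUES (?, ?, ?)",
                       (msg["counterparty"], msg["instrument"], msg["qty"]))
            db.commit()
        inbox.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# Two hypothetical incoming trades (already parsed from their XML fragments).
receive({"counterparty": "NYMEX", "instrument": "NG-2001-03", "qty": 100})
receive({"counterparty": "ICE", "instrument": "WTI-2001-04", "qty": 50})

inbox.join()                     # wait until the queue is fully drained
inbox.put(None)                  # stop the worker
t.join()

rows = db.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
print(rows)  # 2
```

The point of the sketch is the shape, not the parts: the web tier never waits on the database, so load spikes pile up in the queue instead of in open HTTP connections, and a small, fixed worker pool keeps the database write path predictable.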

So what’s the takeaway from all this? As someone who has played a prominent role in the Internet of Things space over the years, I struggle to find modern systems that have to deal with the transactional and analytical load that I witnessed with the system I helped build 20 years ago. In spite of all the promise and talk, I’ve yet to see any IoT system that deals with a fraction of the load the world’s financial systems handle effortlessly with their antiquated architectures and relational databases.

Why were we able to do so much more with so much less back then?

Remember folks, we're just flowing current through a gate to establish a high or low voltage at a particular point in a circuit. The farther you abstract yourself away from this, the more resources your system will require and the slower it will get. Don't be impressed by architectural diagrams with hundreds of lines, boxes, arrows, and triangles going every which way. Complexity kills. New programming languages, frameworks, and architectural patterns come along all the time. Use your best judgment and fall back on your own first principles before deciding to jump on the next bandwagon.

Tom Mills

Architect at Microsoft Industry Solutions Delivery (Retired)

4y

I thought you'd use a picture of a pot of queso to illustrate this post. :)

Douglas Bodden

Software Engineer - All opinions are my own and do not reflect those of my employer or colleagues.

4y

It wouldn't kill the "enterprise" folks to work on constrained and embedded systems once in a while.

Rabi Satter

Research and development of technology in Web3, cryptocurrency, mobile, WASM, and software development.

4y

I am amazed at the lack of understanding that when you add something to a system, it adds overhead. The reward for that overhead had better be multiples, or even orders of magnitude, of value. These days it seems people are more interested in adopting technology because it is new and hip. It is not based on data.
