It’s that time of the month again: T-SQL Tuesday! This month’s edition is hosted by Kennie Nybo Pontoppidan (blog|twitter) and the topic is “daily database related WTF”. In other words, we get to tell one of the horror stories we encountered in our careers as data professionals.
Let me tell you the story of a small BI project I did a couple of years back. The project involved building a small data mart for the local branch of a very large multinational in consumer goods. The data was gathered by the retailers in the country, who sent the details back to the company for analysis of the sales performance. The data was to be hosted internally at the company, not at the client. However, the server was commissioned but not yet delivered. In the meantime, I would simply start developing on my own machine – which was quite a powerful laptop with 8 cores and 32GB of RAM – and we would migrate the solution once the server arrived.
I worked on the ETL for a couple of weeks and built a Tabular model on top of the data mart. There was quite some data (a couple of gigabytes on disk), but nothing that Tabular couldn’t handle, thanks to the great compression of the in-memory columnar engine.
Finally the server arrived, and a colleague – with more of a DBA profile – migrated everything over to it. I logged in and … everything ground to a halt. Right-click on a database in SSMS? Wait 10 seconds for the context menu to pop up. Process the model? Wait a couple of hours, if it didn’t crash. What happened? Since there was only one server, it hosted both the development and the production environment. But what was even worse? The “server” only had 8GB of RAM (I don’t remember the CPU, but it was nothing to be amazed by). Apparently, the salesperson who closed the deal had also commissioned the server. Without consulting any developer or admin. *le sigh*
When your laptop is more powerful than the server, it is not a server.