Interview with Josef Mušek, CTO at BizMachine

31.5.2021

3 min reading time

The Prospector platform aims to meet a wide range of client needs through a few core principles, such as data separation, tags, and hierarchy. Josef Mušek, CTO at BizMachine, talked about this and more in our interview.

What technologies do you use for development and why?

Prospector consists of two main components: an integration API and the web interface that users interact with. The integration API is written in C# on Microsoft .NET, and the web interface is built with React (a JavaScript library). The two components are developed with different tools, and our developers work on both Windows and Mac. On top of that, production versions are deployed to the Microsoft Azure cloud. All of this puts a premium on universality, so that both components can share as many common tools as possible. The Microsoft cloud makes many of our activities easier, but, as is typical with cloud development, it also required changes in our approach to development.

How has your approach to development changed?

At BizMachine, we believe that automation is inevitable in the twenty-first century if we want to keep pace with what other successful companies are doing.

How do you deploy and release new versions of the application?

Over the past two years, BizMachine has undergone a fairly significant shift in approach. For development itself, we continue to use standard tools like Microsoft Visual Studio, Visual Studio Code, and Git. But we’re trying to leverage automation possibilities much more than in the past.

Two years ago, our development was heavily dependent on the human factor: a developer made a change, then pushed it to the repository, built it, and deployed it, all by hand. Nothing happened without a developer. They had great freedom, but as the saying goes, with great power comes great responsibility. That enormous flexibility was both an advantage and a disadvantage. Developers had little reason to think deeply about the impact of a given change, and typically only the local impact was tested, not the interaction with other components. Due to time constraints, it often wasn't possible to test everything across different environments. The human factor proved to be a major disadvantage especially in reproducing processes: people are fallible, and repeating the same steps in the same sequence with the same parameters is nearly impossible. Developers don't enjoy it much, either. They could also afford to go a bit "wild", for example changing a framework version and affecting the behavior of something unrelated to their change. Nothing could be done, planned, or prepared in advance without human intervention. Every environment had to be changed manually, and when we needed to deploy, someone had to sit down and do it, sometimes at ten in the evening.

Today, when a developer makes a change, they push it to a server where all changes are collected. The server builds the binaries, and Docker takes over: an automated process takes the resulting container and deploys it to the first environment. There it waits for human interaction, and once testing confirms everything is in order, the same container is automatically promoted to the next environment, then the one after that, as needed. The developer has to describe the deployment process in a script once, but repeated runs are fully automated, and we can schedule them at any time, even outside working hours. Deployment typically uses the same script across different environments, so when we test functionality on the pre-production staging environment and it's fine, we have high confidence it will also work in production and don't need to watch the process to the end. The script can also include automatic checks that roll the deployment back if an error occurs.

Another advantage of automation is a much lower error rate: unlike a human, a well-written script never makes a mistake. We can replicate processes far more easily, and combined with Docker we know exactly where everything is and in what version, including dependencies. If there's a problem with a deployed version, we can even run the same container locally to verify its behavior while debugging.
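The flow described above (build once, then promote the same container through environments with a manual gate) can be sketched as a multi-stage pipeline. This is a hypothetical sketch, not BizMachine's actual configuration: the registry, image name, script name, and environment names are all assumptions, and in Azure DevOps the manual approval would be attached to the named environment.

```yaml
# Illustrative Azure Pipelines definition; all names are placeholders.
trigger:
  branches:
    include: [main]

stages:
  - stage: Build
    jobs:
      - job: BuildImage
        steps:
          # Build the container once; the same image is reused in every stage below.
          - script: docker build -t myregistry.azurecr.io/prospector-api:$(Build.BuildId) .
            displayName: Build container image
          - script: docker push myregistry.azurecr.io/prospector-api:$(Build.BuildId)
            displayName: Push image to registry

  - stage: Staging
    dependsOn: Build
    jobs:
      - deployment: DeployStaging
        environment: staging          # approvals/checks can be configured on this environment
        strategy:
          runOnce:
            deploy:
              steps:
                - script: ./deploy.sh staging $(Build.BuildId)

  - stage: Production
    dependsOn: Staging                # runs only after staging succeeds and is approved
    jobs:
      - deployment: DeployProduction
        environment: production
        strategy:
          runOnce:
            deploy:
              steps:
                # Same script, different target environment.
                - script: ./deploy.sh production $(Build.BuildId)
```

Because both stages invoke the same script with only the environment name changing, a successful staging run gives high confidence that the production run will behave identically.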

You mentioned Docker container services. What’s your experience with them?

We’re currently in the implementation phase of this technology. We’re trying to prepare all the steps to get there as soon as possible. The first step is to create an output package from our build (a set of libraries and exe files) and create a Docker image from it. Until now, we’d release a package and run it directly on the target environment. Instead, we now want to release Docker images and run container instances based on the released image in the target environment.
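The step described here, turning the build output into a Docker image, is typically expressed as a Dockerfile. The sketch below is illustrative only: the project name and the .NET 5 base images (current at the time of this interview) are assumptions, not BizMachine's actual setup.

```dockerfile
# Illustrative multi-stage build; project and image names are hypothetical.
# Stage 1: compile and publish the application (the "set of libraries and exe files").
FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish Prospector.Api.csproj -c Release -o /app/publish

# Stage 2: copy only the published output into a small runtime image.
FROM mcr.microsoft.com/dotnet/aspnet:5.0
WORKDIR /app
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "Prospector.Api.dll"]
```

The resulting image is the released artifact; the target environment then runs a container instance from it instead of executing the package directly.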

"We started with Azure Cloud-specific technologies and are now moving toward cloud-agnostic open source technologies like Docker and Kubernetes — technologies that can be used both in the cloud and locally."

Josef Mušek, CTO at BizMachine

What about cloud development? How does it help you?

Nowadays, almost everything is in the cloud. Interestingly, many developers still have no practical experience with cloud development; they perceive it as something magical or even difficult. But it's a trend that all companies are gradually following, because when they don't have to worry about infrastructure, they have more time to focus on what matters: their product. At BizMachine, we try to use cloud-agnostic technologies that work across different cloud platforms as well as on-premises, for example the Docker container platform. On top of it runs the Kubernetes orchestrator, originally developed by Google and, importantly, an open source project. Thanks to the unified interface, applications are deployed to different environments in the same way, whether in the cloud or locally. In general, this gives us a different perspective on specific problems. Memory usage is a good example: on your own hardware it's a hard limit you must design around, but in the cloud you can request enormous amounts, and the limitation instead becomes the duration for which you need it.
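The unified interface mentioned above means an application is described once, in a Kubernetes manifest, and applied identically to a managed cloud cluster or a local one. A minimal sketch, with the image name, replica count, and memory figures chosen purely for illustration:

```yaml
# Illustrative Kubernetes Deployment; image and resource values are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: prospector-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: prospector-api
  template:
    metadata:
      labels:
        app: prospector-api
    spec:
      containers:
        - name: prospector-api
          image: myregistry.azurecr.io/prospector-api:1.0.0
          resources:
            requests:
              memory: "256Mi"   # scheduler guarantees at least this much
            limits:
              memory: "512Mi"   # in the cloud, the constraint shifts from hardware to duration
```

The same `kubectl apply -f deployment.yaml` works against Azure's managed Kubernetes, Google's, or a cluster running on a developer's laptop.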

Cloud development and deployment, and automation — how exactly are they related?

For developers, cloud experience itself is a major advantage: once they master development on Microsoft's Azure Cloud and later move to a company using Google Cloud, the internal processes will differ, but the set of capabilities will be similar, limited only minimally by company policies. They don't need to learn a stack of additional technologies. And when they know a new application will be released as a Docker container, they effectively know everything about the target environment, its inputs and its outputs; they don't need to care whether it will run on Azure or Google.

What advantages does the new development approach bring to your clients?

Of course, we try to make any development change either imperceptible or positive for the client — for example, by speeding up the web interface response time. Currently, everything mentioned above allows us to deploy new versions with zero downtime, add new features, and serve more clients simultaneously. What we do in Prospector development, we also apply to varying degrees in other types of work, such as data preparation or client projects. In this regard, open source technologies help us improve the reusability of solutions as a whole or their parts, increase data quality, and be more efficient when preparing new projects.

Anna Evans, article author

Anna Evans is the Head of Marketing at BizMachine, leading marketing strategy and execution across the Czech and Central European markets. She specializes in B2B positioning, sales enablement, and data-driven marketing. At BizMachine, she bridges the gap between data, technology, and go-to-market strategy to help sales and marketing teams find and win the right customers.