RAIDIX production technology


By Alexander Lysenko

In 9 cases out of 10, the value of stored data exceeds the cost of the storage system itself several times over. Developing such products implies high responsibility and a commitment to reliable, stable and safe data processing. What does RAIDIX do to keep up with these high standards? We adhere to a well-defined production workflow. Read on for the details!

How did we get to the current production technology? Four years ago we decided to revamp our approach to quality and stability. This decision gave birth to what we now know as the RAIDIX 4.x product line, which was largely developed from scratch. Our development process went through a fundamental change.

Before the overhaul, we were doing the right things, including code verification and thorough quality assurance, yet we were not happy with the outcome. Once we reached the testing phase, we fell hopelessly behind schedule, since up to 80% of project time was consumed by testing and bug fixing. At a certain point, we sat down to analyze the situation, and here’s what we discovered.

What caused the delay?

  1. Complexity and high level of code coupling. Any modification made in a specific module reverberated in other modules, triggering unexpected issues.
  2. Long bug lifecycle. Bugs were passed back and forth between QA and development, from verification back to bug fixing.
  3. Lengthy regression testing.

Suggested course of action

  1. Rewrite the business logic to avoid module coupling and simplify the code.
  2. Employ tools to improve product quality: mandatory code review before trunk check-ins, CI, unit tests, etc.
  3. Grow the base of automatic tests to boost regression testing.
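To give a flavor of the second and third points, here is a minimal sketch of the kind of unit test that decoupled modules make cheap to write and that CI can run on every trunk check-in. The `stripe_to_disk` helper is purely hypothetical and not from the RAIDIX codebase; it stands in for any small, isolated piece of business logic.

```python
import unittest

# Hypothetical helper -- NOT from the RAIDIX codebase. It stands in for
# any small, decoupled unit of logic that the rewritten modules expose.
def stripe_to_disk(stripe_index: int, disk_count: int) -> int:
    """Map a logical stripe index to a disk number (simple round-robin)."""
    if disk_count <= 0:
        raise ValueError("disk_count must be positive")
    return stripe_index % disk_count


class StripeMappingTest(unittest.TestCase):
    """A fast, self-contained check that a CI job can run on every check-in."""

    def test_round_robin_wraps_around(self):
        # The mapping cycles through all disks and wraps back to disk 0.
        self.assertEqual(stripe_to_disk(0, 4), 0)
        self.assertEqual(stripe_to_disk(3, 4), 3)
        self.assertEqual(stripe_to_disk(5, 4), 1)

    def test_rejects_invalid_disk_count(self):
        # Defensive check: an empty array is a programming error.
        with self.assertRaises(ValueError):
            stripe_to_disk(3, 0)
```

Tests like this one can be collected with `python -m unittest` and run automatically as part of the regression suite, which is exactly what shortens the QA-to-developer loop described below.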


What came out of it

  1. We managed a complete code rewrite in the course of 12 months. The brand-new code had far less coupling and left older issues and workarounds in the past. The 4.x architecture allowed us to significantly extend functionality without jeopardizing existing features, restore planning accuracy and shorten the testing timeframes.
  2. Thanks to mandatory code review, CI and unit tests, new features and bug fixes triggered fewer iterations between QA and developers.
  3. After introducing auto-tests, we automated around 10% of our test cases. The bottom line: testing got far less time-consuming and now requires less direct QA intervention than before.

As RAIDIX software, development tools and approaches evolve, we adjust to the new reality and tailor our production technology to the new requirements. Stay tuned for more insights into RAIDIX software engineering and share your feedback!