ETL Best Practices

Sushant Kumar Jha
2 min read · Sep 3, 2021

Happy Reading To All

In this story, we will talk about a few points we should understand before designing any ETL pipeline.

ETL Process Workflow

Analyse Source Of Data

The very first questions you should ask are: what type of data is it, and what is its volume?

Data can come from very different places: a SaaS application, media storage, or almost anywhere else, and it may arrive in batches or in real time.

Many more such questions need to be answered before you start designing.
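To make those questions concrete, here is a minimal sketch in Python (using pandas; the file path and columns are hypothetical) that profiles a batch source for volume, schema, and completeness before any pipeline design starts:

```python
import pandas as pd

def profile_source(path: str) -> dict:
    """Summarise the volume, schema, and completeness of a batch extract."""
    df = pd.read_csv(path)
    return {
        "rows": len(df),
        "columns": list(df.columns),
        "dtypes": df.dtypes.astype(str).to_dict(),
        "null_counts": df.isna().sum().to_dict(),
    }

print(profile_source("exports/orders_2021-09-03.csv"))  # hypothetical source file
```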

Solving Data Issues

Data is the biggest asset for any company today, so processing it with the utmost care is essential.

Thus, it is good practice to solve any data issues that arise in one run of the ETL cycle so that they don't repeat themselves in the next cycle.

Some ways of doing this are: adding autocorrect tasks for predictable errors, adding data validation constraints, and talking to source partners if the error persists.

Analysing the source, type, and delivery method of the data helps you plan which errors are predictable enough to handle with autocorrect tasks.
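As an illustration, here is a minimal Python sketch of an autocorrect task plus a validation constraint, assuming pandas and hypothetical column names and rules:

```python
import pandas as pd

def autocorrect(df: pd.DataFrame) -> pd.DataFrame:
    """Fix errors we have seen before and expect to see again."""
    # Predictable error: stray whitespace and inconsistent casing in country codes.
    df["country"] = df["country"].str.strip().str.upper()
    # Predictable error: amounts exported with a comma as the decimal separator.
    df["amount"] = pd.to_numeric(
        df["amount"].astype(str).str.replace(",", ".", regex=False),
        errors="coerce",
    )
    return df

def validate(df: pd.DataFrame) -> None:
    """Constraints that should fail the run instead of silently loading bad rows."""
    assert df["order_id"].notna().all(), "order_id must not be null"
    assert df["amount"].ge(0).all(), "amount must be non-negative"
```

If validation keeps failing on the same field, that is the signal to go back to the source partner rather than add more autocorrection.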

ETL Logging

An ETL process cannot be designed with a cookie-cutter approach; every business is different and requires a unique solution. Maintaining proper logs helps in making these choices and tailoring your ETL process.

In short, it means documenting events that occur before, during, and after the ETL process.
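For example, here is a minimal sketch using Python's standard logging module that records when each step starts, finishes, or fails (the step runner and log file name are illustrative):

```python
import logging
import time

logging.basicConfig(
    filename="etl.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
log = logging.getLogger("etl")

def run_step(name, func):
    """Run one ETL step and document when it started, finished, or failed."""
    log.info("step started: %s", name)
    start = time.time()
    try:
        result = func()
        log.info("step finished: %s in %.1fs", name, time.time() - start)
        return result
    except Exception:
        log.exception("step failed: %s", name)
        raise
```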

Auditing

Auditing is used to ensure that the ETL process is going on as desired.

ETL auditing lets you make sure that there are no abnormalities in the data even when there are no errors.
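One common way to do this is to reconcile row counts and simple aggregates between source and target. Here is a minimal sketch, assuming DB-API style cursors and a hypothetical orders table:

```python
AUDIT_QUERY = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM orders"  # hypothetical table

def audit(source_cursor, target_cursor) -> None:
    """Fail loudly if source and target totals disagree, even when no step errored."""
    source_cursor.execute(AUDIT_QUERY)
    target_cursor.execute(AUDIT_QUERY)
    src, tgt = source_cursor.fetchone(), target_cursor.fetchone()
    if src != tgt:
        raise ValueError(f"Audit failed: source={src}, target={tgt}")
```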

Modularity

Modularization is the process of abstracting ETL processes into smaller reusable blocks.

This helps in simplifying the process and reusing a single block of code for multiple processes.

This can be done by breaking the code into a number of small functions while also leveraging concepts from object-oriented programming. It makes unit testing easier and establishes a standard that every process must follow.
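A minimal sketch of this idea, with illustrative function names rather than any particular framework's API:

```python
from typing import Iterable

def extract(path: str) -> Iterable[dict]:
    """Read raw rows from a source file (details omitted)."""
    ...

def transform(rows: Iterable[dict]) -> Iterable[dict]:
    """Apply cleaning rules; kept pure so it is easy to unit test in isolation."""
    for row in rows:
        yield {**row, "country": str(row.get("country", "")).strip().upper()}

def load(rows: Iterable[dict], table: str) -> None:
    """Write rows to the target table (details omitted)."""
    ...

def run(path: str, table: str) -> None:
    """The orchestrator only composes the reusable blocks."""
    load(transform(extract(path)), table)
```

Because each block has a single responsibility, the same transform can be reused across pipelines and tested without touching the source or target systems.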

Alerting

Setting up an alert system in case of an error is a good practice. It helps you correct the error immediately. This is especially important in case of unauthorized access or any other security breach.
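Here is a minimal sketch of this, assuming a hypothetical chat webhook as the notification channel (swap in email, PagerDuty, or whatever your team uses):

```python
import logging
import requests

ALERT_WEBHOOK = "https://hooks.example.com/etl-alerts"  # hypothetical endpoint

def run_with_alerting(run_fn) -> None:
    """Run the pipeline and notify the team immediately if it fails."""
    try:
        run_fn()
    except Exception as exc:
        logging.exception("ETL run failed")
        requests.post(ALERT_WEBHOOK, json={"text": f"ETL run failed: {exc}"}, timeout=10)
        raise
```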

Optimizing ETL Solution

This involves using parallel processes wherever possible. Ensuring that your hardware can handle the ETL process, and capturing each run's execution time so you can compare runs periodically, are some other practices you can follow. Simple changes like disabling check and foreign key constraints during the load, or handling triggers as a separate task, can reduce the running time of an ETL cycle.
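Here is a minimal sketch of two of these ideas together: extracting independent sources in parallel with a thread pool and capturing each run's elapsed time so runs can be compared (source names and the extract function are placeholders):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def extract_source(name: str) -> int:
    """Placeholder extraction for one independent source; returns a row count."""
    time.sleep(1)  # stand-in for real I/O work
    return 1000

start = time.time()
sources = ["orders", "customers", "payments"]  # hypothetical independent sources
with ThreadPoolExecutor(max_workers=len(sources)) as pool:
    row_counts = list(pool.map(extract_source, sources))
elapsed = time.time() - start
print(f"extracted {sum(row_counts)} rows in {elapsed:.1f}s")  # record and compare per run
```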

Conclusion

I have discussed a few best practices that can help make designing your ETL pipeline easier.
