Broad knowledge and experience in:
- Big data engineering/processing, business intelligence, and advanced analytics
- ETL/ELT design and processes
- Database and data-warehouse modeling
- Cloud-based data engineering and machine-learning models
- Building APIs for application integration
- Software development life cycle (SDLC) processes, e.g. Agile and Waterfall
- Establishing procedures and determining the transformations needed to move structured and unstructured data from source systems into a new physical data model
- Working with data scientists to implement strategies for cleaning and preparing data for analysis, develop data-imputation algorithms, and optimize the performance of big data and machine-learning systems
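The data-movement responsibility above can be sketched as a minimal extract-transform-load pass. This is an illustrative example only; the table names, columns, and cleaning rules are hypothetical, and it uses SQLite purely so the sketch is self-contained.

```python
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL sketch: source table and schema are hypothetical."""
    # Extract: read raw records from the (hypothetical) source table.
    rows = conn.execute("SELECT id, name, amount FROM raw_orders").fetchall()

    # Transform: normalize names and drop records with missing amounts.
    cleaned = [
        (rid, name.strip().title(), round(amount, 2))
        for rid, name, amount in rows
        if amount is not None
    ]

    # Load: write into the new physical data model (target table).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)
```

In a real pipeline the same three stages would typically run against a warehouse such as BigQuery or Snowflake via an orchestration tool, with the transformation rules agreed with the data science team.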
Above-average skills in:
- Big data engineering and processing on the Google Cloud tech stack (BigQuery, Bigtable, Cloud Functions, Spark, Cloud Storage, etc.)
- Cloud data engineering experience on GCP/AWS/Azure is a plus
- Conceptual knowledge of ETL/ELT design and processes (Apache NiFi and Snowflake are a plus)
- Algorithms and data structures (appropriate use of data structures and how they appear in common algorithms)
- Cloud tools: object storage (GCS), Kubernetes (GKE), and databases (BigQuery)
- Solid understanding of databases (Oracle, MSSQL, etc.), both SQL and NoSQL
- Programming skills in Python, Java, etc.
- Effective communication, with analytical, logical, and problem-solving skills
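The algorithms-and-data-structures expectation above can be illustrated with a small sketch: a hash join, which uses a dictionary index to pair records in O(n + m) rather than O(n × m), the same strategy a database engine applies to an equi-join. The record layout here is invented for the example.

```python
def hash_join(left, right, key):
    """Join two lists of dicts on `key` using a hash index.

    Builds a dict over `right` once, then probes it for each row of
    `left` -- the classic hash-join strategy used by SQL engines.
    """
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    return [
        {**l, **r}              # merge matching left/right rows
        for l in left
        for r in index.get(l[key], [])
    ]
```

Choosing the dictionary here (constant-time lookup) over a nested scan of two lists is exactly the kind of data-structure judgment the bullet refers to.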
Demonstrated ability to:
- Work in a dynamic, fast-paced environment
- Be self-motivated and work under minimal supervision
- Adapt to new technologies and learn quickly
- Bring a strong passion for data and information, along with strong analytical, problem-solving, and organizational skills
- Work in cross-functional groups with diverse interests and requirements toward a common goal
- Communicate well with distributed teams (written, verbal, and presentation)