Who we are: We are building a blockchain data analytics platform designed to deliver affordable business intelligence to small and medium-sized businesses. The engine employs correlational algorithms that integrate enterprise data and consumer behavioural data to meet the ever-changing requirements of the business landscape.
Why we do it: The world churns out more than 2.5 quintillion bytes of data every day, yet less than 1% of it is ever analysed and used. The demand for analysed data far exceeds the supply. As consumption patterns evolve rapidly with technological disruption, relying on traditional means of tracking consumer data may no longer be enough to support important decision-making. The amount of data generated continues to grow so quickly that businesses and organisations must adapt fast or risk becoming obsolete.
How we do it: Backed by artificial intelligence and learning capabilities, the engine aims to track, gather, analyse and correlate different types of data — enterprise data, online marketing performance data, third-party reports and information, and more — into a single smart dashboard that empowers businesses, especially small and medium enterprises (SMEs) looking to grow their internal capabilities cost-efficiently.
About the role
DataVLT is an affordable and secure Data Analytics as a Service solution that focuses on what matters most: automating the hard work of data preparation and analysis for companies through our platform.
Requirements:
1. Proficient with containerization and orchestration solutions such as Docker, Kubernetes or OpenShift.
2. Administrative experience across various Linux distributions.
3. Proficient with tools and techniques for continuous integration and continuous delivery.
4. Strong background in scripting languages (e.g. Bash, Perl, Python).
5. Experience in building and releasing automation tools for production.
6. Possess excellent listening, critical thinking, and analytical skills.
Good to have:
1. Keen explorer of data science and/or blockchain technology.
2. A certified Scrum Master.
Responsibilities:
1. Be involved in the full DevOps lifecycle, from the design, development and testing through to the deployment, maintenance and operations of DataVLT services.
2. Provide knowledge and skills in source code management, change control, configuration management and build/deployment processes.
3. Communicate and collaborate with the data science and development teams in the validation, refinement, articulation and implementation of data science algorithms and software development.
4. Develop and perform automation for testing, deployment and maintenance of infrastructure and applications to ensure that teams maintain a high standard of quality at all points along the pipeline.
5. Ensure that development documentation is maintained and kept up to date.
6. Keep abreast of the latest DevOps practices and methodologies.
7. Provide technical leadership during stakeholder review meetings and offer insights into how to best secure technical success.
Qualifications:
1. Bachelor’s Degree in Computer Science or equivalent practical experience.
2. 3-6 years of relevant work experience.
3. Strong understanding of environment management, release management, code-versioning best practices and deployment methodologies.