Bright Coders' Factory lives up to its name: our software sits at the heart of global companies. We provide customers with state-of-the-art technologies, and our continued growth is confirmed by the Forbes Diamond and Great Place to Work awards.
We write code to make people's lives easier. At BCF, you will find your place and see that your work matters. Our portfolio spans projects from more than 15 industries, so whatever your preferences and career stage, we will find the right one for you.
Our client is embarking on a significant transformation project aimed at replatforming its financial system. Two initiatives will run in parallel: one covering all financial data and a second covering all non-financial data. The project is in its early stages, with the ambitious goal of consolidating data from various sources into a single Delta Lake data warehouse.
Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
Solid understanding of the software development life cycle.
Proven experience with Python, Spark, and SQL for data engineering and analysis (an illustrative sketch follows this list).
Strong data analysis skills, with the ability to interpret and manipulate large datasets.
Experience with Databricks for data ingestion and transformation.
Familiarity with Delta Lake and data warehousing concepts.
Excellent problem-solving skills, with the ability to see the big picture and understand the broader context of assigned tasks.
Strong collaboration skills and a demonstrated ability to work effectively in a team-oriented environment.
Knowledge of reporting tools like Power BI.
Knowledge of Data Science concepts is a plus.
Strong communication, interpersonal and presentation skills.
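To give a flavor of the Python/Spark/SQL skill set listed above, here is a minimal sketch of a routine transformation. It is not taken from the client's codebase: the file path, column names, and view name are all hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: the path, columns, and view name are illustrative only.
spark = SparkSession.builder.appName("invoice-analysis").getOrCreate()

# Read raw invoice extracts (assumed to be CSV files with a header row).
invoices = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/landing/finance/invoices/")
)

# Basic cleanup: normalize a column name, drop duplicates, stamp ingestion time.
cleaned = (
    invoices.withColumnRenamed("InvoiceAmt", "invoice_amount")
    .dropDuplicates(["invoice_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Register a temp view so the same data can be analyzed with plain SQL.
cleaned.createOrReplaceTempView("invoices_clean")
monthly_totals = spark.sql("""
    SELECT date_trunc('month', invoice_date) AS month,
           SUM(invoice_amount)               AS total_amount
    FROM invoices_clean
    GROUP BY 1
    ORDER BY 1
""")
monthly_totals.show()
```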
Lead the integration and ingestion of data from diverse sources into the Delta Lake platform, ensuring data consistency and availability.
Collaborate with various teams to identify all data sources, including SFTP/flat files and APIs.
Design and implement systematic processes for data ingestion and integration.
Conduct extensive data analysis to support the consolidation of organizational data into a single data warehouse.
Utilize Databricks for data ingestion and reporting, transitioning data from multiple data warehouses to a unified Delta Lake platform (see the ingestion sketch after this list).
Work closely with the development team to understand project scope, gather full requirements, and plan subsequent phases.
Participate in daily standups, sprint planning, backlog refinement, and retrospectives within a scaled scrum framework.
Exhibit flexibility and adaptability in navigating unknowns and evolving project requirements.
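For orientation only, the snippet below sketches one plausible shape of the Databricks-to-Delta-Lake ingestion work described above: a batch read of a legacy warehouse export, followed by an append into a Delta table in the consolidated warehouse. The source path, database, table name, and schema-drift handling are assumptions, not the client's actual design.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical sketch: source path and target table are assumptions. Assumes a
# Databricks notebook or a local Spark session configured with delta-spark.
spark = SparkSession.builder.appName("delta-ingestion").getOrCreate()

# Batch read of a legacy-warehouse export landed as Parquet (e.g., via SFTP).
source_df = (
    spark.read.parquet("/landing/legacy_dwh/customers/")
    .withColumn("_ingested_at", F.current_timestamp())
)

# Append into the unified Delta Lake warehouse. mergeSchema tolerates the
# column drift that is common when consolidating several source systems.
(
    source_df.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .saveAsTable("consolidated.customers_raw")
)

# Downstream consumers (e.g., Power BI models) can then query the Delta table.
spark.sql("SELECT COUNT(*) AS row_count FROM consolidated.customers_raw").show()
```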