BS in Computer Science or a related field with 8+ years of professional experience.
Must have a strong background and competency in enterprise data lakes, data warehouses, data architecture, data management, and data pipelines operating in streaming, real-time, near-real-time, and batch modes.
Must have technical skills: Talend Cloud, HVR, Matillion, AWS S3, Snowflake, Salesforce, Kafka, Unix, Python, Java; XML, Avro, JSON, and Parquet file formats; RESTful and SOAP-based web services.
Experience with standard security practices for encrypting and decrypting files and PII fields.
Experience in product delivery using Agile delivery methods.
Experience using DevOps tools and agile development methodologies, including Test-Driven Development (TDD) and Continuous Integration/Continuous Delivery (CI/CD).
Good analytical and troubleshooting skills for effective problem solving.
Ability to work independently, handle multiple tasks, follow consistent processes, and maintain attention to detail.
Team player with strong analytical, verbal, and written communication skills.
Ability to work in a fast-paced, iterative development environment, adapt to changing business priorities, and thrive under pressure.
Excellent understanding of computer science fundamentals, including data structures, algorithms, object-oriented programming (OOP), and object-oriented analysis and design (OOA/D).
Proven ability to understand our business and to contribute to a technology direction that drives measurable business improvements.