* Understanding of Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Hive, Atlas, Ranger, Flume, Kafka, Oozie, NiFi, HBase, Solr, Avro).
* Deep knowledge of Extract, Transform, Load (ETL) processes and distributed processing techniques such as MapReduce
* Experience with columnar databases such as Snowflake and Redshift
* Experience building and deploying applications on AWS (EC2, S3, Glue, EMR, RDS, ELB, Lambda, etc.)
* Experience building production web services
* Experience with cloud computing and storage services
* Knowledge of the mortgage industry