Immediate need for two (2) Ab Initio Administrators/Engineers who can independently install and maintain Big Data (Cloudera, Hortonworks, etc.) clusters in highly available, load-balanced configurations across multiple environments (Production, User Acceptance, Performance, and Development). These positions are contract-to-permanent opportunities with our client, a major Federal healthcare organization, in Reston, VA (currently 100% remote, but a 2021 return to the Reston office is under consideration).
Under general supervision, manage Big Data administration activities, technical documentation, system performance support, and internal customer support. May provide input into the development of systems architecture for mission-critical corporate development projects. The selected candidate will work with Solutions Architects, Infrastructure Architects, the Lead Big Data Administrator, the Big Data Supplier, and Developers to set up environments and support the development teams. The candidate will support the implementation of a Data Integration/Data Warehouse solution for the Informatics Teams at our customer sites. Additional tasks include:
- Responsible for all tasks involved in administering the ETL tool (Ab Initio).
- Maintain access, licensing, and file systems on the ETL server.
- Provide guidance on the design and integration of ETL to ETL Developers.
- Manage the Metadata Hub and Operational Console and troubleshoot environmental issues affecting these components.
- Responsible for technical Metadata management.
- Work with the team to maintain data lineage and resolve data lineage issues.
- Design and develop automated ETL processes and architecture.
- Interact with the client on a daily basis to define the scope of different applications.
- Work on break-fix and continuous development items; review and inspect production changes.
- Perform code reviews of ETL code developed by the development team and guide developers in resolving any issues.
- Work with various other groups (DBAs, Server Engineering, Middleware, Citrix, Network, Data Transmission, etc.) to resolve performance- and integration-related issues.
- This position requires a BA/BS in Computer Science, Information Systems, Information Technology, or a related field with 6+ years of prior experience in software development, Data Warehousing, and Business Intelligence, OR equivalent experience.
- Must have an Ab Initio administration/engineering background to support infrastructure-related tasks/procedures
- Administrator experience working with batch processing and tools in the Hadoop technical stack (e.g., MapReduce, YARN, Pig, Hive, HDFS, Oozie)
- Administrator experience working with tools in the stream processing technical stack (e.g., Spark, Storm, Samza, Kafka, Avro)
- Must have experience with at least one of the following: HBase, Solr, Spark, or Kafka
- Administrator experience with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)
- Expert knowledge of AD/LDAP security integration with Big Data
- Hands-on experience with at least one major Hadoop distribution, such as Cloudera, Hortonworks, MapR, or IBM BigInsights
- Advanced experience with SQL and at least two major RDBMSs
- Advanced experience as a systems integrator with Linux systems and shell scripting
- Advanced experience with data-related benchmarking, performance analysis and tuning, and troubleshooting
- Excellent verbal and written communication skills
Monday, April 19, 2021
12-month T2P (temp-to-perm)
Friday, April 2, 2021
Know someone who would be a good fit? We pay for referrals!