Big Data Architect
The role of Big Data Architect is highly technical, requiring an extensive skill set and roughly 10–15 years' experience addressing big data problems. A Big Data Architect is accountable for the overall vision and design of a proposed big data solution and is needed in any organization looking to build a big data environment, whether on-premises or in the cloud. The architect serves as the link between the needs of the organization and the big data scientists and engineers. Duties include requirements analysis, platform selection, design of the technical architecture, application design and development, and testing and deployment of the proposed solution. The ideal candidate will have experience with major big data solutions, including Hadoop, MapReduce, Hive, HBase, MongoDB, and Cassandra. Experience with Impala, Oozie, Mahout, Flume, ZooKeeper, and/or Sqoop is also desirable.
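Since MapReduce is central to several of the tools named above, a minimal word-count sketch may help illustrate the paradigm. This is pure Python simulating the map, shuffle, and reduce phases in a single process; a real deployment would distribute these phases across a Hadoop cluster.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big plans", "data everywhere"]
counts = reduce_phase(shuffle(map_phase(lines)))
# counts -> {"big": 2, "data": 2, "plans": 1, "everywhere": 1}
```

The same mapper and reducer logic, written as standalone scripts reading stdin and writing stdout, could run unchanged under Hadoop Streaming; the framework then handles the shuffle step and the parallelism.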
Beyond a firm understanding of big data solutions, a Big Data Architect should be familiar with major programming and scripting languages such as Java, PHP, Ruby, Python, and/or R, along with Linux. Experience working with ETL tools such as Informatica, Talend, and/or Pentaho is also preferred. Experience with large cloud-computing infrastructure solutions such as Amazon Web Services, including Elastic MapReduce, is required. Technology priorities for this position include BI/analytics, cloud, mobile, digital marketing, infrastructure and data center, ERP, security, industry-specific applications, CRM, and data communications.
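The ETL (extract, transform, load) pattern the tools above implement can be shown in a few lines. This is a toy sketch in pure Python with invented sample data; a production pipeline built with Informatica, Talend, or Pentaho would add scheduling, error handling, and connectors to real sources and targets.

```python
import csv
import io

# Extract: read raw records from a source (an in-memory CSV here).
raw = "name,age\nAda,36\nGrace,45\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize casing and convert types.
transformed = [{"name": r["name"].upper(), "age": int(r["age"])} for r in rows]

# Load: write the cleaned records into a target store (a dict here).
loaded = {r["name"]: r["age"] for r in transformed}
# loaded -> {"ADA": 36, "GRACE": 45}
```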