We want you. Because you have talent and drive. Because you're resourceful and creative. Because you have ideas and enthusiasm. We want you because we believe in the power of one. And we've created a world where every individual understands how hard we work for them. What will your contribution be?

Verizon Enterprise Solutions' Big Data Group is looking for Big Data engineers with expert-level experience architecting and building our new Hadoop, NoSQL, and in-memory platforms and data collectors. You will be part of the team building one of the world's largest Big Data platforms, capable of ingesting hundreds of terabytes of data to be consumed for Business Analytics, Operational Analytics, Text Analytics, and Data Services, and of powering Big Data solutions for various Verizon business units.

With over 100 million wireless, wireline, and enterprise customers and a massive global network, data analytics is a critical component of our business. This is a unique opportunity to be part of building disruptive technology, where Big Data will be used as a platform to build solutions for various business units.

·       Architect and build the Big Data platform using the Hadoop ecosystem.

·       Evaluate, recommend, and build NoSQL, SQL, and in-memory platforms based on business needs.

·       Architect and build data collectors and a data fabric to collect and transport data to the Hadoop-based Big Data platform.

·       Build monitoring solutions that surface the health of the Big Data infrastructure.

·       Automate deployment and configuration management using open source frameworks.

·       Be the SME (Subject Matter Expert) for the Big Data platforms.

Desired Skills:

·       Bachelor's degree in Computer Science, Management Information Systems, or a related field.

·       8-10 years of experience building and managing complex products/solutions.

·       7+ years of experience working in Linux/Unix environments.

·       Expert-level experience architecting, building, and maintaining an enterprise-grade, petabyte-scale Hadoop store.

·       Expert-level understanding of Hadoop HDFS and MapReduce framework internals.

·       Experience with components of the Hadoop ecosystem (Hive, Pig, Impala, Ambari, Oozie, Sqoop, ZooKeeper, Mahout).

·       Experience architecting and building a data fabric with Flume, Scribe, or Kafka is a plus.

·       Experience with open source frameworks like Puppet or Chef for deployment and configuration management is a plus.

·       Experience building and managing NoSQL databases such as HBase or Cassandra.

·       Experience building and managing Big Data SQL databases such as Vertica, MemSQL, or VoltDB.

·       Experience with shell scripting, Perl, and Python.

·       Experience programming in Java is a plus.

·       Most importantly, be a good team player with a willingness to learn and implement new Big Data technologies as needed.


