

Cloud Software Engineer III (Java, Pig, MapReduce)

Position Description:

  • Develop and maintain assigned persona analytics, including implementation, documentation, and testing, for the Identity & Behavior Discovery team.
  • Analytics will be developed primarily, but not exclusively, in the customer cloud infrastructure.
  • The developer shall possess the skills needed to implement an end-to-end solution, including (but not limited to) accessing existing datasets, creating new datasets by ingesting data, performing analytic functions, and exposing analytic results to users. The data may be metadata or content.

 

Labor Requirements:

  • Shall have demonstrated work experience with serialization formats such as JSON and/or BSON (a brief serialization sketch follows this list).
  • Shall have demonstrated work experience developing RESTful services, and with the Ruby on Rails framework, the LDAP protocol, configuration management, and cluster performance management (e.g., Nagios).
  • Shall have demonstrated work experience in the design and development of at least one Object Oriented System.
  • Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
  • Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
  • In addition, the candidate shall have demonstrated experience, through work or college-level courses, in at least two (2) of the desired characteristics.
  • Shall have demonstrated work experience with source code management tools (e.g., Git, Stash, or Subversion).
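
As an illustrative sketch only, and assuming a common Java JSON library such as Jackson (the posting does not name one), the serialization experience referenced above amounts to round-tripping objects like the hypothetical record below:

    import com.fasterxml.jackson.databind.ObjectMapper;

    // Hypothetical record used only to illustrate JSON serialization/deserialization.
    public class PersonaRecord {
        public String personaId;
        public long lastSeenEpochMillis;

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();

            PersonaRecord record = new PersonaRecord();
            record.personaId = "persona-123";
            record.lastSeenEpochMillis = System.currentTimeMillis();

            String json = mapper.writeValueAsString(record);                   // object -> JSON text
            PersonaRecord copy = mapper.readValue(json, PersonaRecord.class);  // JSON text -> object

            System.out.println(json + " / " + copy.personaId);
        }
    }

BSON follows the same pattern, with a BSON-capable mapper in place of the JSON one.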

 

Technical Skills Required:

  • Experience with Java/Pig MapReduce is required (a minimal mapper sketch follows this list)
  • Knowledge of Graph Theory, to include Social Network Analysis, is desired
  • Experience developing software in a Linux environment
  • Experience with Jira, Maven and Git
  • Experience with network technologies and protocols
  • Familiarity with customer corporate tools such as DX, TargetProfiler, GeoXplorer, LinkXplorer, Renoir, etc.
  • Ability to work in a team environment
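
For orientation only, the Java/Pig MapReduce requirement above typically means writing map and reduce classes against the Hadoop API. The sketch below is a minimal, assumed example (class name and tokenization are illustrative, not a prescribed implementation):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Minimal mapper: emits (token, 1) for every whitespace-delimited token in each input line.
    public class TokenCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

A matching reducer would sum the counts per token; the equivalent Pig Latin is a LOAD, GROUP, and COUNT over the same data.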

 

Special Technical Skills Desired:

  • Experience with visualization frameworks such as JavaScript/Angular and D3 is desired
  • Experience with Python is desired
  • Experience deploying applications in a cloud environment.
  • Understanding of Cloud Scalability.
  • Hadoop/Cloud Certification.
  • Experience designing and developing automated analytic software, techniques, and algorithms.
  • Experience developing and deploying analytics that include foreign language processing; analytic processes that incorporate/integrate multi-media technologies; analytics that function on massive data sets, for example more than a billion rows or larger than 10 petabytes; analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets; analytics that identify latent patterns between elements of massive data sets; and analytics that employ algorithms and techniques commonly associated with Artificial Intelligence.
  • Experience with data formats/techniques such as XML (Schema, XSL/T, XQuery), streaming parsers (StAX or SAX, DOM), protobuf, or Avro (a brief StAX sketch follows this list)
  • Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
  • Experience developing and deploying: data driven analytics, event driven analytics, sets of analytics orchestrated through rules engines.
  • Experience with linguistics (grammar, morphology, concepts).
  • Experience developing and deploying analytics
  • Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces and other technical specifications.
  • Experience developing and deploying analytics within a heterogeneous schema environment.
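
As a minimal sketch of the streaming-parser item above (the file path and element handling are hypothetical), a StAX pull parser in Java walks an XML document without loading it all into memory:

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    // Prints the local name of every start element in the given XML file.
    public class StaxSketch {
        public static void main(String[] args) throws Exception {
            XMLInputFactory factory = XMLInputFactory.newInstance();
            try (FileInputStream in = new FileInputStream(args[0])) {
                XMLStreamReader reader = factory.createXMLStreamReader(in);
                while (reader.hasNext()) {
                    if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                        System.out.println(reader.getLocalName());
                    }
                }
                reader.close();
            }
        }
    }

SAX is the push-style equivalent; protobuf and Avro swap the XML reader for schema-driven record readers.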

 

Minimum Experience Required:

  • At least eight (8) years of general experience in software development/engineering, including requirements analysis, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. At least five (5) years of experience in software-intensive projects and programs for government or industry customers.
  • At least six (6) years of experience developing software with high level languages (such as Java, C, C++), and at least three (3) years developing software in UNIX/Linux (RedHat versions 3-5+) and software integration and testing (to include developing and implementing test plans and scripts).
  • At least four (4) years of experience with a distributed, scalable Big Data store (NoSQL) such as HBase, CloudBase/Accumulo, or Big Table, as well as the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, and Pig (a brief NoSQL write sketch follows this list).
  • Shall have demonstrated work experience with 1) serialization formats such as JSON and/or BSON, 2) developing RESTful services, and 3) using source code management tools
  • A bachelor’s degree in computer science, engineering, mathematics or a related discipline may be substituted for 4 years of general experience.
  • TS/SCI with Polygraph Required
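
Purely as an illustration of the NoSQL experience item above (the table, column family, and qualifier names are hypothetical, and Accumulo or Big Table would use their own client APIs), writing a single cell with the HBase Java client looks roughly like this:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    // Writes one cell (row "persona-123", column cf:value) into a table named "events".
    public class HBasePutSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("events"))) {
                Put put = new Put(Bytes.toBytes("persona-123"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("value"), Bytes.toBytes("hello"));
                table.put(put);
            }
        }
    }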

 
