©2021 ProObject
Apprentice Cloud Software Engineer (Stash, Ruby, Design)

Position Description:

  • Develop and maintain assigned persona analytic tasks requiring implementation, documentation, and testing for the Finance Intelligence and High Interest Cargo team to support Threat Finance Analytic Development. Analytics will be developed primarily, but not exclusively, in the customer cloud infrastructure. The developer shall possess the skills required to implement an end-to-end solution, including (but not limited to) accessing existing datasets, creating new datasets by ingesting data, performing analytic functions, and exposing analytic results to users. The data could be metadata or content.
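As an illustration only, the end-to-end flow described above (ingest data, run an analytic, expose results) might be sketched in plain Ruby; all record names and values here are invented, not customer data:

```ruby
require 'json'

# Illustrative sketch of an end-to-end analytic: ingest, analyze, expose.
# The records below are invented for demonstration purposes.
raw = ['{"entity":"A","amount":100}',
       '{"entity":"A","amount":50}',
       '{"entity":"B","amount":75}']

# Ingest: parse each source line into a record (Hash)
dataset = raw.map { |line| JSON.parse(line) }

# Analytic: total amount per entity
totals = dataset.group_by { |r| r["entity"] }
                .transform_values { |rs| rs.sum { |r| r["amount"] } }

# Expose: serialize the analytic result for users
puts JSON.generate(totals)   # => {"A":150,"B":75}
```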

Labor Requirements:

  • Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
  • Shall have demonstrated work experience developing RESTful services, and with the Ruby on Rails framework, the LDAP protocol, configuration management, and cluster performance management (e.g., Nagios).
  • Shall have demonstrated work experience in the design and development of at least one object-oriented system.
  • Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
  • Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
  • In addition, the candidate shall have demonstrated experience, through work or college-level courses, in at least two of the desired characteristics.
  • Shall have demonstrated work experience with source code management tools (e.g., Git, Stash, or Subversion).
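For reference, the serialization requirement amounts to round-tripping records through a format such as JSON; a minimal Ruby example (the record shown is invented, and BSON would follow the same pattern via the bson gem):

```ruby
require 'json'

# Minimal JSON serialization round trip; the record is illustrative.
record  = { id: 42, currency: "USD", amount: 1250.75 }
payload = JSON.generate(record)   # serialize Hash -> JSON string
parsed  = JSON.parse(payload)     # deserialize JSON string -> Hash

parsed["amount"]   # keys come back as strings after the round trip
```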

Special Technical Skills Desired:

  • 2 years of experience writing cloud analytics (Hadoop, MapReduce, Accumulo) in Pig or Java
  • Experience using Java, Python, Maven, Git, Jira, and Confluence
  • Prior experience developing analytics utilizing MWS and SOAP is highly preferred
  • Experience with UI visualization technologies, including JavaScript/Angular, Node.js, Elasticsearch, and D3, is desired
  • Familiarity with customer corporate tools such as DX, TargetProfiler, GeoXplorer, LinkXplorer, Renoir, etc.
  • Ability to work in a team environment
  • Experience deploying applications in a cloud environment.
  • Understanding of Cloud Scalability.
  • Hadoop/Cloud Certification.
  • Experience designing and developing automated analytic software, techniques, and algorithms.
  • Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
  • Experience developing and deploying: data driven analytics, event driven analytics, sets of analytics orchestrated through rules engines.
  • Experience with linguistics (grammar, morphology, concepts).
  • Experience developing and deploying analytics that discover social networks.
  • Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces and other technical specifications.
  • Experience developing and deploying analytics within a heterogeneous schema environment.
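The MapReduce programming model referenced above can be sketched as a toy word count in plain Ruby (no Hadoop cluster involved; the input documents are invented):

```ruby
# Toy illustration of the MapReduce model: map, shuffle, reduce.
docs = ["cargo manifest data", "cargo data"]

# Map phase: emit (word, 1) pairs from each document
pairs = docs.flat_map { |d| d.split.map { |w| [w, 1] } }

# Shuffle phase: group the pairs by key (word)
grouped = pairs.group_by(&:first)

# Reduce phase: sum the counts for each key
counts = grouped.transform_values { |ps| ps.sum(&:last) }

counts   # => {"cargo"=>2, "manifest"=>1, "data"=>2}
```

In a real Hadoop job the map and reduce steps would run as distributed tasks over HDFS data, but the per-phase logic is the same shape.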

Minimum Experience Required:

  • At least five (5) years of general experience in software development/engineering, including requirements analysis, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
  • At least three (3) years of experience developing software in high-level languages (such as Java, C, or C++), developing software in UNIX/Linux (Red Hat versions 3-5+), and performing software integration and testing (including developing and implementing test plans and scripts).
  • At least two (2) years of experience with distributed, scalable big data stores (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc., as well as the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, and Pig.
  • Shall have demonstrated work experience with (1) serialization formats such as JSON and/or BSON, (2) developing RESTful services, and (3) using source code management tools.
  • Shall have at least 3 years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
  • A bachelor’s degree in computer science, engineering, mathematics, or a related discipline may be substituted for four (4) years of general experience.
  • TS/SCI with Polygraph Required