Mid-Level Cloud Software Engineer
- A software developer with an analyst mindset is needed to develop capabilities that help solve common problems arising from various analyst offices.
- The primary focus will be prototype development in analytics and streaming analytics, with an emphasis on message characterization and communications analysis of SMS and IoT traffic.
- The individual may also work directly with customers to gather requirements and feedback.
Technical Skills Required:
- Shall have demonstrated work experience with 1) serialization formats such as JSON and/or BSON, 2) developing RESTful services, and 3) using source code management tools.
- Shall have demonstrated work experience with developing RESTful services, the Ruby on Rails framework, LDAP protocol configuration management, and cluster performance management (e.g., Nagios).
- Shall have at least 3 years of experience developing software in UNIX/Linux (Red Hat versions 3-5+) operating systems.
- Shall have demonstrated work experience in the design and development of at least one Object Oriented System.
- Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
- Shall have at least 3 years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
- Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
- In addition, the candidate will have demonstrated experience, through work or college-level courses, in at least 2 of the desired characteristics.
- Shall have demonstrated work experience with source code management tools (e.g., Git, Stash, or Subversion).
Special Technical Skills Desired:
- Experience with Java and Pig
- Experience with DataXplorer and QTA
- Knowledge of network intrusion/network packet analysis, malware and computer forensics
- Experience with IBM InfoSphere Streams
- User Interface development experience with Node.js and React
- Experience with Python and Apache Spark
- Good interpersonal skills
- Experience deploying applications in a cloud environment.
- Understanding of Cloud Scalability.
- Hadoop/Cloud Certification.
- Experience designing and developing automated analytic software, techniques, and algorithms.
- Experience developing and deploying analytics, including:
  - analytics that include foreign language processing;
  - analytic processes that incorporate/integrate multi-media technologies, including speech and text;
  - analytics that function on massive data sets (for example, more than a billion rows or larger than 10 petabytes);
  - analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets;
  - analytics that identify latent patterns between elements of massive data sets (for example, more than a billion rows or larger than 10 petabytes);
  - analytics that employ techniques commonly associated with Artificial Intelligence.
- Experience with data formats/techniques such as XML (Schema, XSL/T, XQuery), streaming parsers (StAX, SAX, DOM), protobuf, or Avro.
- Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
- Experience developing and deploying data-driven analytics, event-driven analytics, and sets of analytics orchestrated through rules engines.
- Experience with linguistics (grammar, morphology, concepts).
- Experience developing and deploying analytics that discover social networks.
- Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application programming interfaces (APIs), and other technical specifications.
- Experience developing and deploying analytics within a heterogeneous schema environment.
Minimum Experience Required:
- A bachelor’s degree in computer science, engineering, mathematics or a related discipline may be substituted for 4 years of general experience.
- The candidate must have:
- At least eight (8) years of general experience in software development/engineering, including requirements analysis, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
- At least five (5) years of experience in software-intensive projects and programs for government or industry customers.
- At least six (6) years of experience developing software with high-level languages (such as Java, C, or C++), and at least three (3) years developing software in UNIX/Linux (Red Hat versions 3-5+), including software integration and testing (to include developing and implementing test plans and scripts).
- At least four (4) years of experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc., as well as the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, Pig, etc.
- TS/SCI with Polygraph Required