In addition to meeting the labor category requirements for which the candidate is submitted, candidates are also required to meet the following:
• Strong domain knowledge of network metadata, network protocols, and network analysis.
• Analytic development experience using scripting languages such as Python and Scala to apply statistical libraries to data.
• Skilled with big data processing frameworks such as Pig, MapReduce, and Spark to scale algorithms over large volumes of data.
• Desire to work with network metadata, including data generated by deep packet inspection, alerting data, and NetFlow, to generate insights about network behavior.
• Experience employing a combination (two or more) of analysis, computer science, mathematics, and software engineering skills to devise strategies for extracting meaning and value from large datasets.
• Experience with statistics, analytics, and data mining.
• Experience with, or knowledge of, data generated by Bro, Snort, Suricata, and NetFlow sensors.
• Experience working with notebook-style data analysis tools to present analytic workflows.
• Skilled at working with development and SE teams, other stakeholder agencies, and leadership.
• Knowledge of data indexing and storage methodologies for analytic outputs.
• Strong communication and presentation skills.
• Experience working with cloud service providers and data stewards.
• Demonstrated experience working with corporate query and visualization tools.
• Experience with machine learning.
• Devise strategies to extract meaning and value from structured and unstructured data.
• Leverage statistical methods and/or machine learning to discover patterns and behaviors of entities.
• Use query and visualization tools, such as DataXplorer and GMAE, to present question-focused datasets in a manner that tells a story with the data.
• Work with network metadata to develop new methodologies and techniques for detection of abnormal behavior.
• Analyze and develop requirements to support the characterization and ingestion of new and existing data types.
• Collaborate with customer teams to understand direct mission needs and requirements.