Atlanta, GA
Overall Purpose:

The candidate will be responsible for requirements gathering and analysis, design, development, testing and debugging of analytical, statistical/mathematical programming, optimization, decision support and network function virtualization systems. Additionally, the candidate will work with large amounts of data from diverse, complex and frequently unrelated sources, and will perform data collection, data analysis/mining, data wrangling/cleansing, data integration and occasional mathematical/data modeling with little or no guidance.

Role and Responsibilities:

1) Requirements gathering and analysis; architect solution with system design, system engineering and appropriate process flow and implement with Agile methodology.
2) Data acquisition, data ETL, data cleansing, data mining and data integration, with data quality controls, process management and statistical analysis techniques.
3) Implement data models and statistical/mathematical, linear or optimization programming models.
4) Development of high-performance, distributed computing tasks using Big Data technologies such as Hadoop, NoSQL and other data mining/management techniques in distributed environments.
5) Design and develop software (web based or otherwise) for decision support systems, optimization systems, planning systems or other data mining/modeling applications for mobility, network engineering or enterprise solutions.
6) Develop web applications with multiple levels/dimensions of security, inside/outside firewalls, eCommerce-like, with encryption or equivalent technologies.
7) Write code, complete programming and documentation, and perform testing and debugging of applications using programming languages and technologies in Unix/Linux, VM and web environments.
8) Set up and administer document/file transfer/repository systems such as FTP, SSH, SharePoint drives or other similar technologies.


Required Skills:
  • AngularJS
  • Architecting applications that handle Big Data
  • Big Data - HBase
  • Big Data - Hive
  • Big Data - Sqoop / Spark / Pig
  • Collaborative personality - engages in interactive discussions
  • Communication Skills - both oral and written
  • Demonstrated ability to understand, analyze, synthesize and integrate new complex data sets
  • Development experience with Big Data Technologies - Hadoop/MapReduce/Hive
  • Experience developing applications with interactive geographical interfaces
  • Experience with development in JavaScript, Ajax, Spring, Hibernate, D3
  • Experience with development in Core Java/J2EE, JSP/Servlets, HTML/DHTML
  • Experience with development of Web Services/APIs using Eclipse/Apache/Tomcat
  • Experience with statistical, mathematical/linear programming software - SAS/R/AMPL/CPLEX
  • Fast learner
  • Hands-on experience with relational databases - Oracle, SQL/PL-SQL, etc.
  • Laboratory testing of large-scale UNIX or Linux network management systems
  • Problem solving skills - data quality controls, system design, data analysis and data modeling
  • Self-motivated
  • Teamwork 
  • Development in virtualized environments with Java, C/C++, Perl, Shell/AWK or Python
  • Work independently