Salahuddin Aziz
Senior Data Mining Analyst

Background

Mr. Aziz works as a Senior Java Developer at Dell Inc. and part-time for TASS as a Senior Data Mining Analyst. He is based in Texas. He is also a Ph.D. candidate in the Computer Science Department at Wayne State University, MI, where his Ph.D. research focuses on Data Mining.

Education:

  • Ph.D. Candidate, Computer Science (Data Mining), Wayne State University, MI
  • MS in Computer Science, Wayne State University, MI
  • BS in Computer Science, Bangladesh University of Engineering and Technology

Software Development Focus:

  • Proficient in Java, JSP, JMS, EJB (Session Beans, MDB, JPA), and JAXB, and in frameworks such as Struts, Spring, and Hibernate for implementing Java EE applications.
  • Especially proficient in creating and consuming RESTful web services with JSON input/output using the Apache Wink APIs.
  • Used Apache Tomcat, WebLogic, JBoss, and WebSphere, and successfully built and deployed many applications on them.
  • Experience in front-end development using Struts, Spring, JSP, JSF, JavaScript, HTML, DHTML, and CSS.
  • Good knowledge of and experience with the SDLC, especially agile methodologies (Extreme Programming). Worked in test-driven development under Scrum with the help of the Agilo tool.
  • Experience in unit testing using JUnit and QUnit (for JavaScript).
  • Experience in object-oriented analysis and design using Rational Rose and UML Lab, with hands-on experience preparing use case, class, and sequence diagrams to model a system.
  • Experience with design patterns including Model-View-Controller, Session Façade, Factory, Data Access Object, Composite View, Singleton, Filter, and Decorator.
  • Extensive experience with the Eclipse and NetBeans IDEs.
  • In-depth knowledge of XML technologies, including XSL, XSLT, XSL-FO, XPath, and the DOM and SAX parsers.
  • Good exposure to creating and publishing web services as well as consuming them; in-depth knowledge of SOAP, WSDL, and UDDI.
  • Knowledge of cloud computing through the use of Apache Hadoop for distributed computing to implement and compare the performance of data mining algorithms.
  • Used Hibernate for object-relational mapping.
  • Practical knowledge of using Apache Mahout to implement large-scale data mining algorithms.
  • Experience with version control systems such as SVN and CVS.
  • Experience using JAAS, JavaMail, and the Apache Mahout API.
  • Development experience in both Windows and Unix environments.
  • Good interpersonal skills; committed, result-oriented, and hard-working, with a zeal for learning new technologies and adapting to new environments.
  • Good background in data mining algorithms for classification, clustering, and collaborative filtering.