Raphael do Vale Amaral Gomes

Rio de Janeiro, Brazil
Member since April 6, 2016
Raphael has over 10 years of experience developing web apps, including coding, project planning, and specification. He has worked on several Java and JavaScript projects during his career. His technology stack usually involves Java, Spring, relational databases, and JavaScript technologies such as jQuery and Knockout. He holds a PhD in the Semantic Web area and won awards during his studies. He has worked with description logic, Prolog, RDF, and OWL.
Experience
  • SQL, 12 years
  • JavaScript, 10 years
  • Java, 8 years
  • Knockout.js, 3 years
  • Spring, 3 years
  • jQuery, 8 years
  • SQL Server, 12 years
  • Web Crawlers, 6 years
Location
Rio de Janeiro, Brazil
Availability
Part-time
Preferred Environment
Git, Eclipse or IntelliJ IDEA, MS SQL Server
The most amazing...
...work I've done was on my PhD project. I created a distributed Semantic Web crawler that was able to find and read all the related terms in the Semantic Web (Linked Data) cloud.
Employment
  • Tech Lead
    Quantum
    2014 - PRESENT
    • Handled the transformation of a legacy system from a monolith to a more modular architecture of small services.
    • Used messaging (JMS with ActiveMQ) for service communication, handling service discovery and load balancing.
    • Improved the UI experience, moving from a "table-based" layout to a CSS3 + HTML5 layout.
    • Introduced BDD (Behavior-Driven Development) using Serenity and JBehave. Integration tests for our web application use Selenium + JBehave, while our internal systems use JBehave + JUnit along with some internal solutions.
    • Handled product team meetings and worked together with a requirements team to split our work into sprints using the Scrum methodology.
    • Worked with a QA team, joining forces to improve automated tests.
    • Gained experience with SQL Server, working with a database consultant to handle stability.
    • Created a tool that lets our clients connect to our systems through Excel functions and SOAP. The tool maintains a task queue for each user so that no single user can consume all of our CPU power (a simplified sketch of the idea appears at the end of this entry).
    • Created a back-end + front-end framework for handling contextual user navigation throughout our legacy web system. It shows different information depending on what is selected and on which screen the user is. The solution was challenging due to the legacy nature of the application, but we built it in a way that is now simple to extend and improve.
    Technologies: Java, JMS, ActiveMQ, Realtime Applications, J2EE, Spring, Hibernate, JPA, JavaScript, Knockout, .NET, SQL Server, TDD, BDD, Continuous Integration, Maven, Jenkins, Mercurial, SOAP, REST Services, Scrum, New Relic
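    A minimal sketch of the per-user task queue idea mentioned above (not the actual production code; class and method names are illustrative): each user gets a dedicated single-threaded queue, so one heavy user cannot starve the others.

      import java.util.Map;
      import java.util.concurrent.*;

      // Sketch of a per-user task queue: tasks of the same user run sequentially
      // on a dedicated single-threaded executor, so no single user can occupy
      // all of the CPU at once. Names here are illustrative only.
      public class PerUserTaskQueue {

          private final Map<String, ExecutorService> queues = new ConcurrentHashMap<>();

          // Submit a task on behalf of a user; tasks of the same user are serialized,
          // while tasks of different users run independently of each other.
          public <T> Future<T> submitForUser(String userId, Callable<T> task) {
              ExecutorService queue = queues.computeIfAbsent(
                      userId, id -> Executors.newSingleThreadExecutor());
              return queue.submit(task);
          }

          // Shut down all per-user queues, e.g., on application shutdown.
          public void shutdown() {
              queues.values().forEach(ExecutorService::shutdown);
          }
      }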
  • Professor
    CCE PUC-Rio Centro
    2012 - PRESENT
    • Worked as a professor of back-end development using servlets, JDBC, and other low-level technologies.
    • Taught databases using Oracle SQL Developer.
    • Taught front-end development; mainly CSS, HTML5, JavaScript, jQuery and PhoneGap for mobile development.
    Technologies: Java, Oracle, JavaScript, jQuery, PhoneGap, CSS, HTML5
  • Software Engineer
    Instituto Tecgraf - PUC-Rio
    2013 - 2014
    • Supported a long-term emergency system for Petrobras alongside a large number of developers, managers, and QA staff.
    • Used JavaServer Faces with PrimeFaces and some internal technologies to build the application.
    • Contributed to an internal schemaless database based on description logic.
    • Participated in management meetings and proposed solutions for handling the products.
    • Used Knockout + PrimeFaces to build the front-end.
    Technologies: Java, J2EE, JavaServer Faces, Hibernate, Maven, JUnit, Selenium, Subversion, JavaScript, jQuery, SQL Server, Jira, Scrum
  • Senior Software Engineer
    Minds at Work
    2010 - 2013
    • Created a Single Page Application (SPA) for portfolio management. The tool was built with JavaScript (jQuery), Java (Struts), and Microsoft SQL Server. The client's users could run the application on a tablet, computer, or TV, or projected and driven with a Wii Remote controller. The system shows a world map, and the user can drag and drop assets onto different countries and continents. As the user changes the portfolio, the system calculates, in JavaScript, its profit and risk at the country, continent, and world levels (a simplified aggregation sketch appears at the end of this entry). The Java part was built to access and cache data provided by Bloomberg and other vendors. The historical data was stored in SQL Server, while clients' portfolios were kept in XML files.
    • Promoted to project coordinator for the client, leading several other Single Page Applications for the same client, such as a due diligence system, a qualitative analysis system, a risk workflow system, and others.
    • Defined a "JavaScript guideline" for developers in the company. Based on our previous experience, we defined a guide for developers on the team to maintain the same code style and to avoid bugs.
    • As we grew, the client cancelled contracts with other software providers and gave the projects to us. We inherited a lot of bad code and improved it using software engineering patterns and extensive refactoring. Within one month the system was operational, and over the years we kept improving the products.
    • Helped to create the first QA team of the company. The team was responsible for creating new Selenium integration tests for new products and different clients.
    Technologies: .NET, Java, Hibernate, Struts, JavaScript, jQuery, Ajax, Touch User Interfaces, SQL, HTML5, CSS, Eclipse, SQL Server, WebSphere Application Server
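    The profit and risk roll-up described above can be pictured with a small sketch (a hypothetical simplification, not the client's code; the original ran in JavaScript in the browser): positions are summed by country and then aggregated up to continent and world totals.

      import java.util.Arrays;
      import java.util.HashMap;
      import java.util.List;
      import java.util.Map;

      // Hypothetical simplification of the portfolio roll-up: sum asset values by
      // country, then aggregate country totals into continent and world totals.
      public class PortfolioRollup {

          static class Position {
              final String country;
              final String continent;
              final double value;

              Position(String country, String continent, double value) {
                  this.country = country;
                  this.continent = continent;
                  this.value = value;
              }
          }

          public static Map<String, Double> totals(List<Position> positions) {
              Map<String, Double> totals = new HashMap<>();
              for (Position p : positions) {
                  totals.merge(p.country, p.value, Double::sum);    // country level
                  totals.merge(p.continent, p.value, Double::sum);  // continent level
                  totals.merge("World", p.value, Double::sum);      // world level
              }
              return totals;
          }

          public static void main(String[] args) {
              List<Position> portfolio = Arrays.asList(
                      new Position("Brazil", "South America", 100.0),
                      new Position("Germany", "Europe", 250.0));
              totals(portfolio).forEach((k, v) -> System.out.println(k + ": " + v));
          }
      }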
  • Project Coordinator
    OIP - PUC-Rio
    2004 - 2010
    • As a trainee, maintained a Visual Basic for Applications (VBA) solution for the Hospital ProCrianca.
    • Performed data mining on data from Brazil's Ministry of Education.
    • Created, modeled, and estimated projects for commerce software written in PHP and JavaScript.
    • Developed a workflow tool in Java for the Legislative Assembly of São Paulo.
    • Worked as a project coordinator on legacy software for the Oswaldo Cruz Foundation (in Portuguese, Fundação Oswaldo Cruz, aka FIOCRUZ).
    Technologies: Java, PHP, SQL, JavaScript, jQuery, AJAX, HTML, Eclipse, MySQL, SQL Server, Oracle
Experience
  • Best Paper Award (Other amazing things)
    http://iceis.org/PreviousAwards.aspx

    Won the Best Paper Award in the area of "Software Agents and Internet Computing" for the paper entitled "A Metadata Focused Crawler for Linked Data", received at the 16th International Conference on Enterprise Information Systems (ICEIS 2014).

  • Paper - A Metadata Focused Crawler for Linked Data (Other amazing things)
    http://www.inf.puc-rio.br/~casanova/Publications/Papers/2014-Papers/2014-ICEIS-Crawler.pdf

    The Linked Data best practices recommend that publishers of triplesets use well-known ontologies in the triplification process and link their triplesets with other triplesets.

    However, despite the fact that extensive lists of open ontologies and triplesets are available, most publishers typically do not adopt those ontologies and link their triplesets only with popular ones, such as DBpedia and GeoNames.

    This paper presents a metadata crawler for Linked Data to assist publishers in the triplification and linkage processes. The crawler provides publishers with a list of the most suitable ontologies and vocabulary terms for triplification, as well as a list of triplesets that the new tripleset can most likely be linked with. The crawler focuses on specific metadata properties, including subClassOf, and returns only metadata, hence the classification "metadata focused crawler".
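
    As a rough illustration of the "metadata focused" idea (not the crawler from the paper; the URL and class names below are only examples), the following Apache Jena snippet reads an RDF document and keeps only rdfs:subClassOf statements, discarding instance data:

      import org.apache.jena.rdf.model.*;
      import org.apache.jena.vocabulary.RDFS;

      import java.util.ArrayList;
      import java.util.List;

      // Rough sketch of a "metadata focused" step: read an RDF document and
      // keep only rdfs:subClassOf statements (metadata), ignoring instance data.
      public class MetadataFocusedStep {

          public static List<Statement> subclassMetadata(String url) {
              Model model = ModelFactory.createDefaultModel();
              model.read(url);  // fetch and parse the RDF document

              List<Statement> metadata = new ArrayList<>();
              StmtIterator it = model.listStatements(null, RDFS.subClassOf, (RDFNode) null);
              while (it.hasNext()) {
                  metadata.add(it.next());  // collect only the metadata triples
              }
              return metadata;
          }

          public static void main(String[] args) {
              // Example vocabulary URL; any dereferenceable RDF document would do.
              subclassMetadata("http://xmlns.com/foaf/0.1/").forEach(System.out::println);
          }
      }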

  • Book Chapter - CRAWLER-LD: A Multilevel Metadata Focused Crawler Framework for Linked Data (Other amazing things)
    http://www.inf.puc-rio.br/~casanova/Publications/Papers/2015-Papers/2015-LNBIP-Gomes.pdf

    The Linked Data best practices recommend publishing a new tripleset using well-known ontologies and interlinking the new tripleset with other triplesets. However, both are difficult tasks.

    This paper describes CRAWLER-LD, a metadata crawler that helps select the ontologies and triplesets to be used, respectively, in the publication and interlinking processes. The publisher of the new tripleset first selects a set T of terms that describe the application domain of interest. He then submits T to CRAWLER-LD, which searches for triplesets whose vocabularies include terms directly or transitively related to those in T. CRAWLER-LD then returns a list of ontologies to be used for publishing the new tripleset, as well as a list of triplesets that the new tripleset can be interlinked with. CRAWLER-LD focuses on specific metadata properties, including subClassOf, and returns only metadata, hence the classification "metadata focused crawler".
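
    The "directly or transitively related" step can be pictured as a simple breadth-first expansion over a metadata relation such as subClassOf. This is only a schematic sketch, not CRAWLER-LD itself, and the relatedTerms lookup function is a placeholder:

      import java.util.*;
      import java.util.function.Function;

      // Schematic sketch of the transitive expansion of the initial term set T:
      // starting from the publisher's terms, repeatedly follow a metadata relation
      // (e.g., subClassOf) until no new terms are found. The relatedTerms function
      // is a placeholder for the actual metadata lookup performed by the crawler.
      public class TermExpansion {

          public static Set<String> expand(Set<String> initialTerms,
                                           Function<String, Set<String>> relatedTerms) {
              Set<String> visited = new HashSet<>(initialTerms);
              Deque<String> frontier = new ArrayDeque<>(initialTerms);

              while (!frontier.isEmpty()) {
                  String term = frontier.poll();
                  for (String related : relatedTerms.apply(term)) {
                      if (visited.add(related)) {  // only enqueue unseen terms
                          frontier.add(related);
                      }
                  }
              }
              return visited;  // T plus all directly or transitively related terms
          }
      }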

  • PhD Thesis - Crawler Frameworks for Linked Data (Other amazing things)
    http://www.inf.puc-rio.br/~casanova/Publications/Dissertations-Theses/2015-Raphael.pdf

    The Linked Data best practices recommend publishing a new tripleset using well-known ontologies and interlinking the new tripleset with other triplesets. However, both are difficult tasks.

    This thesis describes frameworks for metadata crawlers that help select the ontologies and triplesets to be used, respectively, in the publication and interlinking processes.

    Briefly, the publisher of a new tripleset first selects a set of terms that describe the application domain of interest. He then submits the set of terms to a metadata crawler, constructed using one of the frameworks described in the thesis, which searches for triplesets whose vocabularies include terms directly or transitively related to those in the initial set. The crawler returns a list of ontologies to be used for publishing the new tripleset, as well as a list of triplesets that the new tripleset can be interlinked with. Hence, the crawler focuses on specific metadata properties, including subClassOf, and returns only metadata, which justifies the classification "metadata focused crawler".

Skills
  • Languages
    JavaScript, Java, Java 8.0, OWL, RDF, SQL, PHP, C#, Prolog
  • Frameworks
    Spring, Jena Semantic Web Framework, JUnit, Akka, Knockout.js, Google Guava, Apache Struts, Bootstrap, Play Framework, JBehave, Apache Velocity, .NET
  • Libraries/APIs
    jQuery, Quartz, JMS, DWR, Amazon API
  • Tools
    Eclipse, Apache Tomcat, IntelliJ IDEA, SVN, Mercurial, Visual Studio, Netbeans, Git, Atom, CVS
  • Paradigms
    Test-driven Development (TDD), Concurrent Programming, Agile Software Development, Object-oriented Programming (OOP), Object-oriented Design (OOD), AJAX, REST, Functional programming, Distributed Programming, Scrum, Behavior-driven Development (BDD), Single-page Application Development
  • Platforms
    J2EE, Windows, Linux, Azure, WebSphere, Amazon Web Services (AWS)
  • Storage
    SQL Server, PostgreSQL, Oracle, MySQL
  • Misc
    Servlets, Lambda expressions, Linked Data, Single-page application, JSP, Web Crawlers, Apache HTTP Server, Actor Model, SOAP, ActiveMQ, Data Mining, Serenity
Education
  • PhD in Computer Science (Semantic Web)
    Pontifícia Universidade Católica do Rio de Janeiro - Rio de Janeiro, Brazil
    2010 - 2015
  • Master's degree in Computer Science (Semantic Web)
    Pontifícia Universidade Católica do Rio de Janeiro - Rio de Janeiro, Brazil
    2008 - 2010
  • Bachelor's degree in Computer Science, Information Systems, Databases
    Pontifícia Universidade Católica do Rio de Janeiro - Rio de Janeiro, Brazil
    2002 - 2006