resultscrawler

Published: Mar 21, 2015 License: MIT

UQAM Resultats Crawler


This application contains two executables, a crawler to fetch data from the UQAM website and a webserver to access the data at any time.

There is also a web client, built using AngularJS. iOS and Android apps are in development as well.

The project is currently running at https://results.jdupserver.com.

Prerequisites

Mandatory:

Optional:

  • An SMTP email server

Installation

The recommended way to get the code is through the go get command.

    go get github.com/janicduplessis/resultscrawler
  1. Navigate to the root folder and install the Go dependencies. You may need to install Mercurial and Bazaar to be able to download the dependencies.

    cd $GOPATH/src/github.com/janicduplessis/resultscrawler
    go install ./...
    
  2. If you don't already have bower installed globally, install it.

    npm install bower -g
    
  3. Install the webserver libraries using bower.

    cd webserver
    bower install
    
  4. Create config files.

    4.1 Crawler

    From the project root:

          cd crawler
          cp template.config.json crawler.config.json
    

    Edit the crawler.config.json file to reflect your server configuration.

    4.2 Webserver

    From the project root:

          cd webserver
          cp template.config.json webserver.config.json
    

    Edit the webserver.config.json file to reflect your server configuration.
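For reference, here is a rough sketch of what such a config file might look like. Every key below is hypothetical — the authoritative list of fields is whatever appears in each template.config.json; the SMTP section corresponds to the optional email server mentioned under Prerequisites.

```json
{
  "database": {
    "host": "localhost",
    "port": 27017,
    "name": "resultscrawler"
  },
  "smtp": {
    "host": "smtp.example.com",
    "port": 587,
    "user": "notifier@example.com",
    "password": "changeme"
  }
}
```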

Run the code

To run the crawler, from the project root:

    cd crawler
    go run crawler.go

To run the webserver, from the project root:

    cd webserver
    go run webserver.go

Hopefully everything worked!
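To give a sense of what the crawler executable does once it is running — the crawler package runs periodically for each user and updates the database when it finds new results — here is a heavily simplified Go sketch of that loop. None of the types or function names below come from the actual codebase; they are stand-ins for illustration only.

```go
package main

import (
	"fmt"
	"time"
)

// user is a stand-in for the application's user entity (hypothetical fields).
type user struct {
	Code string
}

// fetchResults simulates fetching results from the UQAM website for one user.
// The real crawler would issue HTTP requests and parse the response instead.
func fetchResults(u user) []string {
	return []string{"INF1234: A+"}
}

// crawlAll runs a single crawl pass over every registered user.
func crawlAll(users []user) {
	for _, u := range users {
		results := fetchResults(u)
		fmt.Printf("user %s: %d result(s)\n", u.Code, len(results))
	}
}

func main() {
	users := []user{{Code: "DUPJ12345678"}}
	// The real crawler would tick on a configurable interval; a short one
	// is used here so the sketch terminates quickly.
	ticker := time.NewTicker(10 * time.Millisecond)
	defer ticker.Stop()
	for i := 0; i < 3; i++ {
		<-ticker.C
		crawlAll(users)
	}
}
```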

Directories

Path Synopsis
pkg
api
Package api provides entities for the application.
crawler
Package crawler is a crawler that runs periodically for each user and updates the database if it finds new results.
crypto
Package crypto provides utilities for various cryptographic functions.
store/crawlerconfig
Package crawlerconfig provides a Store interface for storing the crawler configuration.
store/mongo
Package mongo implements the interfaces for storing users, results and crawler config in a MongoDB datastore.
store/results
Package results provides a store interface for results.
store/user
Package user provides a store interface for users.
tools
Package tools provides various helpers.
webserver
Package webserver implements a JSON API for the client to be able to access results from the web.
ws
Package ws provides utilities to create webservices.
