
Auction Crawler API

Backend CRUD and web crawler to populate and store auction data for price comparison

View Demo · Report Bug · Request Feature

Table of Contents

  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgements

About The Project

project screenshot

This is the backend portion of a full-stack price-comparison web crawler. Puppeteer launches Chrome browser instances that navigate to government surplus auction listings and scrape keywords, prices, and URLs, with the filtering done by Cheerio. The extracted keywords are then sent to eBay's Finding API, which returns matching listing URLs and prices. All of this data is stored in a free hosted Redis instance, from which Express serves fetch requests coming from the frontend.
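
A minimal sketch of the crawl-and-filter step, assuming the puppeteer and cheerio packages; the URL and CSS selectors below are illustrative, not the ones used in app.js:

    const puppeteer = require('puppeteer');
    const cheerio = require('cheerio');

    async function crawlListings(url) {
      // Open a Chrome instance and load the auction listings page.
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: 'networkidle2' });
      const html = await page.content();
      await browser.close();

      // Filter the raw HTML with cheerio, pulling keyword, price, and URL.
      const $ = cheerio.load(html);
      const listings = [];
      $('.listing').each((i, el) => {
        listings.push({
          keyword: $(el).find('.title').text().trim(),
          price: $(el).find('.price').text().trim(),
          url: $(el).find('a').attr('href')
        });
      });
      return listings;
    }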

Here's why I built this:

- The ability to compare over 250 items at once saves a huge amount of time
- The backend can self-update listings as auctions expire or prices change
- Express is a great boilerplate solution for handling routing and fetch requests
- The cost of hosting is zero

Built With

  • Node.js / Express
  • Puppeteer
  • Cheerio
  • Redis
  • eBay Finding API

Getting Started

To get a local copy up and running, follow these steps.

Prerequisites

You will need Node.js and npm installed. To update npm to the latest version:

  • npm
    npm install npm@latest -g

Installation

  1. Clone the repo
    git clone https://github.com/kitoshi/auctioncrawl-api.git
  2. Install NPM packages
    npm install

Usage

  1. You will need a Redis host. Set "client" in app.js to your host and keep the password in a .env file (see the Redis sketch after this list).
    redis screenshot
  2. You will also need an auction site to crawl (see the urls & link in app.js). The one in the release is GCSurplus (the Canadian federal government's surplus site).
    gc surplus screenshot
  3. The last account needed is an eBay Developer account for access to the API. Replace SECURITY_APPNAME with your personal App ID string (see the Finding API sketch after this list).
    ebay screenshot
  4. Run with npm start. The console prints messages while the crawler is in progress; it is done after two OK replies from Redis.
  5. The server runs by default on http://localhost:9000/, with GET data available on the routes http://localhost:9000/crawlerAPI and http://localhost:9000/ebayAPI (see the Express sketch after this list).
  6. For production, the backend can be hosted on Google Cloud Platform using App Engine. I used their SDK to deploy to the cloud service from the command line in my code editor.
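
A minimal sketch of the Redis connection from step 1, assuming the dotenv and redis (v4+) packages; the environment variable names and key are illustrative and may not match app.js:

    require('dotenv').config();
    const { createClient } = require('redis');

    const client = createClient({
      socket: {
        host: process.env.REDIS_HOST,
        port: Number(process.env.REDIS_PORT)
      },
      password: process.env.REDIS_PASSWORD
    });

    async function storeListings(key, listings) {
      await client.connect();
      // Each crawl result set is stored as a JSON string under one key.
      const reply = await client.set(key, JSON.stringify(listings));
      console.log(reply); // logs "OK"; the crawl is done after two OK replies
      await client.quit();
    }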
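
A minimal sketch of the eBay lookup from step 3, assuming Node 18+ (global fetch); the response field names follow the Finding API's JSON format, and error handling is omitted:

    async function findEbayMatches(keyword) {
      const params = new URLSearchParams({
        'OPERATION-NAME': 'findItemsByKeywords',
        'SERVICE-VERSION': '1.0.0',
        'SECURITY-APPNAME': process.env.SECURITY_APPNAME, // your eBay App ID
        'RESPONSE-DATA-FORMAT': 'JSON',
        keywords: keyword
      });
      const res = await fetch(
        'https://svcs.ebay.com/services/search/FindingService/v1?' + params
      );
      const data = await res.json();
      const items =
        data.findItemsByKeywordsResponse[0].searchResult[0].item || [];
      // Keep just the listing URL and current price for the comparison.
      return items.map((it) => ({
        url: it.viewItemURL[0],
        price: it.sellingStatus[0].currentPrice[0].__value__
      }));
    }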
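
A minimal sketch of the Express routes from step 5; the Redis key names here are illustrative:

    const express = require('express');
    const { createClient } = require('redis');

    const app = express();
    const client = createClient({ url: process.env.REDIS_URL }); // illustrative

    // Scraped auction listings stored by the crawler.
    app.get('/crawlerAPI', async (req, res) => {
      const data = await client.get('crawlerData');
      res.json(JSON.parse(data));
    });

    // Matching eBay listings and prices.
    app.get('/ebayAPI', async (req, res) => {
      const data = await client.get('ebayData');
      res.json(JSON.parse(data));
    });

    client.connect().then(() => {
      app.listen(9000, () => console.log('Listening on http://localhost:9000/'));
    });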

Roadmap

See the open issues for a list of proposed features (and known issues).

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Robert K. Charlton - [email protected]

Project Link: https://github.com/kitoshi/auctioncrawl-api

Acknowledgements
