Merge pull request #477 from bda-research/beta
Some promotes and test update
mike442144 authored Aug 1, 2024
2 parents 7de2ef7 + 672df3b commit c9210c9
Showing 26 changed files with 338 additions and 322 deletions.
7 changes: 4 additions & 3 deletions README.md
@@ -6,7 +6,7 @@

------

- [![npm package](https://nodei.co/npm/crawler.png?downloads=true&downloadRank=true&stars=true)](https://www.npmjs.com/package/crawler/v/2.0.0)
+ [![npm package](https://nodei.co/npm/crawler.png?downloads=true&downloadRank=true&stars=true)](https://www.npmjs.com/package/crawler/v/2.0.2)

[![CircleCI](https://circleci.com/gh/bda-research/node-crawler/tree/master.svg?style=svg)](https://circleci.com/gh/bda-research/node-crawler/tree/master)
[![NPM download][download-image]][download-url]
@@ -246,6 +246,7 @@ crawler.send({
- [crawler.queueSize](#crawlerqueuesize)
- [Options](#options)
- [Global only options](#global-only-options)
+ - [`silence`](#silence)
- [`maxConnections`](#maxconnections)
- [`priorityLevels`](#prioritylevels)
- [`rateLimit`](#ratelimit)
@@ -520,13 +521,13 @@ items in the **crawler.add()** calls if you want them to be specific to that item
#### `retryInterval`

- **Type:** `number`
- - **Default** : 2000
+ - **Default** : 3000
- The number of milliseconds to wait before retrying.

#### `timeout`

- **Type:** `number`
- - **Default** : 15000
+ - **Default** : 20000
- The number of milliseconds to wait before the request times out.

#### `priority`
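The README hunks above raise the library's defaults for `retryInterval` (2000 → 3000 ms) and `timeout` (15000 → 20000 ms). A hedged sketch of where these options live, assuming the v2 `new Crawler(options)` / `crawler.add()` API that this README documents (the values shown equal the new defaults, so passing them explicitly is redundant and only marks where an override would go):

```javascript
// Sketch only, not from this diff: a node-crawler v2 configuration
// mirroring the defaults introduced in this commit.
import Crawler from "crawler";

const crawler = new Crawler({
    maxConnections: 10,  // global-only option
    retryInterval: 3000, // ms to wait before retrying (new default)
    timeout: 20000,      // ms before the request times out (new default)
    callback: (error, response, done) => {
        if (error) {
            console.error(error);
        } else {
            console.log(response.statusCode);
        }
        done(); // always signal completion back to the pool
    },
});

crawler.add("https://example.com"); // queue a single URL
```

Per the README section these hunks touch, both options can also be set on individual `crawler.add()` items to override the globals for that request.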
1 change: 0 additions & 1 deletion eslint.config.js
@@ -21,7 +21,6 @@ const options = [
files: ["src/**/*.ts"],
rules: {
"@typescript-eslint/no-explicit-any": "warn",
"@typescript-eslint/ban-types": "warn",
"@typescript-eslint/no-unused-vars": [
"error",
{
20 changes: 10 additions & 10 deletions package.json
@@ -1,6 +1,6 @@
{
"name": "crawler",
"version": "2.0.1",
"version": "2.0.2",
"description": "Crawler is a ready-to-use web spider that works with proxies, asynchrony, rate limit, configurable request pools, jQuery, and HTTP/2 support.",
"repository": {
"type": "git",
@@ -36,27 +36,27 @@
"license": "MIT",
"dependencies": {
"cheerio": "1.0.0-rc.12",
"got": "^14.4.1",
"got": "^14.4.2",
"hpagent": "^1.2.0",
"http2-wrapper": "^2.2.1",
"iconv-lite": "^0.6.3",
"seenreq": "^3.0.0",
"tslog": "^4.9.3"
},
"devDependencies": {
"@eslint/js": "^9.5.0",
"@eslint/js": "^9.8.0",
"@types/got": "^9.6.12",
"@types/node": "^20.14.8",
"@types/node": "^20.14.13",
"ava": "^6.1.3",
"c8": "^10.1.2",
"eslint": "~9.4.0",
"globals": "^15.6.0",
"eslint": "^9.8.0",
"globals": "^15.8.0",
"nock": "^13.5.4",
"sinon": "^18.0.0",
"tough-cookie": "^4.1.4",
"tsx": "^4.15.7",
"typescript": "^5.4.5",
"typescript-eslint": "8.0.0-alpha.30"
"tsx": "^4.16.3",
"typescript": "^5.5.4",
"typescript-eslint": "^8.0.0"
},
"ava": {
"files": [
@@ -77,4 +77,4 @@
],
"clean": true
}
- }
+ }
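One semantic detail in the devDependency hunks above: eslint moves from a tilde range (`~9.4.0`) to a caret range (`^9.8.0`). A tilde range only floats the patch version, while a caret range floats minor and patch versions too. A minimal sketch of the difference, using a hypothetical `satisfies` helper (not npm's real semver implementation) with simplified rules for plain `x.y.z` versions:

```javascript
// Hypothetical helper for illustration only. Ignores prerelease tags
// (e.g. "8.0.0-alpha.30") and the caret's special casing for 0.x majors.
function parse(version) {
    return version.split(".").map(Number);
}

function cmp(a, b) {
    for (let i = 0; i < 3; i++) {
        if (a[i] !== b[i]) return a[i] - b[i];
    }
    return 0;
}

function satisfies(range, version) {
    const op = range[0];
    const base = parse(op === "~" || op === "^" ? range.slice(1) : range);
    const v = parse(version);
    if (cmp(v, base) < 0) return false; // below the lower bound
    if (op === "~") return cmp(v, [base[0], base[1] + 1, 0]) < 0; // < next minor
    if (op === "^") return cmp(v, [base[0] + 1, 0, 0]) < 0;       // < next major
    return cmp(v, base) === 0; // bare version: exact pin
}

// "~9.4.0" only admits 9.4.x, so eslint could never float up to 9.8.0;
// "^9.8.0" admits anything from 9.8.0 up to (but not including) 10.0.0.
console.log(satisfies("~9.4.0", "9.8.0")); // false
console.log(satisfies("^9.8.0", "9.9.1")); // true
```

For real projects, npm's `semver` package implements the full rules, including prerelease handling such as the `8.0.0-alpha.30` pin this commit replaces.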
