Commit ffcef5c
Quick and dirty fix to test whether scrapy crawl fails.
The `scrapy crawl` command does not set its exit status based on errors raised
by the spider. This commit simulates a failing exit status by checking whether
any 'log_count/ERROR' entries appear in the crawl log.
This commit is expected to fail deliberately in Travis.
tzermias committed Sep 6, 2015
1 parent 97b5a78 commit ffcef5c
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -2,4 +2,4 @@ language: python
 python:
 - "2.7"
 install: pip install scrapy py-dateutil
-script: scrapy crawl diavgeia_spider
+script: scrapy crawl diavgeia_spider --logfile=test && [[ $(grep -c 'log_count/ERROR' test) == 0 ]] && cat test && rm test
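The one-line check in the Travis `script:` above can be sketched as a standalone script. This is a minimal illustration, not the committed code: `crawl.log` is a hypothetical log file name, and the log contents are simulated here rather than produced by a real `scrapy crawl` run.

```shell
#!/bin/sh
# Fail the build if the Scrapy log records any errors.
# Scrapy dumps a stats dictionary at the end of a crawl; a
# 'log_count/ERROR' entry in it means the spider hit errors, so its
# presence in the log is used as a proxy for a failing crawl.

LOGFILE=crawl.log   # hypothetical log path, for illustration only

# Simulate a clean crawl log (a real run would produce this via
# `scrapy crawl <spider> --logfile="$LOGFILE"`).
printf "INFO: Dumping Scrapy stats:\n{'log_count/INFO': 7}\n" > "$LOGFILE"

if grep -q 'log_count/ERROR' "$LOGFILE"; then
    echo "crawl failed: errors found in log"
    rm -f "$LOGFILE"
    exit 1
fi
echo "crawl ok"
rm -f "$LOGFILE"
```

One caveat of the committed one-liner: because `cat test` runs only when the error check passes, the log is never printed on a failing build, which is exactly when it would be most useful to see.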
2 changes: 1 addition & 1 deletion diavgeia/settings.py
@@ -10,7 +10,7 @@
 SPIDER_MODULES = ['diavgeia.spiders']
 NEWSPIDER_MODULE = 'diavgeia.spiders'
 ITEM_PIPELINES = {
-    'diavgeia.pipelines.DownloaderPipeline': 100
+#    'diavgeia.pipelines.DownloaderPipeline': 100
 }

# DownloaderPipeline settings
