Scrapy crawl error in virtual environment


I tried the Scrapy tutorial, but I cannot run the crawl command:

scrapy crawl quotes

The error I get:

-bash: /home/szendrei/.virtualenvs/scrapy-projects/bin/scrapy: No such file or directory

I'm in a virtual environment (created with virtualenvwrapper) with Python 3.8. Scrapy was installed with pip3 inside the virtual environment. which scrapy shows nothing, and pip3 show scrapy says:

Name: Scrapy
Version: 2.7.1
Summary: A high-level Web Crawling and Web Scraping framework
Home-page: https://scrapy.org
Author: Scrapy developers
Author-email:
License: BSD
Location: /home/szendrei/.local/lib/python3.8/site-packages
Requires: cryptography, cssselect, itemadapter, itemloaders, lxml, packaging, parsel, protego, PyDispatcher, pyOpenSSL, queuelib, service-identity, setuptools, tldextract, Twisted, w3lib, zope.interface
Required-by:
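
For reference, the Location line above points at ~/.local/lib/python3.8/site-packages, which is the user site-packages rather than the virtualenv. A rough set of checks, assuming the virtualenv is active (I am not sure these are the right ones):

type -a scrapy                  # where bash thinks the command lives, including any cached (hashed) path
hash -r                         # clear bash's command cache, in case it remembers an old location
pip3 -V                         # shows which pip answers and which Python it belongs to
python3 -m pip install scrapy   # installing through the venv's interpreter should put the script into the venv's bin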

scrapy startproject tutorial worked just fine.

I tried to reinstall, but pip reports the requirement is already satisfied.

What could be the solution?


1 Answer

Answered by Dóra Szendrei

I still don't know how, but I managed to solve the problem. I uninstalled Scrapy with pip3 uninstall scrapy, deleted the folders that had previously been created for the project, and installed it again. Now it works.
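
Roughly the commands that fixed it, assuming the virtual environment is active and the project folder was the tutorial one from startproject:

pip3 uninstall scrapy          # remove the existing install
rm -rf tutorial                # delete the previously generated project folder
pip3 install scrapy            # reinstall with the virtualenv active
scrapy startproject tutorial   # regenerate the project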