v3.6.3.0 PermissionError: [WinError 5] Access is denied #328

Open
caffeinatedMike opened this issue Jan 25, 2021 · 1 comment
Comments

@caffeinatedMike

I'm receiving a PermissionError using version 3.6.3.0 inside a virtual environment created by PyCharm. It appears to be some sort of issue in the queues.py file. Any idea how this can be resolved? I can't upgrade or downgrade Python versions.
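
To help isolate whether this is specific to scrapyscript or reproducible with billiard alone, here is a minimal sketch that exercises the same Queue.put() code path. It assumes billiard's top-level Process and Queue mirror the stdlib multiprocessing API (billiard is a multiprocessing fork); the _worker name is just for illustration:

from billiard import Process, Queue

def _worker(q):
    # Same code path as scrapyscript's self.results.put(self.items):
    # Queue.put() acquires the queue's bounded semaphore before pickling.
    q.put("hello from child")

if __name__ == "__main__":
    q = Queue()
    p = Process(target=_worker, args=(q,))
    p.start()
    print(q.get())  # should print "hello from child" if the put succeeds
    p.join()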

System details

Windows 10 64-bit
Python version: Python 3.7.8 (tags/v3.7.8:4b47a5b6ba, Jun 28 2020, 08:53:46) [MSC v.1916 64 bit (AMD64)] on win32

Traceback

Process Process-1:
Traceback (most recent call last):
  File "C:\Users\mhill\PycharmProjects\d_price_comparisons\venv\lib\site-packages\billiard\process.py", line 327, in _bootstrap
    self.run()
  File "C:\Users\mhill\PycharmProjects\d_price_comparisons\venv\lib\site-packages\billiard\process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\mhill\PycharmProjects\d_price_comparisons\venv\lib\site-packages\scrapyscript\__init__.py", line 73, in _crawl
    self.results.put(self.items)
  File "C:\Users\mhill\PycharmProjects\d_price_comparisons\venv\lib\site-packages\billiard\queues.py", line 87, in put
    if not self._sem.acquire(block, timeout):
PermissionError: [WinError 5] Access is denied
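
For context, the line that fails is billiard's back-pressure check: Queue.put() acquires the queue's internal BoundedSemaphore before any data is serialized, and it is that acquire, running in the child process, that raises WinError 5. The same cross-process acquire can be exercised in isolation with a sketch like the one below (again assuming billiard keeps the multiprocessing-style API; _child is an illustrative name):

from billiard import BoundedSemaphore, Process

def _child(sem):
    # In the traceback above, the equivalent acquire inside Queue.put()
    # is what raises PermissionError: [WinError 5] in the child process.
    sem.acquire(True, None)
    sem.release()

if __name__ == "__main__":
    sem = BoundedSemaphore(1)
    p = Process(target=_child, args=(sem,))
    p.start()
    p.join()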

Test Script

from scrapyscript import Job, Processor
from price_comparisons import KPricingSpider, WPricingSpider, DPricingSpider
import json

test_upc = "016000124790"
test_zipcode = "46224"
test_radius = "10"
spider_list = [KPricingSpider, WPricingSpider, DPricingSpider]
job_list = [
    Job(
        spidercls,
        upc=test_upc,
        zipcode=test_zipcode,
        radius=test_radius
    ) for spidercls in spider_list
]

# Create a Processor, optionally passing in a Scrapy Settings object.
processor = Processor(settings=None)

# Start the reactor, and block until all spiders complete.
data = processor.run(job_list)

# Print the consolidated results
print(json.dumps(data, indent=4))
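
One thing worth trying, whether or not it turns out to be the root cause here: on Windows, billiard (like the stdlib multiprocessing it is forked from) starts child processes by spawning a fresh interpreter and re-importing the main module, so code that launches processes is expected to sit behind an if __name__ == "__main__": guard. A guarded variant of the test script would look like this sketch (imports unchanged):

from scrapyscript import Job, Processor
from price_comparisons import KPricingSpider, WPricingSpider, DPricingSpider
import json

def main():
    test_upc = "016000124790"
    test_zipcode = "46224"
    test_radius = "10"
    spider_list = [KPricingSpider, WPricingSpider, DPricingSpider]
    job_list = [
        Job(
            spidercls,
            upc=test_upc,
            zipcode=test_zipcode,
            radius=test_radius
        ) for spidercls in spider_list
    ]
    processor = Processor(settings=None)
    data = processor.run(job_list)
    print(json.dumps(data, indent=4))

if __name__ == "__main__":
    # Required on Windows: the spawned child re-imports this module, and
    # the guard keeps the import from re-launching the jobs recursively.
    main()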
@auvipy
Member

auvipy commented Jan 29, 2021
