redis.exceptions.ConnectionError: Error 10061 connecting to localhost:6379. No connection could be made because the target machine actively refused it


Whenever I close PyCharm, reopen it, and run Celery and the WSGI server (gunicorn), I get the Redis error below, and I have to restart the Redis service to make my application work again.

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 264, in connect
    sock = self.retry.call_with_retry(
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\retry.py", line 46, in call_with_retry
    return do()
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 265, in <lambda>
    lambda: self._connect(), lambda error: self.disconnect(error)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 627, in _connect
    raise err
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 615, in _connect
    sock.connect(socket_address)
ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "D:\venvs\mlops_backend_venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "D:\venvs\mlops_backend_venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\applications.py", line 116, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\middleware\cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\routing.py", line 746, in __call__
    await route.handle(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\routing.py", line 75, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\routing.py", line 70, in app
    response = await func(request)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\fastapi\routing.py", line 299, in app
    raise e
  File "D:\venvs\mlops_backend_venv\lib\site-packages\fastapi\routing.py", line 294, in app
    raw_response = await run_endpoint_function(
  File "D:\venvs\mlops_backend_venv\lib\site-packages\fastapi\routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\starlette\concurrency.py", line 35, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "D:\venvs\mlops_backend_venv\lib\site-packages\anyio\_backends\_asyncio.py", line 2134, in run_sync_in_worker_thread
    return await future
  File "D:\venvs\mlops_backend_venv\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "C:\Users\raraj\PycharmProjects\NoCodeAI\backend\api\views\authentication_and_authorization\authentication\authentication.py", line 133, in wrap
    redis_login_data = connect.master_redis.get_val(key="login_data_{}".format(str(user_id)))
  File "C:\Users\raraj\PycharmProjects\NoCodeAI\backend\api\connection\utility.py", line 75, in get_val
    return self.r.get(key)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\commands\core.py", line 1829, in get
    return self.execute_command("GET", name)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\client.py", line 533, in execute_command
    conn = self.connection or pool.get_connection(command_name, **options)
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 1086, in get_connection
    connection.connect()
  File "D:\venvs\mlops_backend_venv\lib\site-packages\redis\connection.py", line 270, in connect
    raise ConnectionError(self._error_message(e))
redis.exceptions.ConnectionError: Error 10061 connecting to localhost:6379. No connection could be made because the target machine actively refused it.

What does this error mean, why is it happening, and how do I resolve it?

I am new to Redis and Celery, so I don't know much about them, except that Redis is a cache-holding server. Suggestions for simple, beginner-level documentation or any other sources available online would be very much appreciated.

I could have searched for myself, but someone who has already learned this has probably gone through several sources and found the better ones, which would save me the hassle. Thank you in advance.
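For reference, here is a minimal reachability check (a sketch using redis-py, with the host and port taken from the error message; the database index is an assumption) that I can run before starting Celery and the web server to confirm whether anything is listening on localhost:6379:

# Minimal reachability check with redis-py; host/port come from the error
# above (localhost:6379), the database index is an assumption.
import redis

r = redis.Redis(host="localhost", port=6379, db=0)
try:
    r.ping()  # returns True if the server answers
    print("Redis is reachable")
except redis.exceptions.ConnectionError as exc:
    # Same WinError 10061 "connection refused" case as in the traceback:
    # nothing is listening on localhost:6379, i.e. the Redis service
    # is not running yet.
    print(f"Redis is not reachable: {exc}")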


1 Answer

DejanLekic

Does Celery support Windows?

Answer: No.

Since Celery 4.x, Windows is no longer supported due to lack of resources.

But it may still work and we are happy to accept patches.
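As an illustration only (the module and task names below are assumptions, not taken from the question): Celery is pointed at the same local Redis instance as the broker, and on Windows the default prefork pool is the usual stumbling block, so starting the worker with the solo pool is a commonly used workaround even though the platform is unsupported.

# Hypothetical Celery wiring against the same local Redis; the module and
# task names are made up for illustration.
from celery import Celery

app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

@app.task
def add(x, y):
    return x + y

# On Windows, a commonly used (unofficial) workaround is the solo pool:
#   celery -A tasks worker --loglevel=info --pool=solo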