
All aio.FakeRedis instances share the same server #218

Closed
ijrsvt opened this issue Jul 26, 2023 · 5 comments · Fixed by #223
Labels: bug (Something isn't working)


ijrsvt commented Jul 26, 2023

Describe the bug

If I create two clients with fakeredis.aioredis.FakeRedis() (even with different hosts/ports) they share the same backend.

To Reproduce

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

# (inside an async function / event loop)
await FakeAsyncRedis(host="host_one", port=1000).set("30", "30")
assert await FakeAsyncRedis(host="host_two", port=2000).get("30") is None

Raises an AssertionError

Expected behavior

from fakeredis import FakeRedis

# Passes: clients with different hosts/ports use separate servers
FakeRedis(host="host_one", port=1000).set("30", "30")
assert FakeRedis(host="host_two", port=2000).get("30") is None

Desktop (please complete the following information):

  • OS: macOS + Linux
  • Python version: 3.7.11
  • fakeredis version: 2.17.0 (redis-py 4.3.4)
$ pip list | grep redis
aioredis                                 1.3.1
fakeredis                                2.17.0
hiredis                                  2.1.0
redis                                    4.3.4


ijrsvt added the bug label Jul 26, 2023
kamilglod commented

We're having the same problem. The bug was introduced in 2.17.0; in the previous version it works fine on our end.
Maybe it's because of this change: https://github.com/cunla/fakeredis-py/pull/213/files#diff-288db5468d2e9d706d7767762175c8b7f73e00a2601114e0be0c42b43d8f2843L66-R66, which reuses a default port instead of creating a new one each time.
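
A workaround that keeps instances isolated regardless of the default connection parameters is to give each client its own FakeServer explicitly. A minimal sketch, assuming the async client accepts the server keyword argument like the sync one does:

import asyncio

import fakeredis
from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

async def main():
    # Each client gets its own FakeServer, so no data is shared between them
    client_one = FakeAsyncRedis(server=fakeredis.FakeServer())
    client_two = FakeAsyncRedis(server=fakeredis.FakeServer())

    await client_one.set("30", "30")
    assert await client_two.get("30") is None

asyncio.run(main())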


netrix commented Nov 24, 2023

I am afraid this issue hasn't been fixed by the commit above. The change in the place pointed to by #218 (comment) wasn't reverted.

The problem still occurs in the newest version. Is this change of behavior intentional?


wenzhiTeo commented Dec 11, 2023

This issue might be related to the changes in d227138.

To offer a possible solution: I think the default host value in class FakeRedis(redis_async.Redis) should be removed. To test this, update the __init__ function with kwargs["host"] = uuid.uuid4().hex; the issue then no longer reproduces (see the sketch below).

(screenshot of the suggested change to FakeRedis.__init__)
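
One way to try this suggestion without modifying the library itself is a thin wrapper that injects a unique host; a hypothetical helper, shown only to illustrate the idea:

import uuid

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

class UniqueHostFakeAsyncRedis(FakeAsyncRedis):
    def __init__(self, *args, **kwargs):
        # If no host was given, fall back to a unique value so that clients
        # created without connection parameters do not share a server
        kwargs.setdefault("host", uuid.uuid4().hex)
        super().__init__(*args, **kwargs)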


For those facing the issue who need a solution immediately, you can override the __init__ function as below:

import fakeredis

class CustomFakeStrictRedis(fakeredis.FakeStrictRedis):
    def __init__(self, *args, server=None, version=(7,), **kwargs):
        # Ensure each FakeStrictRedis gets its own server instead of sharing one
        if server is None:
            server = fakeredis.FakeServer()

        super().__init__(*args, server=server, version=version, **kwargs)

Example case:

from unittest.mock import patch

class xxxxxTestCase(GraphQLTestCase):
    def setUp(self):
        super().setUp()
        self.loaders_get_redis_connection_patcher = patch(
            "xxx.loaders.get_redis_connection", CustomFakeStrictRedis
        )
        self.utils_get_redis_connection_patcher = patch(
            "xxxx.utils.get_redis_connection", CustomFakeStrictRedis
        )
        self.loaders_get_redis_connection_patcher.start()
        self.utils_get_redis_connection_patcher.start()

    def tearDown(self):
        super().tearDown()
        self.loaders_get_redis_connection_patcher.stop()
        self.utils_get_redis_connection_patcher.stop()

cunla (Owner) commented Dec 11, 2023

If you create two redis connections without supplying any connection parameters (host/port/...), they will use the defaults and therefore the same server.

FakeRedis mimics this behavior; I am not sure I understand the issue...

Please provide a test where you supply different connection parameters and the result is two connections sharing the same server instance.
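
A minimal version of such a test, mirroring the reproduction at the top of the issue (a sketch only; it assumes pytest with pytest-asyncio is installed):

import pytest

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

@pytest.mark.asyncio
async def test_different_connection_params_use_separate_servers():
    client_one = FakeAsyncRedis(host="host_one", port=1000)
    client_two = FakeAsyncRedis(host="host_two", port=2000)

    await client_one.set("30", "30")
    # Expected: a different host/port pair means a different fake server
    assert await client_two.get("30") is None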


wenzhiTeo commented Dec 12, 2023

If you create two redis connections without supplying any connection parameters (host/port/...) it will use the defaults and therefore will use the same server.

You are right. But in some cases we may not provide any connection parameters, just like in the example I provided.

We might do:

self.loaders_get_redis_connection_patcher = patch(
    "xxx.loaders.get_redis_connection", CustomFakeStrictRedis
)
# patch() takes the class itself, so we cannot pass an already-initialized instance

(This is probably a rare case.)


With fakeredis 2.16.0 + redis 5.0.0rc1, redis connections created without any parameters are unique:

  • the host and port connection parameters were not introduced yet

But with fakeredis 2.20.0 + redis 5.1.0a1, redis connections created without any parameters are identical:

  • this mixes up the results between the two redis clients

(screenshot comparing the two fakeredis versions)
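
In code, the comparison above boils down to something like this (a sketch; which value is printed depends on the installed fakeredis/redis versions, per the description above):

import asyncio

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

async def main():
    r1 = FakeAsyncRedis()
    r2 = FakeAsyncRedis()

    await r1.set("key", "value")
    # fakeredis 2.16.0 + redis 5.0.0rc1: prints None (separate servers)
    # fakeredis 2.20.0 + redis 5.1.0a1: prints b'value' (shared default server)
    print(await r2.get("key"))

asyncio.run(main())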

I suspect the reason is the default host value set in aioredis.py::FakeRedis, and that this default might not be needed, because from the source code I notice that _server.py::FakeRedisMixin.__init__ is able to auto-generate a unique/random host value with kwargs.setdefault("host", uuid.uuid4().hex).


Probably this is not an issue anymore, if the current version of fakeredis intentionally requires connection parameters in order to get unique redis connections.

That means whenever a unique redis connection is needed, we need to do this:

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis
x = FakeAsyncRedis(host="host_one", port=1000)
y = FakeAsyncRedis(host="host_two", port=2000)

instead of

from fakeredis.aioredis import FakeRedis as FakeAsyncRedis
x = FakeAsyncRedis()
y = FakeAsyncRedis()

Then, for cases where we do not initialize the fake redis connection ourselves (for example when patching with a class), we will need to override the __init__ function, as in the sketch below.
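
For completeness, the same override pattern applied to the async client (a minimal sketch, assuming the async FakeRedis constructor also accepts the server keyword argument):

import fakeredis
from fakeredis.aioredis import FakeRedis as FakeAsyncRedis

class CustomFakeAsyncRedis(FakeAsyncRedis):
    def __init__(self, *args, server=None, **kwargs):
        # Give every instance its own FakeServer so no state is shared
        if server is None:
            server = fakeredis.FakeServer()
        super().__init__(*args, server=server, **kwargs)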
