For starters, I've been quite pleased with both the Node implementation and the Python port of Socket.IO, except for one hiccup I'm running into while building a proof of concept: I cannot for the life of me get my services to emit into Socket.IO processes that aren't written in the same language.
Some background: I have a microservice architecture, and would like to vend sockets via one API (hoping for flask-socketio), but I need to be able to feed those sockets with data from many disjoint back-end services written in different environments (Node, Golang, and Python emitters).
I've created a repo with a demo to showcase the issues I'm encountering. The repo contains 4 sub-directories, each intended to be its own running service. An emitter process and a Socket.IO server have been included for both Python and Node. The desired outcome is to see the pings from both the Python and Node emitters in all of the clients served by the Python and Node servers.
At first glance, Socket.IO looks perfect, given its built-in support for a backing Redis layer for horizontal scaling and delivery across multiple nodes. I've got messages properly emitting through multiple horizontally scaled Node Socket.IO services, but only when the emit originates from a Node process, and I've got messages emitting between multiple Python servers when Python originates the event. But I cannot for the life of me get the two environments to handle each other's emitted messages. I've monitored Redis to watch all pub/sub operations and can see that each side properly receives the messages emitted by the other environment, but neither can decode the other's data back to its original contents.
node-emitter -> flask-socketio results in the `socketio.RedisManager` dropping the messages in its internal `_thread()` function due to a utf-8 encoding issue. It quietly drops these messages, so it took me a while to find where this was occurring, since debugging with eventlet monkey patching enabled seems to require some extra finesse. In my hunt for an answer, I saw all of the issues surrounding the 1.x Socket.IO utf-8 double-encoding bug, but seeing as I'm on the latest versions of everything, I'm not sure why I'm seeing this behavior.
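For what it's worth, the silent drop looks consistent with the Python side receiving raw msgpack binary (Node's adapter uses notepack) and trying to treat it as text. A stdlib-only sketch of why those bytes can't round-trip — the sample payload below is hand-encoded msgpack for `{"type": 2}`, an assumption standing in for a real captured message:

```python
import pickle

# Hand-encoded msgpack for {"type": 2} -- an assumed stand-in for a real
# socket.io-redis (notepack) payload, which is likewise raw binary.
node_payload = b'\x81\xa4type\x02'

# msgpack's fixmap/fixstr markers sit above 0x7f, so utf-8 decoding fails,
# matching the quiet drop inside RedisManager._thread().
try:
    node_payload.decode('utf-8')
    utf8_error = None
except UnicodeDecodeError as exc:
    utf8_error = exc.reason

# ... and the bytes are not a pickle either, so the Python side has no way
# to recover the original event from them.
try:
    pickle.loads(node_payload)
    pickle_error = None
except Exception as exc:
    pickle_error = type(exc).__name__

print(utf8_error, pickle_error)
```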
Just for a sanity check, I also built the flow in the other direction, python-emitter -> node-socketio ... which results in node crashing out on the object decode.
Stack trace:
```
/Users/jon/Desktop/real-socks/node-sock/node_modules/notepack.io/lib/decode.js:246
    throw new Error((buffer.length - decoder.offset) + ' trailing bytes');
    ^

Error: 204 trailing bytes
    at Object.decode (/Users/jon/Desktop/real-socks/node-sock/node_modules/notepack.io/lib/decode.js:246:11)
    at Redis.onmessage (/Users/jon/Desktop/real-socks/node-sock/node_modules/socket.io-redis/index.js:148:24)
    at RedisClient.emit (events.js:188:13)
    at return_pub_sub (/Users/jon/Desktop/real-socks/node-sock/node_modules/redis/index.js:796:18)
    at RedisClient.return_reply (/Users/jon/Desktop/real-socks/node-sock/node_modules/redis/index.js:833:9)
    at JavascriptRedisParser.returnReply (/Users/jon/Desktop/real-socks/node-sock/node_modules/redis/index.js:192:18)
    at JavascriptRedisParser.execute (/Users/jon/Desktop/real-socks/node-sock/node_modules/redis-parser/lib/parser.js:574:12)
    at Socket.<anonymous> (/Users/jon/Desktop/real-socks/node-sock/node_modules/redis/index.js:274:27)
    at Socket.emit (events.js:188:13)
    at addChunk (_stream_readable.js:288:12)
```
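That "trailing bytes" error is consistent with msgpack reading a pickle frame. A stdlib-only sketch (the exact message keys below are an assumption, not python-socketio's real wire format):

```python
import pickle

# Roughly what a python-socketio RedisManager publish looks like on the
# wire: the manager pickles its message dict (keys here are assumed).
payload = pickle.dumps({'method': 'emit', 'event': 'ping', 'data': 'hello'})

# pickle protocol 2+ frames begin with the PROTO opcode, 0x80 ...
first_byte = payload[0]
print(hex(first_byte))

# ... which msgpack happens to read as a zero-element fixmap: a complete,
# one-byte value. notepack then reports everything after it as leftovers,
# e.g. "Error: 204 trailing bytes" for a 205-byte payload.
print(len(payload) - 1, 'trailing bytes')
```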
You can get both of the provided Node processes running with `npm install` and `node index.js`. To get both of the Python services up, just run `pip3 install -r requirements.txt` and `python3 app.py`. The Node service runs on port 3000, the Python service runs on port 5000, both expect Redis to be available on the default port 6379, and each service uses the Redis channel `socket.io#/#` as its pub/sub key.
To see the versions installed, check the committed package-lock.json and requirements.txt files, respectively.
I'd docker-compose this for you, but I've sunk quite a bit of time into this already. Hoping I've just made a simple mistake.

Thanks in advance.

Here's the Code.
The redis pub/sub format is not part of the Socket.IO protocol. Node uses its own format, different from the one used by this server. So basically, if you horizontally scale your servers, they all need to be the same server, or in other words, all the servers must be Python, or all the servers must be Node.
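Given that constraint, one way to still fan out from Go/Node/Python workers is to sidestep the adapter formats entirely: have every emitter publish a language-neutral JSON envelope on a plain Redis channel, and have each Socket.IO server subscribe to that channel and re-emit locally through its own manager. A minimal sketch of such an envelope — the channel name and keys are assumptions, not anything either library defines:

```python
import json

NEUTRAL_CHANNEL = 'events:broadcast'  # hypothetical channel name

def encode_event(event, data, room=None):
    """Language-neutral envelope any emitter (Go, Node, Python) can produce."""
    return json.dumps({'event': event, 'data': data, 'room': room})

def decode_event(raw):
    """Each Socket.IO server decodes this and re-emits to its own clients."""
    msg = json.loads(raw)
    return msg['event'], msg['data'], msg.get('room')

wire = encode_event('ping', {'source': 'go-worker'})
event, data, room = decode_event(wire)
print(event, data, room)  # ping {'source': 'go-worker'} None
```

Each server still delivers only to its own connected clients, but since every server subscribes to the same neutral channel, all clients see every event regardless of which language emitted it.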