Session Replay not showing any replays #2002
Comments
Same here... But it worked yesterday, so I'll try using one of the previously pushed snuba images. |
Just to be sure, you ran the install script after pulling master so your images are up to date? |
Same issue here, I only saw the first session
|
Yes. In fact I created a fresh server build and installed Sentry again from scratch to see if that works. Replays worked! ...for like 2 minutes, then snuba started throwing these errors again. |
I've attempted to reproduce this with the react SDK. Replays have been going through for over 30 minutes and nothing is breaking 🤔. I've dropped a line internally so this will be looked at shortly by the team that owns Session Replay |
For reference, the first env I tested in was using the Vue.js SDK, while the new instance I pulled up on a fresh server was hooked up with two projects, one using Vue, the other React. |
Re-deploying older images doesn't fix the problem. |
I haven't tested it yet, but I believe it only stopped working for me when I caught a REST API exception. I save parts of the JSON output and send it to Sentry via extra data as a key:value pair. I'm guessing that this is what might be causing issues in Replay. I'll get around to testing it later today when I get some time. |
Interesting indeed. I use the same approach, where I store the JSON output of the REST call as extra data on the JS SDK Sentry object. I'm guessing Snuba doesn't escape the JSON string correctly when trying to store the data in Clickhouse. |
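For context, a minimal sketch of the pattern described in these two comments, i.e. attaching the JSON body of a REST call to an event as extra data (assuming the @sentry/browser v7 scope API; the key name and payload below are made up for illustration):
```js
import * as Sentry from "@sentry/browser";
// Assumes Sentry.init({...}) has already been called elsewhere.

// Hypothetical response body captured from an XHR/REST call.
const responseJson = {
  status: 422,
  body: { errors: [{ field: "email", message: "already taken" }] },
};

try {
  throw new Error("REST API call failed");
} catch (err) {
  // Attach the raw JSON as extra data; nested objects like this show up
  // in the "Additional Data" section of the resulting issue.
  Sentry.withScope((scope) => {
    scope.setExtra("api_response", responseJson);
    Sentry.captureException(err);
  });
}
```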
Can confirm this is happening even with Replay on sentry.io, so it's not really limited to the self-hosted version of Replays. Where would be the correct place to report this? @hubertdeng123 Can you forward this new info to that internal ticket? |
@edgolub @TsubasaBE could you provide an example of a JSON structure you're storing (PII removed if adding here)? is it nested? |
Hi @JoshFerge, yes, it can be nested in some cases. I save both the request and response JSON for my XHR requests as extra data for easy lookup. These are added to the "Additional Data" section of an Issue: |
Thanks for that information. And on your JavaScript SDK, what methods are you calling to set these values? Also, which version of the JavaScript SDK are you on? |
And I'm noticing that we ingested the replay on our side -- we don't currently support adding extra fields onto a replay, but I do see that we ingested your replay, so I don't think we are actually erroring out in production. Does that make sense? |
I have exactly the same problem. I saw a replay, was excited, but then snuba-replays-consumer runs into the problem and replays stop working. |
@JoshFerge I've actually sent multiple sessions, not just that first one; the first replay is simply the only one that is visible. I disabled extra data for our issue events, and I did see a new replay a few hours ago in our self-hosted Sentry. |
@DarkByteZero I got 2 replays that crash in Self-Hosted sent to this project in SaaS: https://sentry-sdks.sentry.io/replays/?project=4504833789001728 |
I have the exact same logs after trying in dev the replays with SDK from last week ( We use various extra properties with nested objects and breadcrumbs with objects as well, so not sure with the current trace what is causing the crash. Let me know if you need more details or debug from this service.
|
Same error here. I somehow got one replay from production. There are no errors attached to it, just a random session. But that's it, and it's been running for 4 hours. Even with session sampling set to 1.0 in dev I still wasn't getting anything. I'm assuming the recording data is compressed? After all the event info and context it's just garbled characters in the browser network tab and the Postman interceptor. I might set session sampling to 1 on production and see if I get any more replays. Maybe even some from my own session so I can look at the data it's sending. |
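On the compression question: the replay SDK is understood to deflate-compress recording payloads before upload, which is why the request body looks like binary noise in the network tab. A rough illustration of the idea (using pako directly is an assumption here; the SDK's actual internals and wire format may differ):
```js
import * as pako from "pako";

// Recording events are JSON text...
const events = JSON.stringify([{ type: 3, timestamp: Date.now(), data: {} }]);

// ...but get deflate-compressed before being sent, so devtools shows
// "garbled characters" rather than readable JSON.
const compressed = pako.deflate(new TextEncoder().encode(events));

// Decompressing recovers the original JSON.
const restored = new TextDecoder().decode(pako.inflate(compressed));
console.log(restored === events); // true
```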
Yeah, one replay goes through, after that, it dies. But I provided 2 Replays to help, maybe it will be fixed soon. |
Hopefully. The one replay I got looks awesome. I can't wait to actually use it. |
I had one Replay appear on the dashboard, and no further replays after that. I also upgraded to the latest release. I look forward to using it as that replay looked very useful. |
Got the same error here; only one recording came through, then nothing. |
Was able to reproduce and put a PR fix up. Had trouble replicating because of the Clickhouse version, but figured it out. See getsentry/snuba#3878 |
We are currently avoiding upgrading Clickhouse for self-hosted users due to the risk that data ingested on 20.3 may not be compatible with Clickhouse 21.8. So please be aware of this risk; it'd be helpful to know if any of you experience issues after upgrading to 21.8. |
I mean, I have no problems, but I am in fact missing some issue events; I don't know if that is because of Clickhouse. They are just gone, lol. If I open URLs from emails, they just say: "The issue you were looking for was not found." |
I just upgraded clickhouse a few hours ago, and not seeing any issues at all so far. Session Replays works now! So cool! |
I haven't seen any issues as of yet, other than some errors to do with offsets in the ingester for replays. That was fixed with a command for clearing offsets for a specific topic. There's a possibility of data loss with that, but I think it's mainly limited to that one specific topic, and I didn't have any legit data to begin with, so the side effects are unknown to me.
|
It did not fix the replays issue for me. Looking forward to the official release. |
Check your consumer and ingester logs. It could be the same thing I had, where the offset was invalid. If you have errors that are crashing it, let me know and I can dig up the docker command.
|
I get [WARNING] root: Recording segment was already processed. Is this relevant? |
Are you resending the request? Or is that from just normal recording? This is similar to the error I was getting, but mine said something along the lines of ```Partitions revoked: [Partition(topic=Topic(name='ingest-recordings'), index=0)]``` instead of ```Partitions revoked: [Partition(topic=Topic(name='outcomes'), index=0)]``` (#1894). If you find something similar in one of the consumer logs, I can send you the command. I would send it now, but I don't have my laptop on.
|
I didn't get a message like that. I am thinking of reinstalling. |
So far I haven't noticed any issues. Replays are working great. |
Replays are working great for me too now that I've reinstalled. Thanks all, and I hope the team gets this integrated into the release. |
Yep! This will be included in the April CalVer release of self-hosted :) @jonhassall |
We've decided to cut a 23.3.1 release to fix this for you all. This will go out later this week |
v23.3.1 is now out. The issue is still open, but it's just pending some documentation PR reviews now. |
Hi, I'm experiencing a lot of issues using that version of ClickHouse. How can I restore this or make it work again? The error I'm receiving is this:
|
Hi, thanks for the update. I'm still having issues with the Kafka topic "ingest-replay-events" and the replay-consumer container (it keeps restarting), along with the offset-out-of-range error. How can I clear the content? Thanks. Log of the snuba-replays-consumer container:
EDIT: found a fix thanks to #478 (comment)
After that, replays work without problems. |
Unfortunately, if you've upgraded to Clickhouse 21.8, we don't have a recommendation on how to restore your previous data unless you have a backup of your docker volumes. You may need to consider a clean install otherwise |
great to hear that it's working for you @alexphili |
Not even a fresh install after removing all volumes worked. I executed the script to test integrations and that changed something I was never able to fix by reinstalling, so I ended up rebuilding the whole server with a fresh installation, and now everything works perfectly. |
Going to close this as it seems to be resolved now. |
Hi, please confirm that with this config replays should still be sent on errors. As soon as I set replaysSessionSampleRate = 0 (it was more than 0 for testing), replays stopped coming entirely; errors still come in as usual, but they don't have replays attached. I only want sessions with errors, I'm not interested in normal replays. Thanks :) |
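For reference (the commenter's actual config was not captured above), a minimal sketch of an errors-only replay setup, assuming the @sentry/browser v7 Replay integration; the DSN is a placeholder:
```js
import * as Sentry from "@sentry/browser";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  integrations: [new Sentry.Replay()],
  // Never record plain sessions...
  replaysSessionSampleRate: 0,
  // ...but always capture a replay when an error occurs.
  replaysOnErrorSampleRate: 1.0,
});
```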
@sashamorozov The issue you are describing is possibly an SDK problem, and not related to this issue. Please report it in https://github.com/getsentry/sentry-javascript if you are still having issues after upgrading your SDK to the latest version (7.45.0). Thanks!
Self-Hosted Version
23.3.0.dev0
CPU Architecture
x86_64
Docker Version
20.10.21
Docker Compose Version
1.29.2
Steps to Reproduce
Install self-hosted Sentry from the master branch.
Expected Result
The Replays list in the Sentry dashboard is not showing anything for the project. After checking logs, I see a couple of errors in snuba-replays-consumer, attached below.
Actual Result
snuba-replays-consumer_1 | 2023-03-01T08:32:52.220558504Z snuba.clickhouse.errors.ClickhouseWriterError: Cannot parse JSON string: expected opening quote: (while read the value of key title): (at row 1)
snuba-replays-consumer_1 | 2023-03-01T08:32:53.236009713Z 2023-03-01 08:32:53,235 Initializing Snuba...
snuba-replays-consumer_1 | 2023-03-01T08:32:56.125327791Z 2023-03-01 08:32:56,125 Snuba initialization took 2.8903919691219926s
snuba-replays-consumer_1 | 2023-03-01T08:32:56.627888461Z 2023-03-01 08:32:56,627 Initializing Snuba...
snuba-replays-consumer_1 | 2023-03-01T08:32:59.388766641Z 2023-03-01 08:32:59,388 Snuba initialization took 2.761911698617041s
snuba-replays-consumer_1 | 2023-03-01T08:32:59.396832190Z 2023-03-01 08:32:59,396 Consumer Starting
snuba-replays-consumer_1 | 2023-03-01T08:32:59.397231560Z 2023-03-01 08:32:59,397 librdkafka log level: 6
snuba-replays-consumer_1 | 2023-03-01T08:33:02.245125423Z 2023-03-01 08:33:02,244 New partitions assigned: {Partition(topic=Topic(name='ingest-replay-events'), index=0): 21}
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056774048Z 2023-03-01 08:33:04,055 Caught exception, shutting down...
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056819569Z Traceback (most recent call last):
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056827338Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/processor.py", line 175, in run
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056832589Z self._run_once()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056837249Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/processor.py", line 215, in _run_once
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056842118Z self.__processing_strategy.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056846189Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/dead_letter_queue/dead_letter_queue.py", line 39, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056867938Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056872649Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/run_task.py", line 62, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056876729Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056880258Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/reduce.py", line 140, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056884258Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056887909Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/run_task.py", line 127, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056892018Z result = future.result()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056895678Z File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 437, in result
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056899538Z return self.__get_result()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056903438Z File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056907438Z raise self._exception
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056911038Z File "/usr/local/lib/python3.8/concurrent/futures/thread.py", line 57, in run
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056915129Z result = self.fn(*self.args, **self.kwargs)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056919329Z File "/usr/src/snuba/snuba/consumers/strategy_factory.py", line 120, in flush_batch
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056923878Z message.payload.close()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056928089Z File "/usr/src/snuba/snuba/consumers/consumer.py", line 288, in close
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056932449Z self.__insert_batch_writer.close()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056936629Z File "/usr/src/snuba/snuba/consumers/consumer.py", line 127, in close
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056941158Z self.__writer.write(
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056945009Z File "/usr/src/snuba/snuba/clickhouse/http.py", line 328, in write
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056949098Z batch.join()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056952978Z File "/usr/src/snuba/snuba/clickhouse/http.py", line 266, in join
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056957389Z raise ClickhouseWriterError(message, code=code, row=row)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.056961458Z snuba.clickhouse.errors.ClickhouseWriterError: Cannot parse JSON string: expected opening quote: (while read the value of key title): (at row 1)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.065375948Z 2023-03-01 08:33:04,065 Closing <arroyo.backends.kafka.consumer.KafkaConsumer object at 0x7fc5cb4fda00>...
snuba-replays-consumer_1 | 2023-03-01T08:33:04.065903508Z 2023-03-01 08:33:04,065 Partitions revoked: [Partition(topic=Topic(name='ingest-replay-events'), index=0)]
snuba-replays-consumer_1 | 2023-03-01T08:33:04.067680488Z 2023-03-01 08:33:04,067 Processor terminated
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072085497Z Traceback (most recent call last):
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072105348Z File "/usr/local/bin/snuba", line 33, in <module>
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072897027Z sys.exit(load_entry_point('snuba', 'console_scripts', 'snuba')())
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072924827Z File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072941897Z return self.main(*args, **kwargs)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.072946087Z File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1055, in main
snuba-replays-consumer_1 | 2023-03-01T08:33:04.073385877Z rv = self.invoke(ctx)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.073402197Z File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074031247Z return _process_result(sub_ctx.command.invoke(sub_ctx))
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074044957Z File "/usr/local/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074668317Z return ctx.invoke(self.callback, **ctx.params)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074683677Z File "/usr/local/lib/python3.8/site-packages/click/core.py", line 760, in invoke
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074689787Z return __callback(*args, **kwargs)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.074694847Z File "/usr/src/snuba/snuba/cli/consumer.py", line 189, in consumer
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075384857Z consumer.run()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075398927Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/processor.py", line 175, in run
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075405047Z self._run_once()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075409717Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/processor.py", line 215, in _run_once
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075414377Z self.__processing_strategy.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075418757Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/dead_letter_queue/dead_letter_queue.py", line 39, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075423727Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075428297Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/run_task.py", line 62, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075432817Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.075437377Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/reduce.py", line 140, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076195707Z self.__next_step.poll()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076210517Z File "/usr/local/lib/python3.8/site-packages/arroyo/processing/strategies/run_task.py", line 127, in poll
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076216767Z result = future.result()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076220997Z File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 437, in result
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076225547Z return self.__get_result()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076230177Z File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076234877Z raise self._exception
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076239027Z File "/usr/local/lib/python3.8/concurrent/futures/thread.py", line 57, in run
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076965347Z result = self.fn(*self.args, **self.kwargs)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.076981907Z File "/usr/src/snuba/snuba/consumers/strategy_factory.py", line 120, in flush_batch
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077002947Z message.payload.close()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077007447Z File "/usr/src/snuba/snuba/consumers/consumer.py", line 288, in close
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077011197Z self.__insert_batch_writer.close()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077014777Z File "/usr/src/snuba/snuba/consumers/consumer.py", line 127, in close
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077018467Z self.__writer.write(
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077021987Z File "/usr/src/snuba/snuba/clickhouse/http.py", line 328, in write
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077648317Z batch.join()
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077660327Z File "/usr/src/snuba/snuba/clickhouse/http.py", line 266, in join
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077663597Z raise ClickhouseWriterError(message, code=code, row=row)
snuba-replays-consumer_1 | 2023-03-01T08:33:04.077666007Z snuba.clickhouse.errors.ClickhouseWriterError: Cannot parse JSON string: expected opening quote: (while read the value
Event ID
No response