Aspects dashboards do not reflect page views, problems, or video views

Hello

In my Aspects dashboards I cannot see the progress of page views, problems, or video views, neither in the Engagement tab nor in the Performance tab.

I’ve installed the tutor-contrib-aspects plugin in a staging environment connected to a remote ClickHouse instance. I’ve run all the commands suggested in the Data Population Options section of the Tutor plugin documentation, and they all completed properly.

Any suggestions here? I believe I am missing something, but I’m not sure what…

Thanks in advance!
Regards

Hi @Yago , I think this is probably the same issue as the other one you posted. However, if you go through the course and watch videos and answer some problems does any of the new data populate? If so, I think we can continue to troubleshoot in your other thread. If not, can you check the lms-worker and ralph container logs for any errors?
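If it helps, here is a minimal sketch of one way to sift captured container log output (e.g. saved from `tutor local logs lms-worker ralph`) for lines worth a closer look. The sample lines and the level keywords are just illustrative assumptions, not anything Aspects-specific:

```python
import re

# Log levels / markers that usually deserve attention.
LEVELS = re.compile(r"\b(ERROR|CRITICAL|WARNING|Traceback)\b")

def suspicious_lines(lines):
    """Return only the log lines whose level suggests a problem."""
    return [line for line in lines if LEVELS.search(line)]

# Illustrative sample, shaped like typical lms-worker output:
sample = [
    "2025-08-25 12:25:53,315 INFO 13 [celery.app.trace] task succeeded",
    "2025-08-25 12:25:54,310 WARNING 13 [edx_toggles] flag accessed without a request",
]
print(suspicious_lines(sample))  # only the WARNING line survives
```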

Best,
Ty

Hi @TyHob, thanks for your reply. I followed your instructions to fix the tracking log file mounts, but I’m still having some issues with charts in dashboards. It’s a bit weird because it seems that data is being populated properly, but for some reason the charts are not being displayed. You can see an example in the following screenshot:

While this is common to almost every course I’ve been testing, there is one course that shows the dashboard properly! I have checked its configuration, calendar, etc., but didn’t see anything special compared to the others. See the screenshot below:

I’m not sure if it could be related, but you can see that there’s an error displayed in the browser’s console (top right of the image): Superset returns a 403 (Forbidden) when trying to retrieve something from the Superset API. I noticed that it uses secure https while my Django OAuth Toolkit application configuration is on http (insecure), but I don’t know why…

I’ve searched for errors in the lms-worker and ralph containers, but I don’t see anything strange, except that user and IP are always None. As an example, here’s a 13-line log excerpt covering one event in lms-worker and ralph:

tutor_local-lms-worker-1  | 2025-08-25 12:25:53,315 INFO 13 [celery.app.trace] [user None] [ip None] trace.py:128 - Task event_routing_backends.tasks.dispatch_bulk_events[64940644-e3e2-4e04-9018-d7b6e55be141] succeeded in 9.08336120378226s: None
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,333 INFO 1 [celery.worker.strategy] [user None] [ip None] strategy.py:161 - Task eventtracking.tasks.send_event[7313923e-10be-424e-ab70-771b8472d1a1] received
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,374 INFO 13 [celery.app.trace] [user None] [ip None] trace.py:128 - Task lms.djangoapps.gating.tasks.task_evaluate_subsection_completion_milestones[0a0956ac-95db-4de7-bac2-6eb2ee295f15] succeeded in 0.054048920050263405s: None
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,386 INFO 13 [event_routing_backends.backends.events_router] [user None] [ip None] events_router.py:234 - Event edx.completion.block_completion.changed has been queued for batching. Queue size: 1
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,390 INFO 13 [event_routing_backends.backends.events_router] [user None] [ip None] events_router.py:234 - Event edx.completion.block_completion.changed has been queued for batching. Queue size: 1
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,395 INFO 13 [celery.app.trace] [user None] [ip None] trace.py:128 - Task eventtracking.tasks.send_event[ca8d66da-1766-4c31-9b10-a997a7f6c549] succeeded in 0.01478326041251421s: None
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,405 INFO 13 [event_routing_backends.backends.events_router] [user None] [ip None] events_router.py:234 - Event problem_check has been queued for batching. Queue size: 2
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,409 INFO 13 [event_routing_backends.backends.events_router] [user None] [ip None] events_router.py:234 - Event problem_check has been queued for batching. Queue size: 2
tutor_local-lms-worker-1  | 2025-08-25 12:25:53,414 INFO 13 [celery.app.trace] [user None] [ip None] trace.py:128 - Task eventtracking.tasks.send_event[7313923e-10be-424e-ab70-771b8472d1a1] succeeded in 0.014985626563429832s: None
tutor_local-lms-worker-1  | 2025-08-25 12:25:54,310 WARNING 13 [edx_toggles.toggles.internal.waffle.flag] [user None] [ip None] flag.py:79 - Grades: Flag 'grades.enforce_freeze_grade_after_course_end' accessed without a request, which is likely in the context of a celery task.
tutor_local-ralph-1       | 2025-08-25 12:25:54,375 INFO:     Inserted a total of 1 documents with success
tutor_local-ralph-1       | 2025-08-25 12:25:54,375 INFO:     Indexed 1 statements with success
tutor_local-ralph-1       | 2025-08-25 12:25:54,376 INFO:     172.23.0.15:36416 - "POST /xAPI/statements HTTP/1.1" 200 OK

Thanks in advance for all the help

Best

Those logs seem ok, and I don’t think the 403 is important (I suspect it’s just sending Superset usage data). Are you able to see other charts updating as you do various things in the LMS? Note that events don’t get sent from Studio, and I’m not sure they do in masquerade mode either. Is it just the “Page Engagement per Section / Subsection” chart that’s empty?

I know @Sara_Burns has been looking into issues around some of that and may have further information if you can answer those questions.

Hi @Yago - following what Ty said - this is a known issue that I’m fixing as we speak. The top chart is using a different data set than the other charts on that page so there is a mismatch in data. This is also the case with the ‘Problem Engagement’ charts. Once we release a new version of Aspects, these issues should be fixed.

Hi @TyHob @Sara_Burns, thanks for your quick reply! I feel more confident now with my environment configuration after reading your answers.

My Superset instance surprisingly started showing charts this morning, without me running any new command 🤷‍♂️. I can’t tell why; maybe the data needs to be old enough to be shown? Or maybe there’s a scheduled job that hadn’t run yet?

Anyway, the issue is solved :clap:

@Sara_Burns, looking forward to that new Aspects release. Just in case, will it require a Tutor version higher than 19.0.5?

Thanks very much for your support. Have a nice day!

Best

I’m glad to hear it’s sorted! A couple of things could delay data showing up in Superset:

  1. Events need to happen to drive the pipeline, so on dev systems where nothing is happening the batching mechanism can get stuck waiting for a new event to force it to evaluate whether to send the events in the queue (it wants either batch size or a certain amount of time, but can’t actually evaluate either until another event comes in).
  2. ClickHouse itself will only process data for the materialized views that power the charts when it feels like it. According to the docs there are no guarantees about timing, but I’ve rarely seen it take over a minute.
  3. Sometimes Superset can cache results for a few minutes. There is a little “refresh” button on the chart menus that will override that.

My guess for this is that it’s #1 though.
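The “stuck batch” behaviour in #1 can be sketched with a toy model (this is NOT the real event-routing-backends code; class and parameter names are invented for illustration). The key point is that the flush conditions are only evaluated when a new event arrives:

```python
class StuckableBatcher:
    """Toy model: flush conditions (batch size OR max age) are only
    checked when a new event comes in, so an old-but-quiet queue
    can sit unsent indefinitely."""

    def __init__(self, batch_size=3, max_age_seconds=10.0):
        self.batch_size = batch_size
        self.max_age = max_age_seconds
        self.queue = []
        self.oldest = None   # timestamp of the first queued event
        self.flushed = []    # batches that were actually sent

    def receive(self, event, now):
        if self.oldest is None:
            self.oldest = now
        self.queue.append(event)
        # Flush conditions are only evaluated HERE, on arrival:
        if len(self.queue) >= self.batch_size or now - self.oldest >= self.max_age:
            self.flushed.append(self.queue)
            self.queue = []
            self.oldest = None

b = StuckableBatcher(batch_size=3, max_age_seconds=10.0)
b.receive("problem_check", now=0.0)
b.receive("play_video", now=1.0)
# Two events now sit in the queue. Even hours later nothing flushes,
# because nothing re-checks the age... until a third event arrives:
b.receive("page_view", now=7200.0)
print(b.flushed)  # [['problem_check', 'play_video', 'page_view']]
```

On a quiet dev system, simply generating one more event in the LMS plays the role of that third `receive` call and unsticks the queue.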

The next Aspects release should still be compatible back to at least Sumac (v19), and probably Redwood with in-context metrics turned off.