The contentstore XBlock PATCH returns HTTP status 400 Bad Request in CMS

Dear community,

We are facing issues using the PATCH method of the content store in the CMS. The request we are using is shown below.
We receive a 400 Bad Request response, with no error showing up in Tutor.

Please help us resolve this. Let us know if any more inputs are required from our side.

curl -X PATCH -v \
  -H 'accept: application/json' \
  -H 'Content-Type: multipart/form-data' \
  -H 'X-CSRFToken: token' \
  -H 'Authorization: JWT auth string' \
  -F parent_locator="some_locator" \
  -F display_name="a name" \
  -F category="chapter" \
  -F "move_source_locator=locator" \
  -F "target_index=3" \
  -F data=@"./file-1.zip" \
  https://openedx.com/api/contentstore/v1/xblock/{courseId}/{user_string} \
  -w "%{http_code}\n"

Hi :slight_smile:

What are you trying to do by calling this API? What is file-1.zip?

Unfortunately the REST API doesn’t always produce useful error messages, so sometimes it’s necessary to modify the code to see what the actual error is. But we might be able to guess if you can explain the intention here a bit more.

Hi @braden :smile: ,

Thanks for your quick response. We are trying to upload content to the units; see the attached screenshot of the web page showing what we want to do through the sandbox. The file-1.zip is a SCORM package containing the content that should appear in the black rectangle shown in the screenshot.

For this we are trying to use the PATCH method of the xblock endpoint. When we check the logs with tutor local logs cms, there is no error description or error code there either; it only shows the status code of the HTTP request.

2024-02-13 07:46:47,308 INFO 16 [tracking] [user None] [ip 59.144.164.252] logger.py:41 - {"name": "/api/contentstore/v1/xblock/course-v1:abc/block-v1:abc+type@scorm+block@abc", "context": {"user_id": null, "path": "/api/contentstore/v1/xblock/course-v1:abc/block-v1:abc+type@scorm+block@abc", "course_id": "", "org_id": "", "enterprise_uuid": ""}, "username": "", "session": "", "ip": "abc", "agent": "curl/7.68.0", "host": "studio.openedx.com", "referer": "", "accept_language": "", "event": "{\"GET\": {}, \"POST\": {}}", "time": "2024-02-13T07:46:47.308077+00:00", "event_type": "/api/contentstore/v1/xblock/course-v1:abc/block-v1:abc+type@scorm+block@abc", "event_source": "server", "page": null}
[pid: 16|app: 0|req: 305/1215] 172.18.0.3 () {46 vars in 1685 bytes} [Tue Feb 13 07:46:47 2024] PATCH /api/contentstore/v1/xblock/course-v1%3Aabc/block-v1%3Aabc%2Btype%40scorm%2Bblock%abc => generated 0 bytes in 10503 msecs (HTTP/1.1 400) 7 headers in 458 bytes (540 switches on core 0)

Please correct us if we are sending the wrong curl request to test this. What is expected in this request, and why do we receive the Bad Request error?

Please let us know what is missing from our PATCH request for uploading SCORM content, or if you need any more input from us.

This is what we are trying to do. There is a page under Section → Sub-section → Unit → Different boxes with content (DBWC).
There is an edit button above each DBWC box. When we click it, we get a form with the fields listed below. We want an API that lets us fill in this form programmatically.

Display name
Upload .zip package
Scored
Weight
Display width (px)
Display height (px)

A screenshot of what we need the API for is available here: Google Photos

Hi @braden

Sorry to bother you.
Is there any update on this? How do we update XBlock SCORM content in Open edX?

Perhaps you are missing the CSRF header?

Is the request you’re making the same as the one that appears in the network log when you upload the file manually?

I found the issue with the request we were sending. We had taken this request from the Open edX Swagger documentation, and the file itself was the wrong parameter for it.

Do you suggest any other workaround for uploading XBlocks to the subunits?
I cannot find any request that uploads a SCORM module in the Swagger documentation.
Do you think reusing the curl command from the network log would be a good way to build the upload in code?

@braden do you see any workaround for uploading a SCORM .zip to the units? How can we do it in such a way that, when users view the course, we still get the analytics?

@Sameer_Sawant What’s your exact use case? The Studio API at the moment isn’t really designed for you to do things like upload a SCORM .zip file using REST. However, work is taking place to improve that; I’m not sure of the current status.

It’s probably much easier and more efficient to use a script to generate a file of the entire course, including all the SCORM .zip files, and then import that course file into Studio, rather than using the REST API. To do this, create a sample course manually, including a couple of SCORM components with uploaded zip files, then export it, decompress that file, modify the resulting folders/files with a script to create all the SCORM components you want, then compress it and import it back into Studio.
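
To make that concrete, here is a minimal sketch of the "export, modify with a script, re-import" idea in Python. Everything in it is an assumption about how your particular export is laid out (the scorm/ and vertical/ directory names, the display_name and url_name attributes, the file names, and whether the uploaded .zip appears in the export at all), so treat it as a starting point and compare it against a real export from your instance:

import shutil
import tarfile
from pathlib import Path
from xml.etree import ElementTree as ET

# Hypothetical paths -- adjust them to your own export.
EXPORT = Path("sample-course.tar.gz")
WORKDIR = Path("course-workdir")

# 1. Unpack the course export produced by Studio.
shutil.unpack_archive(EXPORT, WORKDIR)
course_root = next(WORKDIR.iterdir())  # the export has a single top-level folder

# 2. Use the SCORM component you created manually in Studio as a template.
#    The directory name ("scorm") and the attributes below are assumptions --
#    check your own export to see how the component was actually serialized,
#    and whether the uploaded .zip is included alongside it.
template = next((course_root / "scorm").glob("*.xml"))
new_block_id = "scorm_generated_1"
tree = ET.parse(template)
tree.getroot().set("display_name", "Generated SCORM component")
tree.write(course_root / "scorm" / f"{new_block_id}.xml")

# 3. Reference the new component from the vertical (unit) that should contain it.
vertical_file = next((course_root / "vertical").glob("*.xml"))
vertical = ET.parse(vertical_file)
ET.SubElement(vertical.getroot(), "scorm", {"url_name": new_block_id})
vertical.write(vertical_file)

# 4. Repack everything into a .tar.gz that Studio's course import accepts.
with tarfile.open("generated-course.tar.gz", "w:gz") as tar:
    tar.add(str(course_root), arcname=course_root.name)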

Otherwise, yeah, you can definitely check the network log while manually uploading a file then try to re-create it using curl. But I’m not sure if that same endpoint will work as an API the same way it works when used in the browser.

@braden I appreciate your response, and thank you very much for replying.

Our use case is that we want to upload the course content programmatically, without losing the analytics that Tutor provides when users view the content.

So, according to you, there is no way to do this programmatically, is that right?

The analytics will be the same regardless of how you upload the content.

I’m not sure if it’s currently possible to upload SCORM component .zip files using a REST API. Maybe it is, maybe it isn’t. I’m sure there’s a way to do it using the Python API, and I’m sure there’s a way to automate it by creating the course .tar.gz file, including all the SCORM component .zip files, and importing that. So there are definitely two “programmatic” ways to do it.

The question is just whether you need to do it while the course is running. If so, a REST API is probably the way to go, and you’ll have to investigate in more detail or customize the platform to make the REST API work properly. If you don’t need to upload SCORM objects while the course is running, then why not write some scripts to generate the course .tar.gz and import it? That’s still a programmatic way to create the content.
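
If you do want to drive the import step itself from code, one option is the Studio course import API. The sketch below assumes your release exposes it at /api/courses/v0/import/<course_id>/ and accepts the tarball in a "course_data" form field -- verify both against your instance's API docs or the network tab during a manual import:

import requests

# Placeholders -- substitute your own Studio URL, course id, and credentials.
STUDIO = "https://studio.openedx.com"
COURSE_ID = "course-v1:abc+abc+abc"
JWT = "your JWT auth string"

# Assumption: the import endpoint takes the tarball as "course_data" and
# starts an asynchronous import task.
with open("generated-course.tar.gz", "rb") as tarball:
    resp = requests.post(
        f"{STUDIO}/api/courses/v0/import/{COURSE_ID}/",
        headers={"Authorization": f"JWT {JWT}"},
        files={"course_data": tarball},
        timeout=300,
    )

resp.raise_for_status()
print(resp.json())  # typically a task id you can poll with a GET on the same URL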

@braden totally understood your point. Thanks for confirming that the analytics will work regardless of how we upload the content.

Can you please point us to a repository, blog post, or any other online information we can refer to? Sorry, but we are finding it difficult to locate proper documentation for this.

Unfortunately I’m not aware of any documentation on that exact topic. (That doesn’t mean it doesn’t exist; maybe I’m just not aware of it.) Most likely you need to look at the code, and the Network inspector, and figure it out.

From looking at the code, it seems that your original attempt to call PATCH /api/contentstore/v1/xblock/course-v1%3Aabc/block-v1%3Aabc%2Btype%40scorm%2Bblock%abc is wrong for a couple of reasons:

  1. The xblock/:course_id/:block_id endpoint is in v0, not v1.
  2. The PATCH method is only used for moving an XBlock, not updating it. This is of course unusual but that’s how it works. (A rough sketch of what a move request looks like is below.)
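
Since your original curl already carried move_source_locator, parent_locator, and target_index, a move request would look roughly like this: a small JSON body, with no file attached. The endpoint path and payload shape here are assumptions on my part, so confirm them against the code for your release:

import requests

# Placeholders -- use your own Studio URL, locators, and credentials.
STUDIO = "https://studio.openedx.com"
COURSE_ID = "course-v1:abc+abc+abc"
BLOCK_ID = "block-v1:abc+type@scorm+block@abc"
JWT = "your JWT auth string"

resp = requests.patch(
    f"{STUDIO}/api/contentstore/v0/xblock/{COURSE_ID}/{BLOCK_ID}",
    headers={"Authorization": f"JWT {JWT}"},
    json={
        # Same fields as in the original curl, but sent as JSON and without a file.
        "parent_locator": "block-v1:abc+type@vertical+block@abc",
        "move_source_locator": BLOCK_ID,
        "target_index": 3,
    },
    timeout=30,
)
print(resp.status_code, resp.text)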

In order to figure out the correct REST API endpoint, the easiest way is to do it manually in your browser, then watch the network tab to see what the request is. I suspect it will be a POST method to the v0 endpoint like POST /api/contentstore/v0/xblock/course-v1:.../block-v1:... but it may also be POST /xblock/block-v1:.../handler/something/ if a handler API is what uploads the .zip file.
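
If you go the network-tab route, a rough way to replay the captured request from code looks like the sketch below. Every value in it (the URL, the cookie names, the form field name) is a placeholder you copy from your own browser session rather than a documented API:

import requests

# Placeholders -- copy the real values from your browser's network tab and
# cookie store; cookie names can differ between deployments.
URL = "https://studio.openedx.com/PASTE-THE-PATH-FROM-THE-NETWORK-TAB"
SESSION_COOKIE = "paste the sessionid cookie value"
CSRF_TOKEN = "paste the csrftoken cookie value"

with open("./file-1.zip", "rb") as zip_file:
    resp = requests.post(
        URL,
        cookies={"sessionid": SESSION_COOKIE, "csrftoken": CSRF_TOKEN},
        headers={
            "X-CSRFToken": CSRF_TOKEN,
            # Django's CSRF protection over HTTPS also checks the Referer header.
            "Referer": "https://studio.openedx.com/",
        },
        # The form field name ("file") is a guess -- use whatever field name
        # appears in the captured request payload.
        files={"file": zip_file},
        timeout=120,
    )

print(resp.status_code, resp.text)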