Conversation

i-just (Contributor) commented on Oct 24, 2025

Description

When feeds are queued via the CLI, process them sequentially, so that all batches from the first feed are processed before the second feed starts, and so on.
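
To make the intent concrete, here is a minimal sketch (not the code merged in this PR) of one way the ordering could be expressed, assuming a hypothetical SequentialFeedImport job: each feed's job finishes all of its own batches and only then queues the job for the next feed ID.

<?php
// Illustrative sketch only; not the code merged in this PR.
// SequentialFeedImport, $remainingFeedIds and processAllBatchesForFeed()
// are hypothetical names used to show the intended ordering.

use craft\queue\BaseJob;

class SequentialFeedImport extends BaseJob
{
    public int $feedId;

    /** @var int[] IDs of feeds that should run after this one, in order */
    public array $remainingFeedIds = [];

    public function execute($queue): void
    {
        // Process every batch for this feed before anything else is queued.
        $this->processAllBatchesForFeed($this->feedId);

        // Only once this feed is fully done, queue the next feed in line.
        if (!empty($this->remainingFeedIds)) {
            $nextFeedId = array_shift($this->remainingFeedIds);
            \Craft::$app->getQueue()->push(new self([
                'feedId' => $nextFeedId,
                'remainingFeedIds' => $this->remainingFeedIds,
            ]));
        }
    }

    protected function defaultDescription(): ?string
    {
        return "Importing feed #{$this->feedId}";
    }

    private function processAllBatchesForFeed(int $feedId): void
    {
        // Stand-in for Feed Me's real batch processing logic.
    }
}

In practice the existing FeedImport job and the feed-me/feeds/queue controller carry the feed and batch state, so treat the class above purely as an illustration of the ordering guarantee.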

Testing steps:

  • create a (local) filesystem and then a volume that uses it
  • create an Assets field
  • create a section with an entry type that uses the assets field
  • create first feed (feed1) that imports assets into the volume (example data below)
  • create a second feed (feed2) that imports entries into the section; the entries rely on assets already having been imported (example data below)
  • for ease of testing, so that you don’t have to create feeds with over 100 items, you can set public int $batchSize = 1; in src/queue/jobs/FeedImport.php (see the snippet after this list)
  • run the craft feed-me/feeds/queue 1,2 command (where 1 and 2 correspond to the actual IDs of feed1 and feed2 respectively)
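
For reference, the temporary change mentioned in the batch-size step is just the property override below (testing only; revert it afterwards):

// src/queue/jobs/FeedImport.php (temporary, for local testing only)
public int $batchSize = 1;

With a batch size of 1, every item becomes its own batch, which makes it easy to see in the queue that all of feed1’s jobs run before feed2’s.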

The outcome should be that all the assets are processed first, before we move on to the entries.

feed1 example data:

{
  "assets":[
    {
      "url": "<URL>",
      "filename": "img1.jpg"
    },
    {
      "url": "<URL>",
      "filename": "img2.jpg"
    }
  ]
}

feed2 example data:

{
  "entries": [
    {
      "title": "entry 1",
      "image": "img2.jpg"
    },
    {
      "title": "entry 2",
      "image": "img1.jpg"
    }
  ]
}

Related issues

#1695

i-just requested a review from angrybrad as a code owner October 24, 2025 13:37
i-just requested review from timkelty and removed the request for angrybrad October 24, 2025 13:37
i-just merged commit 4b28938 into 5.x Oct 24, 2025
6 checks passed
i-just deleted the feature/1695-process-feeds-sequentially-via-cli branch October 24, 2025 14:13