Bug 26802 - Improve speed with records with many items
Summary: Improve speed with records with many items
Status: NEW
Alias: None
Product: Koha
Classification: Unclassified
Component: Searching
Version: unspecified
Hardware: All
OS: All
Importance: P5 - low major
Assignee: Bugs List
QA Contact: Testopia
URL:
Keywords:
Duplicates: 31541
Depends on: 33167
Blocks: 33746
Reported: 2020-10-23 16:14 UTC by Vitor Fernandes
Modified: 2023-08-10 16:33 UTC
CC List: 8 users

See Also:
Change sponsored?: ---
Patch complexity: ---
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:


Description Vitor Fernandes 2020-10-23 16:14:48 UTC
Koha has a performance problem displaying records with many items in search results and on detail pages.

For example, Koha takes around 20-30 seconds to process the XML of a record with 1000 items using XSLT. Koha's response time degrades even further as the item count grows.

Maybe we can improve Koha's speed with some changes:

- Don't use the complete XML for XSLT processing (use the record without items).
- Add pagination to the items table (as is done with other Koha tables) and load it via JS.


Test case (search results):

- Add some records with 1000 items and the same title
- Do a search for the title

Test case (details and other record views):

- Add one record with 1000/2000/3000 items
- Try to open the record
Comment 1 Fridolin Somers 2021-04-07 13:53:21 UTC
+1
Comment 2 Björn Nylén 2021-04-28 22:10:07 UTC
We've done some profiling to figure out where the delay is. The problem is mainly that, during XSLT processing, an authorised value description is looked up in the database for every item column. Caching the values will have a big effect.
Branch and item type names are also looked up per item and, from what we've seen, will slow down the detail pages.
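
To make the caching idea concrete, here is a minimal sketch (illustrative TypeScript rather than Koha's actual Perl; the lookup callback is a hypothetical stand-in for the real database query):

// Memoise authorised value description lookups so each distinct
// category/code pair hits the database once instead of once per item.
const descriptionCache = new Map<string, string>();

async function cachedDescription(
  category: string,
  code: string,
  lookup: (category: string, code: string) => Promise<string>, // hypothetical DB query
): Promise<string> {
  const key = `${category}:${code}`;
  const cached = descriptionCache.get(key);
  if (cached !== undefined) return cached;
  const description = await lookup(category, code);
  descriptionCache.set(key, description);
  return description;
}

With 1000 items but only a handful of distinct codes, this turns thousands of queries into a handful.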
Comment 3 Fridolin Somers 2022-06-08 06:26:10 UTC
> - Don't use the complete XML (record without items) for XLSTs processing.

There has been a lot of work on this:
Bug 23407
Bug 28373

And WIP Bug 28371
Comment 4 Fridolin Somers 2022-06-08 06:26:57 UTC
> - Add pagination to the items table (like is done with other Koha tables) and JS load.
I suggest this bug focus on this point
Comment 5 David Nind 2022-07-12 11:48:18 UTC
(In reply to Fridolin Somers from comment #4)
> > - Add pagination to the items table (like is done with other Koha tables) and JS load.
> I suggest this bug focus on this point

+1
Comment 6 David Cook 2022-09-09 03:47:15 UTC
*** Bug 31541 has been marked as a duplicate of this bug. ***
Comment 7 David Cook 2022-09-09 03:48:48 UTC
(In reply to Björn Nylén from comment #2)
> We've done some profiling to figure out where the delay is. The problem is
> mainly that, during XSLT processing, an authorised value description is
> looked up in the database for every item column. Caching the values will
> have a big effect.
> Branch and item type names are also looked up per item and, from what
> we've seen, will slow down the detail pages.

I've noticed other issues in the XSLT processing as well, especially in C4::XSLT::buildKohaItemsNamespace().

It does a huge number of database calls, and when you have thousands of items, it's just going to be slow no matter what.
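
A related option is batching: gather the distinct codes across all items first and resolve them in a single query. A rough sketch (TypeScript for illustration only; buildKohaItemsNamespace itself is Perl, and batchLookup is hypothetical):

// Collect the distinct codes used across all items, then resolve them
// with one grouped query instead of one query per item.
async function describeItems(
  items: { notForLoanCode: string }[],
  batchLookup: (codes: string[]) => Promise<Map<string, string>>, // e.g. one WHERE code IN (...) query
): Promise<string[]> {
  const codes = [...new Set(items.map((i) => i.notForLoanCode))];
  const descriptions = await batchLookup(codes);
  return items.map((i) => descriptions.get(i.notForLoanCode) ?? "");
}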
Comment 8 David Cook 2022-09-09 04:03:53 UTC
(In reply to David Nind from comment #5)
> (In reply to Fridolin Somers from comment #4)
> > > - Add pagination to the items table (like is done with other Koha tables) and JS load.
> > I suggest this bug focus on this point
> 
> +1

This could have other consequences, but I think it's absolutely necessary at this point.
Comment 9 David Cook 2022-09-09 04:07:20 UTC
(In reply to David Cook from comment #7)
> I've noticed other issues in the XSLT processing as well, especially in
> C4::XSLT::buildKohaItemsNamespace().
> 
> It does a huge number of database calls, and when you have thousands of
> items, it's just going to be slow no matter what.

That said, after I removed it, apparently I only gained 18 seconds. (My search took 2.7 minutes instead of 3 minutes.)

I think no matter how you slice it, ultimately we need to fetch an item count, and then fetch just 1 page of items at a time...
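
A rough sketch of that count-then-page approach (TypeScript; the endpoint path, the _page/_per_page parameters and the X-Total-Count header are assumptions about Koha's REST API that would need checking):

// Fetch the total item count plus a single page of items for a biblio.
async function fetchItemsPage(biblioId: number, page: number, perPage = 50) {
  const res = await fetch(
    `/api/v1/biblios/${biblioId}/items?_page=${page}&_per_page=${perPage}`,
  );
  const total = Number(res.headers.get("X-Total-Count") ?? 0); // overall count for the pager
  const items = await res.json();
  return { total, items }; // render this page; fetch the next one on demand
}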
Comment 10 David Cook 2022-09-09 04:42:28 UTC
I think Tomas and Martin know best how to use DataTables with the REST API, so they might be best suited to this one...

Since the Holdings table is defined in a template block, it is actually very easy to provide alternative implementations for the holdings.

In the short term, it might be useful to keep the current Holdings table for all records with fewer than X items.

Then, for records over X, we'd display an alternative holdings table. This might make it easier for the community to accept the change from client-side to server-side processing for the items table, since it would be a more gradual change...
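
A hedged sketch of that threshold idea (jQuery-style DataTables init in TypeScript; the table id, the itemCount/biblioId variables, the cut-over value and the REST URL are all hypothetical, and a real patch would go through Koha's own DataTables wrapper):

declare const $: any; // jQuery with DataTables loaded
declare const itemCount: number;
declare const biblioId: number;

const THRESHOLD_X = 300; // arbitrary cut-over point

$('#holdings_table').DataTable(
  itemCount <= THRESHOLD_X
    ? { paging: false } // current behaviour: all rows rendered client-side
    : {
        serverSide: true, // rows are fetched one page at a time
        processing: true,
        ajax: { url: `/api/v1/biblios/${biblioId}/items`, dataSrc: '' },
        pageLength: 50,
      },
);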
Comment 11 Jonathan Druart 2023-05-03 13:23:28 UTC
(In reply to David Cook from comment #8)
> (In reply to David Nind from comment #5)
> > (In reply to Fridolin Somers from comment #4)
> > > > - Add pagination to the items table (like is done with other Koha tables) and JS load.
> > > I suggest this bug focus on this point
> > 
> > +1
> 
> This could have other consequences but I think it's absolutely necessary at
> this point.

See bug 33568.
Comment 12 Eileen Chandler 2023-08-09 16:44:13 UTC
We still have a lot of print serials at our member libraries. We are a consortium with 38 locations.

We've done a lot of work dividing these by year due to slow response times when opening a periodical record in Circulation (to check out) and when adding a barcoded item in Cataloging for the latest periodical issue.
We have approximately 100 bibliographic records with between 100 and 500 items attached. These are really slow to find and open. In a consortium setting it is reasonable to expect faster response times when searching for and retrieving a bib record with many items attached, and when opening this kind of bib record in Cataloging to attach a barcoded item.