Bug 39506 - Have Koha entering the 21st century through AI
Summary: Have Koha entering the 21st century through AI
Status: CLOSED INVALID
Alias: None
Product: Koha
Classification: Unclassified
Component: About
Version: Main
Hardware: All (OS: All)
Importance: P5 - low, new feature
Assignee: Baptiste Wojtkowski (bwoj)
QA Contact: Testopia
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2025-04-01 11:37 UTC by Baptiste Wojtkowski (bwoj)
Modified: 2025-12-15 19:57 UTC
CC List: 4 users

See Also:
GIT URL:
Initiative type: ---
Sponsorship status: ---
Comma delimited list of Sponsors:
Crowdfunding goal: 0
Patch complexity: Trivial patch
Documentation contact:
Documentation submission:
Text to go in the release notes:
Version(s) released in:
Circulation function:


Attachments
Bug 39506: Have Koha entering the 21st century through LLMs (5.96 KB, patch)
2025-04-01 11:38 UTC, Baptiste Wojtkowski (bwoj)
Bug 39506: Have Koha entering the 21st century through LLMs (6.04 KB, patch)
2025-04-01 11:41 UTC, Fridolin Somers
Bug 39506: Have Koha entering the 21st century through LLMs (6.10 KB, patch)
2025-04-01 12:30 UTC, Arthur Suzuki
Bug 39506: Have Koha entering the 21st century through LLMs (6.14 KB, patch)
2025-04-01 13:03 UTC, Felicie
Bug 39506: Have Koha entering the 21st century through LLMs (6.20 KB, patch)
2025-04-01 15:18 UTC, Victor Grousset/tuxayo
Bug 39506: Have Koha entering the 21st century through LLMs (6.25 KB, patch)
2025-04-02 08:01 UTC, Amaury GAU

Description Baptiste Wojtkowski (bwoj) 2025-04-01 11:37:46 UTC
It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page
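
The patch itself only exists as the attachments above; as a purely illustrative sketch of the idea, something similar could be hacked together with jQuery (already bundled with the Koha staff interface) in the IntranetUserJS system preference. The body id, button wording, and advice list below are assumptions made for this sketch and are not taken from the actual patch.

// Illustrative sketch only -- this is NOT the code from the attached patch.
// Intended for the IntranetUserJS system preference of the staff interface.
$(document).ready(function () {
    // Only run on the staff interface main page; the body id is an assumption.
    if (!$("body#main_intranet-main").length) {
        return;
    }

    // The distilled "wise knowledge": a fixed list, no GPU required.
    var advice = [
        "Empty the book drop before it empties itself.",
        "Run the overdue notices, then reward yourself with a coffee.",
        "Weed one shelf today; future you will say thank you."
    ];

    // One button, one piece of advice per click.
    var button = $('<button type="button" class="btn btn-default">What should I do now?</button>');
    button.on("click", function () {
        alert(advice[Math.floor(Math.random() * advice.length)]);
    });

    // Drop it at the top of the page; placement is illustrative only.
    $("body").prepend(button);
});

Clicking the button then pops up one of the canned suggestions, which is about as close to an on-premise LLM as a few kilobytes of patch can get.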
Comment 1 Baptiste Wojtkowski (bwoj) 2025-04-01 11:38:28 UTC
Created attachment 180133
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page
Comment 2 Fridolin Somers 2025-04-01 11:41:47 UTC
Created attachment 180134
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page

Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Comment 3 Arthur Suzuki 2025-04-01 12:30:54 UTC
Created attachment 180152
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page

Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Arthur Suzuki <arthur.suzuki@biblibre.com>
Comment 4 Felicie 2025-04-01 13:03:30 UTC
Created attachment 180179
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page

Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Arthur Suzuki <arthur.suzuki@biblibre.com>
Signed-off-by: Felicie <felicie.thiery@biblibre.com>
Comment 5 Victor Grousset/tuxayo 2025-04-01 15:18:41 UTC
Created attachment 180211
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page

Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Arthur Suzuki <arthur.suzuki@biblibre.com>
Signed-off-by: Felicie <felicie.thiery@biblibre.com>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Comment 6 Victor Grousset/tuxayo 2025-04-01 15:19:03 UTC
Works, makes sense, QA script happy, code looks good, passing QA :)
Comment 7 David Cook 2025-04-01 23:35:03 UTC
+1
Comment 8 Amaury GAU 2025-04-02 08:01:14 UTC
Created attachment 180247
Bug 39506: Have Koha entering the 21st century through LLMs

It is well known that LLMs are the most important solution for humanity,
and therefore for librarians.
As everyone knows, it is often very difficult to stay organized on a busy
day. That is why we need a tool that can tell us what to do and when to do it.
To this end, we developed this patch as the result of a long and hard
training process. We used two learning models with about 10^11 neurons,
trained for decades on the ultimate questions of books, libraries, and
integrated library systems (SIGB). They were specifically trained on Koha
35 hours a week, for durations ranging from a few months to several years.

However, running LLMs in real time can be very expensive in terms of memory
and processing. That is why we synthesized their wise knowledge into this
easy-to-use patch.

TEST PLAN:
1 - Apply patch
2 - Click on the button on the main page

Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Arthur Suzuki <arthur.suzuki@biblibre.com>
Signed-off-by: Felicie <felicie.thiery@biblibre.com>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Amaury GAU <amaurygau@gmail.com>
Comment 9 Brendan Lawlor 2025-04-03 19:55:14 UTC
It is inspiring what you have managed to achieve in fewer than 100 lines of code!

In consideration of recent global events and trends, I've also begun working on a development to prepare Koha for the 21st century: making Koha run on Collapse OS.
https://collapseos.org/

It's a big undertaking but I hope to submit a patch in about one year's time :)
Comment 10 Baptiste Wojtkowski (bwoj) 2025-04-04 07:31:31 UTC
Hi,
Thank you all for your interest in this patch. Unfortunately, I found some issues with this AI (AIs are very complex and hard to predict), so I have to remove this patch from Bugzilla. I may come back with a better version, but not in the coming days.