Does anyone here use the OpenAI Responses API?
Date: August 16th, 2025 1:53 PM Author: Aromatic stock car
This is one reason why Gemini will win in the end
It's the only one where you can start out by dumping a huge load of shit on it and then immediately ask extremely specific questions
GPT can do it too but is extremely slow and gets slower and more retarded as more info is accumulated in the same chat
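rough sketch of that pattern with the OpenAI python SDK, for anyone who hasn't tried the Responses API (model name and prompt strings here are placeholders, not anything I actually ran):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1) dump the whole pile of context up front in one request
first = client.responses.create(
    model="gpt-4.1-mini",  # placeholder model name
    input="<paste the entire context dump here>",
)

# 2) then immediately ask an extremely specific question against that same context,
#    chaining onto the stored response instead of resending everything
followup = client.responses.create(
    model="gpt-4.1-mini",
    previous_response_id=first.id,
    input="<one extremely specific question about the dump above>",
)
print(followup.output_text)

point being the big dump only has to be sent once; the follow-ups just point at first.id instead of replaying the whole chat every turn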
(http://www.autoadmit.com/thread.php?thread_id=5763083&forum_id=2,#49190283)
Date: August 16th, 2025 1:55 PM Author: Crawly brunch
im doing it with a chunked pipeline right now where it summarizes segments at a time and then dumps them into a final extraction, and it's still taking forever, it's been plodding through this for like 10 min now:
tokens≈21975 (inline_limit 120000)
[route] attempting file-attachment path
[route] attachments unsupported here; using chunked pipeline
[map] summarizing segment 1
[map] summarizing segment 2
slow as fuck, it's on segment 4 now, 5 min later
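for the curious, the slow part is basically this shape. rough reconstruction in python, not my actual script (the model name, the chars-per-token guess, the chunk size, and the helper names are all made up for illustration):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

MODEL = "gpt-4.1-mini"   # placeholder model name
INLINE_LIMIT = 120_000   # the inline_limit from the log above
CHUNK_CHARS = 20_000     # crude segment size, in characters

def estimate_tokens(text: str) -> int:
    # very rough heuristic: ~4 characters per token
    return len(text) // 4

def summarize_segment(n: int, segment: str) -> str:
    print(f"[map] summarizing segment {n}")
    resp = client.responses.create(
        model=MODEL,
        input="Summarize this segment, keeping every name, date, and number:\n\n" + segment,
    )
    return resp.output_text

def extract(doc: str, question: str) -> str:
    # the real script apparently tries a file-attachment route first and only
    # falls back to this chunked map -> reduce path; that routing is omitted here
    print(f"tokens~{estimate_tokens(doc)} (inline_limit {INLINE_LIMIT})")
    segments = [doc[i:i + CHUNK_CHARS] for i in range(0, len(doc), CHUNK_CHARS)]
    summaries = [summarize_segment(n + 1, seg) for n, seg in enumerate(segments)]
    final_input = "Segment summaries:\n\n" + "\n\n".join(summaries) + "\n\n" + question
    resp = client.responses.create(model=MODEL, input=final_input)
    return resp.output_text

part of why it plods is that the [map] calls run strictly one after another; firing the segment summaries concurrently would cut the wall-clock time way down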
(http://www.autoadmit.com/thread.php?thread_id=5763083&forum_id=2,#49190287)