Does anyone here use the OpenAI Responses API?
Date: August 16th, 2025 1:53 PM Author: Dave Prole
This is one reason why Gemini will win in the end
It's the only one you can start out dumping a huge load of shit on and then immediately ask extremely specific questions
GPT can do it too, but it's extremely slow and gets slower and more retarded as more info accumulates in the same chat
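(For context, the Responses API in the thread title supports exactly that "dump once, then drill in" pattern by chaining follow-ups with previous_response_id instead of resending the whole context. A minimal sketch, assuming the official openai Python SDK; the model name, file path, and follow-up question are placeholders, not anything from this thread:

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Dump the big context once. "dump.txt" is just a placeholder path.
big_dump = open("dump.txt").read()
first = client.responses.create(
    model="gpt-4o",  # placeholder model name
    input=f"Here's everything, just acknowledge it for now:\n\n{big_dump}",
)

# Then ask the extremely specific question, chained to the earlier
# response via previous_response_id instead of resending the dump.
followup = client.responses.create(
    model="gpt-4o",
    previous_response_id=first.id,
    input="Now, specifically: what does clause 7(b) say about termination?",
)
print(followup.output_text)

Each follow-up only carries the new question; the prior turns stay server-side on the stored response.)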
(http://www.autoadmit.com/thread.php?thread_id=5763083&forum_id=2#49190283)
Date: August 16th, 2025 1:55 PM Author: The Wandering Mercatores (from the Euphrates to the Forum)
I'm doing it with a chunked pipeline right now, where it summarizes segments at a time and then dumps them into a final extraction, and it's still taking forever; it's been plodding through this for like 10 min now (rough sketch of the pattern below the log):
tokens≈21975 (inline_limit 120000)
[route] attempting file-attachment path
[route] attachments unsupported here; using chunked pipeline
[map] summarizing segment 1
[map] summarizing segment 2
slow as fuck, it's on segment 4 now, 5 min later
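For anyone curious, here's a rough sketch of the map/reduce pattern those [map] lines imply, assuming the openai Python SDK and the Responses API; the chunk size, model name, and prompts are my assumptions, not the actual pipeline:

from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"       # placeholder model name
CHUNK_CHARS = 48_000   # placeholder segment size, not the real inline_limit

def ask(prompt: str) -> str:
    # one Responses API call per prompt, text output only
    return client.responses.create(model=MODEL, input=prompt).output_text

def chunked_extract(text: str, question: str) -> str:
    segments = [text[i:i + CHUNK_CHARS] for i in range(0, len(text), CHUNK_CHARS)]

    # map: summarize each segment independently -- the "[map] summarizing
    # segment N" lines, and the slow part, since each one blocks on an API call
    partials = []
    for n, seg in enumerate(segments, start=1):
        print(f"[map] summarizing segment {n}")
        partials.append(ask(f"Summarize whatever in this segment bears on: {question}\n\n{seg}"))

    # reduce: dump the partial summaries into one final extraction
    return ask(f"Using these segment summaries, answer: {question}\n\n" + "\n\n".join(partials))

Since the segments are independent, the map step could be fired off concurrently (a thread pool or asyncio) instead of one blocking call at a time, which is most of why it plods.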
(http://www.autoadmit.com/thread.php?thread_id=5763083&forum_id=2#49190287)