
fix: use max_completion_tokens for reasoning models (o1/o3/o4/gpt-5)#544

Merged

di-sukharev merged 4 commits into di-sukharev:master from majiayu000:fix/issue-529-max-completion-tokens on Mar 31, 2026

Conversation

@majiayu000
Contributor

Fixes #529

Problem

Newer OpenAI models (o1, o3, o4, gpt-5 series) reject the max_tokens parameter and require max_completion_tokens instead. These reasoning models also don't support temperature and top_p parameters.

Changes

In src/engine/openAi.ts, added model detection (/^(o[1-9]|gpt-5)/) to conditionally:

  • Use max_completion_tokens instead of max_tokens for reasoning models
  • Omit temperature and top_p for reasoning models (they don't support these)
  • Keep existing behavior for all other models

The detection pattern is consistent with the existing pattern in src/utils/modelCache.ts.
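The conditional parameter logic described above can be sketched roughly as follows. This is a sketch of the approach, not the literal openAi.ts code: the `buildParams` helper name and its input shape are assumptions for illustration; only the regex and the parameter names (`max_tokens`, `max_completion_tokens`, `temperature`, `top_p`) come from the PR description.

```typescript
// Sketch of the conditional parameter selection (assumed shape; the
// actual code in src/engine/openAi.ts may differ in detail).
const REASONING_MODEL_RE = /^(o[1-9]|gpt-5)/;

interface ParamsIn {
  model: string;
  maxTokens: number;
  temperature: number;
  topP: number;
}

function buildParams({ model, maxTokens, temperature, topP }: ParamsIn) {
  if (REASONING_MODEL_RE.test(model)) {
    // Reasoning models: use max_completion_tokens and omit the
    // unsupported sampling parameters entirely.
    return { model, max_completion_tokens: maxTokens };
  }
  // All other models keep the existing behavior.
  return { model, max_tokens: maxTokens, temperature, top_p: topP };
}
```

The key design point is omitting `temperature`/`top_p` rather than setting them to defaults, since reasoning models reject the parameters outright.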

@majiayu000 majiayu000 marked this pull request as ready for review March 21, 2026 02:53
@di-sukharev
Owner

@majiayu000 thank you for the contribution, please fix the tests and I'll merge it 🙏

@majiayu000
Contributor Author

Fixed. Two issues: chalk was missing from Jest's transformIgnorePatterns (ESM import error), and gemini.test.ts had a wrong mock path (../src -> ../../src). All 4 unit test suites pass now.
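The transformIgnorePatterns fix mentioned here would look roughly like this. A minimal sketch, assuming a typical Jest config layout; the repo's actual config file and pattern may differ:

```typescript
// jest.config.ts — sketch of the transformIgnorePatterns fix
// (assumed config shape; opencommit's actual config may differ).
import type { Config } from 'jest';

const config: Config = {
  // By default Jest does not transform anything under node_modules.
  // chalk ships as ESM-only, so it must be exempted from the ignore
  // pattern (via the negative lookahead) so the transformer can
  // compile it to something Jest's CJS runtime can load.
  transformIgnorePatterns: ['node_modules/(?!chalk)'],
};

export default config;
```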

@di-sukharev
Owner

di-sukharev commented Mar 29, 2026

@majiayu000 one test is still failing at https://github.com/di-sukharev/opencommit/actions/runs/23655960709/job/69058677428?pr=544

FAIL test/unit/gemini.test.ts (15.153 s)
  ● Gemini › should generate commit message

    Cannot find module '../../src/commands/config' from 'test/jest-setup.ts'

      34 |
      35 |     jest.mock('@google/generative-ai');
    > 36 |     jest.mock('../../src/commands/config');
         |          ^
      37 |
      38 |     jest.mock('@clack/prompts', () => ({
      39 |       intro: jest.fn(),

      at Resolver._throwModNotFoundError (node_modules/jest-resolve/build/resolver.js:427:11)
      at Object.<anonymous> (test/unit/gemini.test.ts:36:10)


  ● Test suite failed to run

    TypeError: Cannot read properties of undefined (reading 'mockRestore')

      60 |
      61 |   afterAll(() => {
    > 62 |     mockExit.mockRestore();
         |              ^
      63 |     process.env = oldEnv;
      64 |   });
      65 |

      at Object.<anonymous> (test/unit/gemini.test.ts:62:14)

PASS test/unit/config.test.ts (15.17 s)

@di-sukharev
Owner

You can resolve this conflict however is easiest for you. We regenerate the /out folder on every npm version release anyway, so feel free to ignore it or resolve it however you want; it doesn't really matter.

Newer OpenAI models (o1, o3, o4, gpt-5 series) reject the max_tokens
parameter and require max_completion_tokens instead. These reasoning
models also do not support temperature and top_p parameters.

Conditionally set the correct token parameter and omit unsupported
sampling parameters based on the model name.

Fixes di-sukharev#529

Signed-off-by: majiayu000 <1835304752@qq.com>
Remove Record<string, unknown> type annotation to let TypeScript infer
the params object type, preserving type checking on all properties.
Cast to ChatCompletionCreateParamsNonStreaming at the create() call site
to accommodate the SDK's missing max_completion_tokens type. Add unit
test for reasoning model detection regex.

Signed-off-by: majiayu000 <1835304752@qq.com>
- Add chalk to jest transformIgnorePatterns so ESM chalk import works
- Fix wrong mock path in gemini.test.ts (../src -> ../../src)

Signed-off-by: majiayu000 <1835304752@qq.com>
…ier formatting

Signed-off-by: majiayu000 <1835304752@qq.com>
@majiayu000 majiayu000 force-pushed the fix/issue-529-max-completion-tokens branch from d4ecc49 to f74ba2d on March 29, 2026 17:50
@majiayu000
Contributor Author

@di-sukharev Rebased on master and fixed all CI failures:

  1. gemini.test.ts — reverted the mock path back to ../src/commands/config (Jest resolves it relative to test/jest-setup.ts, not the test file itself)
  2. prettier — fixed indentation in openAi.ts
  3. /out conflict — removed the conflicting out/ files since they'll be regenerated on release

All three should be green now 🙏

@di-sukharev
Owner

thank you @majiayu000

@di-sukharev di-sukharev merged commit 9923dab into di-sukharev:master Mar 31, 2026
6 checks passed


Development

Successfully merging this pull request may close these issues.

[Bug]: gpt-5-nano param mismatch
