mirror of https://github.com/QwenLM/qwen-code.git
synced 2026-01-13 12:29:14 +00:00

Compare commits: feature/re... → mingholy/f... (5 commits)

- 996b9df947
- 64291db926
- a8e3b9ebe7
- 299b7de030
- 824ca056a4
@@ -25,7 +25,7 @@ Qwen Code is an open-source AI agent for the terminal, optimized for [Qwen3-Code
 - **OpenAI-compatible, OAuth free tier**: use an OpenAI-compatible API, or sign in with Qwen OAuth to get 2,000 free requests/day.
 - **Open-source, co-evolving**: both the framework and the Qwen3-Coder model are open-source—and they ship and evolve together.
 - **Agentic workflow, feature-rich**: rich built-in tools (Skills, SubAgents, Plan Mode) for a full agentic workflow and a Claude Code-like experience.
-- **Terminal-first, IDE-friendly**: built for developers who live in the command line, with optional integration for VS Code and Zed.
+- **Terminal-first, IDE-friendly**: built for developers who live in the command line, with optional integration for VS Code, Zed, and JetBrains IDEs.

 ## Installation

@@ -137,10 +137,11 @@ Use `-p` to run Qwen Code without the interactive UI—ideal for scripts, automa

 #### IDE integration

-Use Qwen Code inside your editor (VS Code and Zed):
+Use Qwen Code inside your editor (VS Code, Zed, and JetBrains IDEs):

 - [Use in VS Code](https://qwenlm.github.io/qwen-code-docs/en/users/integration-vscode/)
 - [Use in Zed](https://qwenlm.github.io/qwen-code-docs/en/users/integration-zed/)
+- [Use in JetBrains IDEs](https://qwenlm.github.io/qwen-code-docs/en/users/integration-jetbrains/)

 #### TypeScript SDK

@@ -12,6 +12,7 @@ export default {
   },
   'integration-vscode': 'Visual Studio Code',
   'integration-zed': 'Zed IDE',
+  'integration-jetbrains': 'JetBrains IDEs',
   'integration-github-action': 'Github Actions',
   'Code with Qwen Code': {
     type: 'separator',
BIN  docs/users/images/jetbrains-acp.png   (new file; binary not shown, 36 KiB)
 57  docs/users/integration-jetbrains.md   (new file)
@@ -0,0 +1,57 @@
# JetBrains IDEs

> JetBrains IDEs provide native support for AI coding assistants through the Agent Client Protocol (ACP). This integration allows you to use Qwen Code directly within your JetBrains IDE with real-time code suggestions.

### Features

- **Native agent experience**: Integrated AI assistant panel within your JetBrains IDE
- **Agent Client Protocol**: Full support for ACP, enabling advanced IDE interactions
- **Symbol management**: #-mention files to add them to the conversation context
- **Conversation history**: Access to past conversations within the IDE

### Requirements

- A JetBrains IDE with ACP support (IntelliJ IDEA, WebStorm, PyCharm, etc.)
- Qwen Code CLI installed

### Installation

1. Install the Qwen Code CLI:

   ```bash
   npm install -g @qwen-code/qwen-code
   ```

2. Open your JetBrains IDE and navigate to the AI Chat tool window.

3. Click the 3-dot menu in the upper-right corner, select **Configure ACP Agent**, and configure Qwen Code with the following settings (see the note after these steps for one way to find the binary path):

   ```json
   {
     "agent_servers": {
       "qwen": {
         "command": "/path/to/qwen",
         "args": ["--acp"],
         "env": {}
       }
     }
   }
   ```

4. The Qwen Code agent should now be available in the AI Assistant panel.

![](./images/jetbrains-acp.png)
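The `command` field has to point at the actual `qwen` executable on your machine. A minimal way to find that path, assuming a global npm install (the example paths are illustrative):

```bash
# Print the absolute path of the qwen binary to use as "command" above.
# macOS / Linux:
command -v qwen   # e.g. /usr/local/bin/qwen
# Windows (cmd or PowerShell):
where qwen
```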

## Troubleshooting

### Agent not appearing

- Run `qwen --version` in a terminal to verify the installation
- Ensure your JetBrains IDE version supports ACP
- Restart your JetBrains IDE

### Qwen Code not responding

- Check your internet connection
- Verify that the CLI works by running `qwen` in a terminal
- [File an issue on GitHub](https://github.com/qwenlm/qwen-code/issues) if the problem persists
@@ -23,8 +23,6 @@
     "build-and-start": "npm run build && npm run start",
     "build:vscode": "node scripts/build_vscode_companion.js",
     "build:all": "npm run build && npm run build:sandbox && npm run build:vscode",
-    "build:native": "node scripts/build_native.js",
-    "build:native:all": "node scripts/build_native.js --all",
     "build:packages": "npm run build --workspaces",
     "build:sandbox": "node scripts/build_sandbox.js",
     "bundle": "npm run generate && node esbuild.config.js && node scripts/copy_bundle_assets.js",
@@ -83,12 +83,26 @@ export const useAuthCommand = (
    async (authType: AuthType, credentials?: OpenAICredentials) => {
      try {
        const authTypeScope = getPersistScopeForModelSelection(settings);

        // Persist authType
        settings.setValue(
          authTypeScope,
          'security.auth.selectedType',
          authType,
        );

        // Persist model from ContentGenerator config (handles fallback cases)
        // This ensures that when syncAfterAuthRefresh falls back to default model,
        // it gets persisted to settings.json
        const contentGeneratorConfig = config.getContentGeneratorConfig();
        if (contentGeneratorConfig?.model) {
          settings.setValue(
            authTypeScope,
            'model.name',
            contentGeneratorConfig.model,
          );
        }

        // Only update credentials if not switching to QWEN_OAUTH,
        // so that OpenAI credentials are preserved when switching to QWEN_OAUTH.
        if (authType !== AuthType.QWEN_OAUTH && credentials) {
@@ -106,9 +120,6 @@ export const useAuthCommand = (
            credentials.baseUrl,
          );
        }
-       if (credentials?.model != null) {
-         settings.setValue(authTypeScope, 'model.name', credentials.model);
-       }
      }
    } catch (error) {
      handleAuthFailure(error);
@@ -103,7 +103,7 @@ export function resolveCliGenerationConfig(

   // Log warnings if any
   for (const warning of resolved.warnings) {
-    console.warn(`[modelProviderUtils] ${warning}`);
+    console.warn(warning);
   }

   // Resolve OpenAI logging config (CLI-specific, not part of core resolver)
@@ -105,15 +105,6 @@ export const QWEN_OAUTH_MODELS: ModelConfig[] = [
     description:
       'The latest Qwen Coder model from Alibaba Cloud ModelStudio (version: qwen3-coder-plus-2025-09-23)',
     capabilities: { vision: false },
-    generationConfig: {
-      samplingParams: {
-        temperature: 0.7,
-        top_p: 0.9,
-        max_tokens: 8192,
-      },
-      timeout: 60000,
-      maxRetries: 3,
-    },
   },
   {
     id: 'vision-model',
@@ -121,14 +112,5 @@ export const QWEN_OAUTH_MODELS: ModelConfig[] = [
     description:
       'The latest Qwen Vision model from Alibaba Cloud ModelStudio (version: qwen3-vl-plus-2025-09-23)',
     capabilities: { vision: true },
-    generationConfig: {
-      samplingParams: {
-        temperature: 0.7,
-        top_p: 0.9,
-        max_tokens: 8192,
-      },
-      timeout: 60000,
-      maxRetries: 3,
-    },
   },
 ];
@@ -480,6 +480,91 @@ describe('ModelsConfig', () => {
      expect(gc.apiKeyEnvKey).toBeUndefined();
    });

    it('should use default model for new authType when switching from different authType with env vars', () => {
      // Simulate cold start with OPENAI env vars (OPENAI_MODEL and OPENAI_API_KEY)
      // This sets the model in generationConfig but no authType is selected yet
      const modelsConfig = new ModelsConfig({
        generationConfig: {
          model: 'gpt-4o', // From OPENAI_MODEL env var
          apiKey: 'openai-key-from-env',
        },
      });

      // User switches to qwen-oauth via AuthDialog
      // refreshAuth calls syncAfterAuthRefresh with the current model (gpt-4o)
      // which doesn't exist in qwen-oauth registry, so it should use default
      modelsConfig.syncAfterAuthRefresh(AuthType.QWEN_OAUTH, 'gpt-4o');

      const gc = currentGenerationConfig(modelsConfig);
      // Should use default qwen-oauth model (coder-model), not the OPENAI model
      expect(gc.model).toBe('coder-model');
      expect(gc.apiKey).toBe('QWEN_OAUTH_DYNAMIC_TOKEN');
      expect(gc.apiKeyEnvKey).toBeUndefined();
    });

    it('should clear manual credentials when switching from USE_OPENAI to QWEN_OAUTH', () => {
      // User manually set credentials for OpenAI
      const modelsConfig = new ModelsConfig({
        initialAuthType: AuthType.USE_OPENAI,
        generationConfig: {
          model: 'gpt-4o',
          apiKey: 'manual-openai-key',
          baseUrl: 'https://manual.example.com/v1',
        },
      });

      // Manually set credentials via updateCredentials
      modelsConfig.updateCredentials({
        apiKey: 'manual-openai-key',
        baseUrl: 'https://manual.example.com/v1',
        model: 'gpt-4o',
      });

      // User switches to qwen-oauth
      // Since authType is not USE_OPENAI, manual credentials should be cleared
      // and default qwen-oauth model should be applied
      modelsConfig.syncAfterAuthRefresh(AuthType.QWEN_OAUTH, 'gpt-4o');

      const gc = currentGenerationConfig(modelsConfig);
      // Should use default qwen-oauth model, not preserve manual OpenAI credentials
      expect(gc.model).toBe('coder-model');
      expect(gc.apiKey).toBe('QWEN_OAUTH_DYNAMIC_TOKEN');
      // baseUrl should be set to qwen-oauth default, not preserved from manual OpenAI config
      expect(gc.baseUrl).toBe('DYNAMIC_QWEN_OAUTH_BASE_URL');
      expect(gc.apiKeyEnvKey).toBeUndefined();
    });

    it('should preserve manual credentials when switching to USE_OPENAI', () => {
      // User manually set credentials
      const modelsConfig = new ModelsConfig({
        initialAuthType: AuthType.USE_OPENAI,
        generationConfig: {
          model: 'gpt-4o',
          apiKey: 'manual-openai-key',
          baseUrl: 'https://manual.example.com/v1',
          samplingParams: { temperature: 0.9 },
        },
      });

      // Manually set credentials via updateCredentials
      modelsConfig.updateCredentials({
        apiKey: 'manual-openai-key',
        baseUrl: 'https://manual.example.com/v1',
        model: 'gpt-4o',
      });

      // User switches to USE_OPENAI (same or different model)
      // Since authType is USE_OPENAI, manual credentials should be preserved
      modelsConfig.syncAfterAuthRefresh(AuthType.USE_OPENAI, 'gpt-4o');

      const gc = currentGenerationConfig(modelsConfig);
      // Should preserve manual credentials
      expect(gc.model).toBe('gpt-4o');
      expect(gc.apiKey).toBe('manual-openai-key');
      expect(gc.baseUrl).toBe('https://manual.example.com/v1');
      expect(gc.samplingParams?.temperature).toBe(0.9); // Preserved from initial config
    });

    it('should maintain consistency between currentModelId and _generationConfig.model after initialization', () => {
      const modelProvidersConfig: ModelProvidersConfig = {
        openai: [
@@ -600,7 +600,7 @@ export class ModelsConfig {

     // If credentials were manually set, don't apply modelProvider defaults
     // Just update the authType and preserve the manually set credentials
-    if (preserveManualCredentials) {
+    if (preserveManualCredentials && authType === AuthType.USE_OPENAI) {
       this.strictModelProviderSelection = false;
       this.currentAuthType = authType;
       if (modelId) {
@@ -621,7 +621,17 @@ export class ModelsConfig {
        this.applyResolvedModelDefaults(resolved);
      }
    } else {
      // If the provided modelId doesn't exist in the registry for the new authType,
      // use the default model for that authType instead of keeping the old model.
      // This handles the case where switching from one authType (e.g., OPENAI with
      // env vars) to another (e.g., qwen-oauth) - we should use the default model
      // for the new authType, not the old model.
      this.currentAuthType = authType;
      const defaultModel =
        this.modelRegistry.getDefaultModelForAuthType(authType);
      if (defaultModel) {
        this.applyResolvedModelDefaults(defaultModel);
      }
    }
  }
@@ -125,8 +125,9 @@ function normalizeForRegex(dirPath: string): string {
 function tryResolveCliFromImportMeta(): string | null {
   try {
     if (typeof import.meta !== 'undefined' && import.meta.url) {
-      const cliUrl = new URL('./cli/cli.js', import.meta.url);
-      const cliPath = fileURLToPath(cliUrl);
+      const currentFilePath = fileURLToPath(import.meta.url);
+      const currentDir = path.dirname(currentFilePath);
+      const cliPath = path.join(currentDir, 'cli', 'cli.js');
       if (fs.existsSync(cliPath)) {
         return cliPath;
       }
@@ -1,323 +0,0 @@
/**
 * @license
 * Copyright 2025 Qwen
 * SPDX-License-Identifier: Apache-2.0
 */

import fs from 'node:fs';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import { spawnSync } from 'node:child_process';

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const rootDir = path.resolve(__dirname, '..');

const distRoot = path.join(rootDir, 'dist', 'native');
const entryPoint = path.join(rootDir, 'packages', 'cli', 'index.ts');
const localesDir = path.join(
  rootDir,
  'packages',
  'cli',
  'src',
  'i18n',
  'locales',
);
const vendorDir = path.join(rootDir, 'packages', 'core', 'vendor');

const rootPackageJson = JSON.parse(
  fs.readFileSync(path.join(rootDir, 'package.json'), 'utf-8'),
);
const cliName = Object.keys(rootPackageJson.bin || {})[0] || 'qwen';
const version = rootPackageJson.version;

const TARGETS = [
  {
    id: 'darwin-arm64',
    os: 'darwin',
    arch: 'arm64',
    bunTarget: 'bun-darwin-arm64',
  },
  {
    id: 'darwin-x64',
    os: 'darwin',
    arch: 'x64',
    bunTarget: 'bun-darwin-x64',
  },
  {
    id: 'linux-arm64',
    os: 'linux',
    arch: 'arm64',
    bunTarget: 'bun-linux-arm64',
  },
  {
    id: 'linux-x64',
    os: 'linux',
    arch: 'x64',
    bunTarget: 'bun-linux-x64',
  },
  {
    id: 'linux-arm64-musl',
    os: 'linux',
    arch: 'arm64',
    libc: 'musl',
    bunTarget: 'bun-linux-arm64-musl',
  },
  {
    id: 'linux-x64-musl',
    os: 'linux',
    arch: 'x64',
    libc: 'musl',
    bunTarget: 'bun-linux-x64-musl',
  },
  {
    id: 'windows-x64',
    os: 'windows',
    arch: 'x64',
    bunTarget: 'bun-windows-x64',
  },
];

function getHostTargetId() {
  const platform = process.platform;
  const arch = process.arch;
  if (platform === 'darwin' && arch === 'arm64') return 'darwin-arm64';
  if (platform === 'darwin' && arch === 'x64') return 'darwin-x64';
  if (platform === 'win32' && arch === 'x64') return 'windows-x64';
  if (platform === 'linux' && arch === 'x64') {
    return isMusl() ? 'linux-x64-musl' : 'linux-x64';
  }
  if (platform === 'linux' && arch === 'arm64') {
    return isMusl() ? 'linux-arm64-musl' : 'linux-arm64';
  }
  return null;
}

function isMusl() {
  if (process.platform !== 'linux') return false;
  const report = process.report?.getReport?.();
  return !report?.header?.glibcVersionRuntime;
}

function parseArgs(argv) {
  const args = {
    all: false,
    list: false,
    targets: [],
  };

  for (let i = 0; i < argv.length; i += 1) {
    const arg = argv[i];
    if (arg === '--all') {
      args.all = true;
    } else if (arg === '--list-targets') {
      args.list = true;
    } else if (arg === '--target' && argv[i + 1]) {
      args.targets.push(argv[i + 1]);
      i += 1;
    } else if (arg?.startsWith('--targets=')) {
      const raw = arg.split('=')[1] || '';
      args.targets.push(
        ...raw
          .split(',')
          .map((value) => value.trim())
          .filter(Boolean),
      );
    }
  }

  return args;
}

function ensureBunAvailable() {
  const result = spawnSync('bun', ['--version'], { stdio: 'pipe' });
  if (result.error) {
    console.error('Error: Bun is required to build native binaries.');
    console.error('Install Bun from https://bun.sh and retry.');
    process.exit(1);
  }
}

function cleanNativeDist() {
  fs.rmSync(distRoot, { recursive: true, force: true });
  fs.mkdirSync(distRoot, { recursive: true });
}

function copyRecursiveSync(src, dest) {
  if (!fs.existsSync(src)) {
    return;
  }

  const stats = fs.statSync(src);
  if (stats.isDirectory()) {
    if (!fs.existsSync(dest)) {
      fs.mkdirSync(dest, { recursive: true });
    }
    for (const entry of fs.readdirSync(src)) {
      if (entry === '.DS_Store') continue;
      copyRecursiveSync(path.join(src, entry), path.join(dest, entry));
    }
  } else {
    fs.copyFileSync(src, dest);
    if (stats.mode & 0o111) {
      fs.chmodSync(dest, stats.mode);
    }
  }
}

function copyNativeAssets(targetDir, target) {
  if (target.os === 'darwin') {
    const sbFiles = findSandboxProfiles();
    for (const file of sbFiles) {
      fs.copyFileSync(file, path.join(targetDir, path.basename(file)));
    }
  }

  copyVendorRipgrep(targetDir, target);
  copyRecursiveSync(localesDir, path.join(targetDir, 'locales'));
}

function findSandboxProfiles() {
  const matches = [];
  const packagesDir = path.join(rootDir, 'packages');
  const stack = [packagesDir];

  while (stack.length) {
    const current = stack.pop();
    if (!current) break;
    const entries = fs.readdirSync(current, { withFileTypes: true });
    for (const entry of entries) {
      const entryPath = path.join(current, entry.name);
      if (entry.isDirectory()) {
        stack.push(entryPath);
      } else if (entry.isFile() && entry.name.endsWith('.sb')) {
        matches.push(entryPath);
      }
    }
  }

  return matches;
}

function copyVendorRipgrep(targetDir, target) {
  if (!fs.existsSync(vendorDir)) {
    console.warn(`Warning: Vendor directory not found at ${vendorDir}`);
    return;
  }

  const vendorRipgrepDir = path.join(vendorDir, 'ripgrep');
  if (!fs.existsSync(vendorRipgrepDir)) {
    console.warn(`Warning: ripgrep directory not found at ${vendorRipgrepDir}`);
    return;
  }

  const platform = target.os === 'windows' ? 'win32' : target.os;
  const ripgrepTargetDir = path.join(
    vendorRipgrepDir,
    `${target.arch}-${platform}`,
  );
  if (!fs.existsSync(ripgrepTargetDir)) {
    console.warn(`Warning: ripgrep binaries not found at ${ripgrepTargetDir}`);
    return;
  }

  const destVendorRoot = path.join(targetDir, 'vendor');
  const destRipgrepDir = path.join(destVendorRoot, 'ripgrep');
  fs.mkdirSync(destRipgrepDir, { recursive: true });

  const copyingFile = path.join(vendorRipgrepDir, 'COPYING');
  if (fs.existsSync(copyingFile)) {
    fs.copyFileSync(copyingFile, path.join(destRipgrepDir, 'COPYING'));
  }

  copyRecursiveSync(
    ripgrepTargetDir,
    path.join(destRipgrepDir, path.basename(ripgrepTargetDir)),
  );
}

function buildTarget(target) {
  const outputName = `${cliName}-${target.id}`;
  const targetDir = path.join(distRoot, outputName);
  const binDir = path.join(targetDir, 'bin');
  const binaryName = target.os === 'windows' ? `${cliName}.exe` : cliName;

  fs.mkdirSync(binDir, { recursive: true });

  const buildArgs = [
    'build',
    '--compile',
    '--target',
    target.bunTarget,
    entryPoint,
    '--outfile',
    path.join(binDir, binaryName),
  ];

  const result = spawnSync('bun', buildArgs, { stdio: 'inherit' });
  if (result.status !== 0) {
    throw new Error(`Bun build failed for ${target.id}`);
  }

  const packageJson = {
    name: outputName,
    version,
    os: [target.os === 'windows' ? 'win32' : target.os],
    cpu: [target.arch],
  };

  fs.writeFileSync(
    path.join(targetDir, 'package.json'),
    JSON.stringify(packageJson, null, 2) + '\n',
  );

  copyNativeAssets(targetDir, target);
}

function main() {
  if (!fs.existsSync(entryPoint)) {
    console.error(`Entry point not found at ${entryPoint}`);
    process.exit(1);
  }

  const args = parseArgs(process.argv.slice(2));
  if (args.list) {
    console.log(TARGETS.map((target) => target.id).join('\n'));
    return;
  }

  ensureBunAvailable();
  cleanNativeDist();

  let selectedTargets = [];
  if (args.all) {
    selectedTargets = TARGETS;
  } else if (args.targets.length > 0) {
    selectedTargets = TARGETS.filter((target) =>
      args.targets.includes(target.id),
    );
  } else {
    const hostTargetId = getHostTargetId();
    if (!hostTargetId) {
      console.error(
        `Unsupported host platform/arch: ${process.platform}/${process.arch}`,
      );
      process.exit(1);
    }
    selectedTargets = TARGETS.filter((target) => target.id === hostTargetId);
  }

  if (selectedTargets.length === 0) {
    console.error('No matching targets selected.');
    process.exit(1);
  }

  for (const target of selectedTargets) {
    console.log(`\nBuilding native binary for ${target.id}...`);
    buildTarget(target);
  }

  console.log('\n✅ Native build complete.');
}

main();
@@ -1,251 +0,0 @@
# Standalone Release Spec (Bun Native + npm Fallback)

This document describes the target release design for shipping Qwen Code as native binaries built with Bun, while retaining the existing npm JS bundle as a fallback distribution. It is written as a migration-ready spec that bridges the current release pipeline to the future dual-release system.

## Goal

Provide a CLI that:

- Runs as a standalone binary on Linux/macOS/Windows without requiring Node or Bun.
- Retains npm installation (global/local) as a JS-only fallback.
- Supports a curl installer that pulls the correct binary from GitHub Releases.
- Ships multiple variants (x64/arm64, musl/glibc where needed).
- Uses one release flow to produce all artifacts with a single tag/version.

## Non-Goals

- Replacing npm as a dev-time dependency manager.
- Shipping a single universal binary for all platforms.
- Supporting every architecture or OS outside the defined target matrix.
- Removing the existing Node/esbuild bundle.

## Current State (Baseline)

The current release pipeline:

- Bundles the CLI into `dist/cli.js` via esbuild.
- Uses `scripts/prepare-package.js` to create `dist/package.json`, plus `vendor/`, `locales/`, and `*.sb` assets.
- Publishes `dist/` to npm as the primary distribution.
- Creates a GitHub Release and attaches only `dist/cli.js`.
- Uses `release.yml` for nightly/preview schedules and manual stable releases.

This spec extends the above pipeline; it does not replace it until the migration phases complete.

## Target Architecture

### 1) Build Outputs

There are two build outputs:

1. Native binaries (Bun compile) for a target matrix.
2. Node-compatible JS bundle for npm fallback (existing `dist/` output).

Native build output for each target:

- `dist/<name>/bin/<cli>` (or `.exe` on Windows)
- `dist/<name>/package.json` (minimal package metadata)

Name encodes target:

- `<cli>-linux-x64`
- `<cli>-linux-x64-musl`
- `<cli>-linux-arm64`
- `<cli>-linux-arm64-musl`
- `<cli>-darwin-arm64`
- `<cli>-darwin-x64`
- `<cli>-windows-x64`

### 2) npm Distribution (JS Fallback)

Keep npm as a pure JS/TS CLI package that runs under Node/Bun. Do not ship or auto-install native binaries through npm.

Implications:

- `npm install` always uses the JS implementation.
- No `optionalDependencies` for platform binaries.
- No postinstall symlink logic.
- No Node shim that searches for a native binary.

### 3) GitHub Release Distribution (Primary)

Native binaries are distributed only via GitHub Releases and the curl installer:

- Archive each platform binary into a tar.gz (Linux) or zip (macOS/Windows).
- Attach archives to the GitHub Release.
- Provide a shell installer that detects the target and downloads the correct archive.

## Detailed Implementation

### A) Target Matrix

Define a target matrix that includes OS, arch, and libc variants.

Target list (fixed set):

- darwin arm64
- darwin x64
- linux arm64 (glibc)
- linux x64 (glibc)
- linux arm64 musl
- linux x64 musl
- win32 x64

### B) Build Scripts

1. Native build script (new, e.g. `scripts/build-native.ts`). Responsibilities:
   - Remove the native build output directory (keep npm `dist/` intact).
   - For each target:
     - Compute a target name.
     - Compile using `Bun.build({ compile: { target: ... } })`.
     - Write the binary to `dist/<name>/bin/<cli>`.
     - Write a minimal `package.json` into `dist/<name>/`.

2. npm fallback build (existing). Responsibilities:
   - `npm run bundle` produces `dist/cli.js`.
   - `npm run prepare:package` creates `dist/package.json` and copies assets.

Key details:

- Use `Bun.build` with `compile.target = <bun-target>` (e.g. `bun-linux-x64`); a CLI-form sketch follows this list.
- Include any extra worker/runtime files in the entrypoints.
- Use `define` or `execArgv` to inject version/channel metadata.
- Use "windows" in archive naming even though the OS is "win32" internally.
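As a rough illustration of the compile step above, here is a minimal sketch of the CLI form of the same option; the flags mirror those used by the removed `scripts/build_native.js`, while the entry point and output layout are assumptions:

```bash
# Compile one native binary per Bun target (sketch; adjust entry point and paths).
for target in bun-linux-x64 bun-darwin-arm64 bun-windows-x64; do
  name="qwen-${target#bun-}"                     # e.g. qwen-linux-x64
  mkdir -p "dist/native/${name}/bin"
  bun build --compile --target "$target" packages/cli/index.ts \
    --outfile "dist/native/${name}/bin/qwen"
done
```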
Build-time considerations:

- Preinstall platform-specific native deps for bundling (example: `bun install --os="_" --cpu="_"` for dependencies with native bindings).
- Include worker assets in the compile entrypoints and embed their paths via `define` constants.
- Use platform-specific bunfs root paths when resolving embedded worker files.
- Set runtime `execArgv` flags for user-agent/version and system CA usage.

Target name example:

    <cli>-<os>-<arch>[-musl]

Minimal package.json example:

    {
      "name": "<cli>-linux-x64",
      "version": "<version>",
      "os": ["linux"],
      "cpu": ["x64"]
    }

### C) Publish Script (new, optional)

Responsibilities:

1. Run the native build script.
2. Smoke test a local binary (`dist/<host>/bin/<cli> --version`).
3. Create GitHub Release archives (see the sketch after this section).
4. Optionally build and push a Docker image.
5. Publish the npm package (JS-only fallback) as a separate step or pipeline.

Note: npm publishing is now independent of native binary publishing. It should not reference platform binaries.
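A hedged sketch of the archive step referenced above, assuming the per-target directory layout from the build script section; the exact names and compression choices are illustrative:

```bash
# Package each per-target build directory as a GitHub Release asset (sketch).
cd dist/native
for dir in qwen-*/; do
  dir="${dir%/}"
  case "$dir" in
    *darwin*|*windows*) zip -qr "${dir}.zip" "$dir" ;;     # macOS/Windows: zip
    *)                  tar -czf "${dir}.tar.gz" "$dir" ;; # Linux: tar.gz
  esac
done
```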
### D) GitHub Release Installer (install)

A bash installer that:

1. Detects OS and arch (a detection sketch follows the lists below).
2. Handles Rosetta (macOS) and musl detection (Alpine, `ldd`).
3. Builds the target name and downloads from GitHub Releases.
4. Extracts to `~/.<cli>/bin`.
5. Adds the directory to PATH unless `--no-modify-path` is given.

Supports:

- `--version <version>`
- `--binary <path>`
- `--no-modify-path`

Installer details to include:

- Require `tar` for Linux and `unzip` for macOS/Windows archives.
- Use "windows" in asset naming, not "win32".
- Prefer arm64 when macOS is running under Rosetta.
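A minimal sketch of the detection logic for steps 1 and 2 above, for POSIX shells only; the `ldd` musl check and the `sysctl` Rosetta check are common approaches here, not a prescribed implementation:

```bash
# Detect os/arch (plus musl and Rosetta) to build the asset name (sketch).
os=$(uname -s); arch=$(uname -m); libc=""
case "$os" in
  Linux)
    os=linux
    ldd --version 2>&1 | grep -qi musl && libc="-musl" ;;
  Darwin)
    os=darwin
    # Prefer arm64 when the shell runs under Rosetta on Apple Silicon.
    [ "$(sysctl -n sysctl.proc_translated 2>/dev/null)" = "1" ] && arch=arm64 ;;
esac
case "$arch" in x86_64) arch=x64 ;; aarch64) arch=arm64 ;; esac
echo "qwen-${os}-${arch}${libc}"   # e.g. qwen-linux-x64-musl
```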
## CI/CD Flow (Dual Pipeline)

Release pipeline (native binaries):

1. Bump version.
2. Build binaries for the full target matrix.
3. Smoke test the host binary.
4. Create GitHub release assets.
5. Mark the release as final (if draft).

Release pipeline (npm fallback):

1. Bump version (same tag).
2. Publish the JS-only npm package.

Release orchestration details to consider:

- Update all package.json version fields in the repo.
- Update any extension metadata or download URLs that embed version strings.
- Tag the release and create a GitHub Release draft that includes the binary assets.

### Workflow Mapping to Current Code

The existing `release.yml` workflow remains the orchestrator:

- Use `scripts/get-release-version.js` for version/tag selection.
- Keep tests and integration checks as-is.
- Add a native build matrix job that produces archives and uploads them to the GitHub Release.
- Keep the npm publish step from `dist/` as the fallback.
- Ensure the same `RELEASE_TAG` is used for both native and npm outputs.

## Edge Cases and Pitfalls

- musl: Alpine requires musl binaries.
- Rosetta: macOS under Rosetta should prefer arm64 when available.
- npm fallback: ensure the JS implementation is functional without native helpers.
- Path precedence: the binary install should appear earlier in PATH than the npm global bin if you want native to win by default.
- Archive prerequisites: users need `tar`/`unzip` depending on OS.

## Testing Plan

- Build all targets in CI.
- Run `dist/<host>/bin/<cli> --version` (see the sketch after this list).
- `npm install` locally and verify CLI invocation.
- Run the installer script on each OS or VM.
- Validate musl builds on Alpine.
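A small sketch of the host smoke test from the list above, assuming the `dist/native/<name>/bin` layout described earlier:

```bash
# Each host-runnable binary must print a version and exit 0 (sketch).
for bin in dist/native/*/bin/qwen; do
  [ -x "$bin" ] || continue
  echo "checking $bin"
  "$bin" --version || exit 1
done
```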
## Migration Plan

Phase 1: Add native builds without changing npm

- [ ] Define target matrix with musl variants.
- [ ] Add native build script for Bun compile per target.
- [ ] Generate per-target package.json.
- [ ] Produce per-target archives and upload to GitHub Releases.
- [ ] Keep existing npm bundle publish unchanged.

Phase 2: Installer and docs

- [ ] Add curl installer for GitHub Releases.
- [ ] Document recommended install paths (native first).
- [ ] Add smoke tests for installer output.

Phase 3: Default install guidance and cleanup

- [ ] Update docs to recommend native install where possible.
- [ ] Decide whether npm stays equal or fallback-only in user docs.

## Implementation Checklist

- [ ] Keep `npm run bundle` + `npm run prepare:package` for JS fallback.
- [ ] Add `scripts/build-native.ts` for Bun compile targets.
- [ ] Add archive creation and asset upload in `release.yml`.
- [ ] Add an installer script with OS/arch/musl detection.
- [ ] Ensure tag/version parity across native and npm releases.