
Commit e9f35e2

feat(x2a): bulk CSV project creation (#2579)
* feat(x2a): bulk CSV project creation
* add sample-project.csv

Signed-off-by: Marek Libra <marek.libra@gmail.com>
1 parent 99aa8ff commit e9f35e2

38 files changed

Lines changed: 4782 additions & 1387 deletions
Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
---
'@red-hat-developer-hub/backstage-plugin-scaffolder-backend-module-x2a': patch
'@red-hat-developer-hub/backstage-plugin-x2a-common': patch
'@red-hat-developer-hub/backstage-plugin-x2a': patch
'@red-hat-developer-hub/backstage-plugin-x2a-backend': patch
---

The user can now bulk-create conversion projects from an uploaded CSV file.

workspaces/x2a/README.md

Lines changed: 4 additions & 0 deletions
@@ -301,3 +301,7 @@ Loaded Kubernetes configuration from ~/.kube/config
- `migrations/` - Database migrations
- `plugins/x2a-common/` - Shared code between frontend and backend
- `client/src/schema/openapi/generated/` - Generated client-side API code

## CSV Bulk Project Import

See [CSV Bulk Project Import](./docs/csv-bulk-import.md) for the CSV file format, an example, repository URL conventions, and the `RepoAuthentication` scaffolder extension.
Lines changed: 132 additions & 0 deletions
@@ -0,0 +1,132 @@
# CSV Bulk Project Import

The CSV bulk import lets the user create multiple conversion projects at once by uploading a single CSV file.

## How to Access

1. Open the Backstage instance and navigate to `/create`.
2. Select the **Chef-to-Ansible Conversion Project** template (`chef-conversion-project-template`).
3. On the first page, choose **CSV upload** as the input method.
4. Upload the CSV file and proceed through the wizard.

The wizard asks for authentication with each SCM provider (GitHub, GitLab, Bitbucket) referenced in the CSV. Projects are created sequentially, with the same permission checks as if the user had created each one individually.
## CSV File Format

The file must be UTF-8 encoded with a header row. Column order does not matter, but header names must match exactly.

### Required Columns

| Column             | Description                                                                                               |
| ------------------ | --------------------------------------------------------------------------------------------------------- |
| `name`             | Unique project name                                                                                       |
| `abbreviation`     | Short project identifier, 1-5 alphanumeric characters matching `^([a-zA-Z][a-zA-Z0-9]*)(-[a-zA-Z0-9]+)*$` |
| `sourceRepoUrl`    | URL of the repository containing the Chef cookbook to convert                                             |
| `sourceRepoBranch` | Branch to read from in the source repository                                                              |
| `targetRepoBranch` | Branch to write converted Ansible output to                                                               |

### Optional Columns

| Column          | Description                                                                              |
| --------------- | ---------------------------------------------------------------------------------------- |
| `description`   | Project description (defaults to empty)                                                  |
| `ownedByGroup`  | Backstage group that owns the project. When empty, the signed-in user becomes the owner. |
| `targetRepoUrl` | Repository for converted output. Defaults to `sourceRepoUrl` when empty.                 |

No extra columns are allowed; the import rejects unknown headers.
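The header rules above can be sketched as a small validation helper. This is illustrative only: the column names come from the tables above, but the function name and error messages are not the plugin's actual code.

```typescript
// Columns documented for the CSV bulk import (from the tables above).
const REQUIRED = [
  'name',
  'abbreviation',
  'sourceRepoUrl',
  'sourceRepoBranch',
  'targetRepoBranch',
];
const OPTIONAL = ['description', 'ownedByGroup', 'targetRepoUrl'];

// Returns a list of problems; an empty list means the header row is acceptable.
function validateHeaders(headerLine: string): string[] {
  const headers = headerLine.split(',').map(h => h.trim());
  const known = new Set([...REQUIRED, ...OPTIONAL]);
  const errors: string[] = [];

  // Unknown headers are rejected outright.
  const unknown = headers.filter(h => !known.has(h));
  if (unknown.length > 0) {
    errors.push(`Unknown columns: ${unknown.join(', ')}`);
  }

  // All required columns must be present (order does not matter).
  const missing = REQUIRED.filter(r => !headers.includes(r));
  if (missing.length > 0) {
    errors.push(`Missing required columns: ${missing.join(', ')}`);
  }
  return errors;
}
```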
### Repeatable Import

The CSV import is designed to be run repeatedly with the same or an updated file. Projects whose name already exists are **skipped** (not duplicated) and counted as "skipped" in the results summary.

A typical workflow for a large import:

1. Upload the CSV. Some projects succeed, some may fail (e.g. due to a missing repository or a typo).
2. Review the results. The summary shows how many succeeded, failed, and were skipped.
3. Fix the issues: correct the CSV rows that failed and, if a partially created project needs to be recreated, delete it from the application first.
4. Re-upload the corrected CSV. Already-created projects are skipped automatically; only the new or corrected rows are processed.
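The skip-on-re-import behavior described above can be sketched as follows. This is a simplified stand-in, not the plugin's code; `planImport` and the row/plan shapes are hypothetical names.

```typescript
// A CSV row, reduced to the field relevant for deduplication.
type CsvRow = { name: string };
type ImportPlan = { toCreate: CsvRow[]; skipped: CsvRow[] };

// Rows whose project name already exists are routed to "skipped"
// instead of being created again, so re-uploads are safe.
function planImport(rows: CsvRow[], existingNames: Set<string>): ImportPlan {
  const toCreate: CsvRow[] = [];
  const skipped: CsvRow[] = [];
  for (const row of rows) {
    (existingNames.has(row.name) ? skipped : toCreate).push(row);
  }
  return { toCreate, skipped };
}
```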
### Repository URL Format

Both `sourceRepoUrl` and `targetRepoUrl` accept two formats. All URLs are normalized to HTTPS clone URLs before being stored.

**Plain HTTPS URLs** (standard clone URLs):

| Provider  | Format                                 |
| --------- | -------------------------------------- |
| GitHub    | `https://github.com/owner/repo`        |
| GitLab    | `https://gitlab.com/owner/repo`        |
| Bitbucket | `https://bitbucket.org/workspace/repo` |

**Backstage RepoUrlPicker format** (query-parameter style, without `https://`):

| Provider  | Format                                                           |
| --------- | ---------------------------------------------------------------- |
| GitHub    | `github.com?owner=myuser&repo=myrepo`                            |
| GitLab    | `gitlab.com?owner=myuser&repo=myrepo`                            |
| Bitbucket | `bitbucket.org?workspace=myworkspace&project=myproj&repo=myrepo` |

For Bitbucket, the `project` parameter is organizational metadata and is not part of the clone URL. Only `workspace` and `repo` are used.

For self-hosted instances (e.g. GitHub Enterprise, self-hosted GitLab), the corresponding host should be used in place of the public domain. The host must be listed in the `integrations:` section of `app-config.yaml` so the plugin can detect the correct SCM provider. See [SCM Provider Detection](../README.md#scm-provider-detection).
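The plugin exports a `normalizeRepoUrl` helper (used by the scaffolder action below), but its implementation is not shown in this commit, so here is an illustrative re-creation of the documented behavior: RepoUrlPicker-style strings become HTTPS clone URLs, and plain HTTPS URLs pass through unchanged.

```typescript
// Illustrative sketch only; provider detection is simplified to the
// public bitbucket.org host, whereas the real plugin consults the
// `integrations:` section of app-config.yaml.
function normalizeRepoUrlSketch(url: string): string {
  // Plain HTTPS clone URLs are already in the stored format.
  if (url.startsWith('https://')) return url;

  // RepoUrlPicker format: "<host>?key=value&..."
  const [host, query] = url.split('?');
  const params = new URLSearchParams(query);

  if (host === 'bitbucket.org') {
    // `project` is organizational metadata, not part of the clone URL.
    return `https://${host}/${params.get('workspace')}/${params.get('repo')}`;
  }
  // GitHub/GitLab style: owner + repo.
  return `https://${host}/${params.get('owner')}/${params.get('repo')}`;
}
```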
### Example

```csv
name,abbreviation,sourceRepoUrl,sourceRepoBranch,targetRepoUrl,targetRepoBranch,description,ownedByGroup
web-app,wapp,https://github.com/myorg/web-app-chef,main,https://github.com/myorg/web-app-ansible,main,Convert web app cookbook,team-platform
db-setup,dbset,gitlab.com?owner=myorg&repo=db-chef,develop,gitlab.com?owner=myorg&repo=db-ansible,main,,
cache-svc,cache,bitbucket.org?workspace=myws&project=x2a&repo=cache-chef,main,,main,Cache service conversion,
```

Notes on the example:

- Row 1 (`web-app`): uses plain HTTPS URLs.
- Row 2 (`db-setup`): uses RepoUrlPicker-style URLs for GitLab. `description` and `ownedByGroup` are left empty.
- Row 3 (`cache-svc`): uses a RepoUrlPicker-style URL for Bitbucket. `targetRepoUrl` is empty, so the source repository is used as the target.

### CSV File Template

Download a [sample CSV file](../plugins/x2a-backend/public/sample-projects.csv) with all supported headers.

At runtime, the file is served at `/x2a/download/sample-projects.csv` (via the frontend plugin route).
## RepoAuthentication Scaffolder Extension

When using CSV import, the template uses the `RepoAuthentication` custom scaffolder field to collect OAuth tokens for each SCM provider found in the CSV. This replaces the standard `RepoUrlPicker` used in manual mode.

### How It Works

1. The extension parses the uploaded CSV and identifies all distinct SCM providers across source and target URLs.
2. It opens an OAuth dialog for each provider so the user can authenticate.
3. Tokens are stored as scaffolder secrets with the prefix `OAUTH_TOKEN_` (e.g. `OAUTH_TOKEN_github`, `OAUTH_TOKEN_gitlab`, `OAUTH_TOKEN_bitbucket`).
4. The wizard blocks progression until all required providers are authenticated.
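Steps 1 and 3 above can be sketched together: collect the distinct providers referenced by the CSV's URLs and derive the secret keys their tokens are stored under. This is a simplified illustration; `secretKeysForProviders` is a hypothetical name, and provider detection here is hard-coded to the public hosts, whereas the real plugin consults the `integrations:` configuration.

```typescript
// Map public SCM hosts to provider names (simplified; self-hosted
// instances would be resolved via app-config.yaml integrations).
const HOST_TO_PROVIDER: Record<string, string> = {
  'github.com': 'github',
  'gitlab.com': 'gitlab',
  'bitbucket.org': 'bitbucket',
};

// From all source/target URLs in the CSV, derive the scaffolder
// secret keys (OAUTH_TOKEN_<provider>) that need to be populated.
function secretKeysForProviders(urls: string[]): string[] {
  const providers = new Set<string>();
  for (const url of urls) {
    // Works for both plain HTTPS URLs and RepoUrlPicker-style strings.
    const host = url.replace(/^https:\/\//, '').split(/[/?]/)[0];
    const provider = HOST_TO_PROVIDER[host];
    if (provider) providers.add(provider);
  }
  return [...providers].map(p => `OAUTH_TOKEN_${p}`);
}
```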
### Using in a Template

The extension is registered as `RepoAuthentication` and is available as a `ui:field`. It must reference the CSV field via `ui:options.csvFieldName`:

```yaml
properties:
  repoAuthentication:
    type: string
    description: Provide login to all the SCMs relevant for the source CSV.
    ui:field: RepoAuthentication
    ui:options:
      csvFieldName: csvContent
```
### Registering the Extension

In a standard Backstage app, import and add the extension so the scaffolder can find it:

```typescript
import { RepoAuthenticationExtension } from '@red-hat-developer-hub/backstage-plugin-x2a';

// In the App component, alongside <ScaffolderPage>:
<ScaffolderFieldExtensions>
  <RepoAuthenticationExtension />
</ScaffolderFieldExtensions>
```

For Red Hat Developer Hub (RHDH) with dynamic plugins, the extension is registered via configuration instead. See [Providing custom Scaffolder field extensions](https://docs.redhat.com/en/documentation/red_hat_developer_hub/1.9/html/installing_and_viewing_plugins_in_red_hat_developer_hub/assembly-front-end-plugin-wiring.adoc_rhdh-extensions-plugins#con-providing-custom-scaffolder-field-extensions.adoc_assembly-front-end-plugin-wiring) in the RHDH documentation.

workspaces/x2a/packages/app/package.json

Lines changed: 1 addition & 0 deletions
@@ -38,6 +38,7 @@
   "@backstage/plugin-org": "^0.6.43",
   "@backstage/plugin-permission-react": "^0.4.36",
   "@backstage/plugin-scaffolder": "^1.34.0",
+  "@backstage/plugin-scaffolder-react": "^1.20.0",
   "@backstage/plugin-search": "^1.4.29",
   "@backstage/plugin-search-react": "^1.9.3",
   "@backstage/plugin-signals": "^0.0.22",

workspaces/x2a/packages/app/src/App.tsx

Lines changed: 11 additions & 1 deletion
@@ -25,6 +25,7 @@ import {
   catalogImportPlugin,
 } from '@backstage/plugin-catalog-import';
 import { ScaffolderPage, scaffolderPlugin } from '@backstage/plugin-scaffolder';
+import { ScaffolderFieldExtensions } from '@backstage/plugin-scaffolder-react';
 import { orgPlugin } from '@backstage/plugin-org';
 import { SearchPage } from '@backstage/plugin-search';
 import {
@@ -55,6 +56,7 @@ import { SignalsDisplay } from '@backstage/plugin-signals';
 import {
   X2APage,
   x2aPluginTranslations,
+  RepoAuthenticationExtension,
 } from '@red-hat-developer-hub/backstage-plugin-x2a';
 import {
   bitbucketAuthApiRef,
@@ -135,7 +137,6 @@ const routes = (
       <ReportIssue />
     </TechDocsAddons>
   </Route>
-  <Route path="/create" element={<ScaffolderPage />} />
   <Route path="/api-docs" element={<ApiExplorerPage />} />
   <Route
     path="/catalog-import"
@@ -152,6 +153,15 @@ const routes = (
   <Route path="/settings" element={<UserSettingsPage />} />
   <Route path="/catalog-graph" element={<CatalogGraphPage />} />
   <Route path="/notifications" element={<NotificationsPage />} />
+
+  {/* At RHDH runtime, this is replaced by dynamicPlugin configuration:
+      https://docs.redhat.com/en/documentation/red_hat_developer_hub/1.9/html/installing_and_viewing_plugins_in_red_hat_developer_hub/assembly-front-end-plugin-wiring.adoc_rhdh-extensions-plugins#con-providing-custom-scaffolder-field-extensions.adoc_assembly-front-end-plugin-wiring
+  */}
+  <Route path="/create" element={<ScaffolderPage />}>
+    <ScaffolderFieldExtensions>
+      <RepoAuthenticationExtension />
+    </ScaffolderFieldExtensions>
+  </Route>
 </FlatRoutes>
);

Lines changed: 120 additions & 0 deletions
@@ -0,0 +1,120 @@
/*
 * Copyright Red Hat, Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
import {
  DefaultApiClient,
  Project,
  ProjectsPost,
  ProjectsProjectIdRunPost200Response,
  normalizeRepoUrl,
  ScmProviderName,
} from '@red-hat-developer-hub/backstage-plugin-x2a-common';
import type { ActionLogger } from './createProjectAction';

export type CreateAndInitProjectParams = {
  api: DefaultApiClient;
  row: ProjectsPost['body'];
  sourceRepoToken: string;
  targetRepoToken: string;
  userPrompt?: string;
  backstageToken?: string;
  hostProviderMap: Map<string, ScmProviderName>;
  logger: ActionLogger;
};

export const createAndInitProject = async (
  params: CreateAndInitProjectParams,
): Promise<{ projectId: string; initJobId: string }> => {
  const {
    api,
    row,
    sourceRepoToken,
    targetRepoToken,
    userPrompt,
    backstageToken: token,
    logger,
  } = params;

  const body: ProjectsPost['body'] = {
    name: row.name,
    description: row.description ?? '',
    abbreviation: row.abbreviation,
    ownedByGroup: row.ownedByGroup?.trim() || undefined,
    sourceRepoUrl: normalizeRepoUrl(row.sourceRepoUrl),
    targetRepoUrl: normalizeRepoUrl(row.targetRepoUrl),
    sourceRepoBranch: row.sourceRepoBranch,
    targetRepoBranch: row.targetRepoBranch,
  };

  logger.info(`Creating project "${row.name}" (${JSON.stringify(body)})`);

  let project: Project;
  try {
    const response = await api.projectsPost({ body }, { token });
    if (!response.ok) {
      const error = (await response.json()) as { message?: string };
      logger.error(
        `Project "${row.name}" creation failed (status ${response.status}): ${JSON.stringify(error)}`,
      );
      throw new Error(error?.message ?? JSON.stringify(error));
    }
    project = await response.json();
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    logger.error(`Error creating project "${row.name}": ${message}`);
    throw error;
  }

  logger.info(
    `Project "${row.name}" created with id ${project.id}, triggering init-phase`,
  );

  let initResponseData: ProjectsProjectIdRunPost200Response;
  try {
    const initResponse = await api.projectsProjectIdRunPost(
      {
        path: { projectId: project.id },
        body: {
          sourceRepoAuth: { token: sourceRepoToken },
          targetRepoAuth: { token: targetRepoToken },
          userPrompt,
        },
      },
      { token },
    );

    if (!initResponse.ok) {
      const error = (await initResponse.json()) as { message?: string };
      logger.error(
        `Init-phase for project "${row.name}" (${project.id}) failed (status ${initResponse.status}): ${JSON.stringify(error)}`,
      );
      throw new Error(error?.message ?? JSON.stringify(error));
    }

    initResponseData = await initResponse.json();
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    logger.error(
      `Error triggering init-phase for project "${row.name}" (${project.id}): ${message}`,
    );
    throw error;
  }

  logger.info(
    `Init-phase triggered for project "${row.name}" (${project.id}), jobId: ${initResponseData.jobId}`,
  );

  return { projectId: project.id, initJobId: initResponseData.jobId };
};
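The sequential create-and-record flow around `createAndInitProject` can be sketched as follows. This is a hedged stand-in for the surrounding bulk action, which is not part of this excerpt: `runBulkCreate`, `RowResult`, and the injected `create` callback are hypothetical names used only for illustration.

```typescript
// Outcome recorded per CSV row; "skipped" rows are filtered out earlier.
type RowResult = { name: string; status: 'created' | 'failed'; error?: string };

// Drive a create function (such as createAndInitProject) over the parsed
// rows one at a time and aggregate a results summary. A failure on one
// row is recorded but does not abort the remaining rows.
async function runBulkCreate(
  rows: { name: string }[],
  create: (row: { name: string }) => Promise<{ projectId: string }>,
): Promise<RowResult[]> {
  const results: RowResult[] = [];
  for (const row of rows) {
    try {
      await create(row);
      results.push({ name: row.name, status: 'created' });
    } catch (e) {
      results.push({
        name: row.name,
        status: 'failed',
        error: e instanceof Error ? e.message : String(e),
      });
    }
  }
  return results;
}
```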
