
Commit dab7bc9

initial scaffolding for e2e, smoke, integration tests (#1223)
1 parent e1e27b1 commit dab7bc9

File tree

19 files changed: +1813 additions, -6 deletions

.github/instructions/generic.instructions.md

Lines changed: 5 additions & 0 deletions
@@ -43,3 +43,8 @@ Provide project context and coding guidelines that AI should follow when generat

- When using `getConfiguration().inspect()`, always pass a scope/Uri to `getConfiguration(section, scope)` — otherwise `workspaceFolderValue` will be `undefined` because VS Code doesn't know which folder to inspect (1)
- **path.normalize() vs path.resolve()**: On Windows, `path.normalize('\test')` keeps it as `\test`, but `path.resolve('\test')` adds the current drive → `C:\test`. When comparing paths, use `path.resolve()` on BOTH sides or they won't match (2)
- **Path comparisons vs user display**: Use `normalizePath()` from `pathUtils.ts` when comparing paths or using them as map keys, but preserve original paths for user-facing output like settings, logs, and UI (1)
- **CI test jobs need webpack build**: Smoke/E2E/integration tests run in a real VS Code instance against `dist/extension.js` (built by webpack). CI jobs must run `npm run compile` (webpack), not just `npm run compile-tests` (tsc). Without webpack, the extension code isn't built and tests run against stale/missing code (1)
- **Use inspect() for setting checks with defaults from other extensions**: When checking `python.useEnvironmentsExtension`, use `config.inspect()` and only check explicit user values (`globalValue`, `workspaceValue`, `workspaceFolderValue`). Ignore `defaultValue` as it may come from other extensions' package.json even when not installed (1)
- **API is flat, not nested**: Use `api.getEnvironments()`, NOT `api.environments.getEnvironments()`. The extension exports a flat API object (1)
- **PythonEnvironment has `envId`, not `id`**: The environment identifier is `env.envId` (a `PythonEnvironmentId` object with `id` and `managerId`), not a direct `id` property (1)
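
A compact sketch of the two `inspect()` learnings above (names are hypothetical; the `InspectResult` shape mirrors what `WorkspaceConfiguration.inspect()` returns):

```typescript
// Hypothetical helper illustrating the inspect() learnings above. The shape
// mirrors vscode's WorkspaceConfiguration.inspect() result.
interface InspectResult<T> {
    defaultValue?: T;
    globalValue?: T;
    workspaceValue?: T;
    workspaceFolderValue?: T;
}

// Only explicit user-set values count; defaultValue is ignored because it can
// come from another extension's package.json even when that extension is not
// installed. Narrower scopes win, matching VS Code's precedence order.
function isExplicitlyEnabled(info: InspectResult<boolean> | undefined): boolean {
    return info?.workspaceFolderValue ?? info?.workspaceValue ?? info?.globalValue ?? false;
}
```

In real code the `info` argument would come from `getConfiguration('python', someFolderUri).inspect('useEnvironmentsExtension')`; passing the scope is what makes `workspaceFolderValue` meaningful.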
Lines changed: 89 additions & 0 deletions
@@ -0,0 +1,89 @@
---
name: debug-failing-test
description: Debug a failing test using an iterative logging approach, then clean up and document the learning.
---

Debug a failing unit test by iteratively adding verbose logging, running the test, and analyzing the output until the root cause is found and fixed.

## Workflow

### Phase 1: Initial Assessment

1. **Run the failing test** to capture the current error message and stack trace
2. **Read the test file** to understand what is being tested
3. **Read the source file** being tested to understand the expected behavior
4. **Identify the assertion that fails** and what values are involved

### Phase 2: Iterative Debugging Loop

Repeat until the root cause is understood:

1. **Add verbose logging** around the suspicious code:
    - Use `console.log('[DEBUG]', ...)` with descriptive labels
    - Log input values, intermediate states, and return values
    - Log before/after key operations
    - Add timestamps if timing might be relevant

2. **Run the test** and capture the output

3. **Assess the logging output:**
    - What values are unexpected?
    - Where does the behavior diverge from expectations?
    - What additional logging would help narrow down the issue?

4. **Decide the next action:**
    - If the root cause is clear → proceed to the fix
    - If more information is needed → add more targeted logging and repeat
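
The logging step above can be sketched as a tiny helper (hypothetical; inline `console.log('[DEBUG]', ...)` calls work just as well):

```typescript
// Hypothetical Phase 2 helper: a labeled, timestamped debug logger. Routing
// everything through the '[DEBUG]' prefix keeps the statements easy to grep
// for and delete during cleanup.
function debugLog(label: string, value: unknown): void {
    console.log('[DEBUG]', new Date().toISOString(), `${label}:`, JSON.stringify(value));
}

// Example placement around a suspicious operation:
// debugLog('before resolveEnvironment', { input });
// const result = resolveEnvironment(input);
// debugLog('after resolveEnvironment', { result });
```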
### Phase 3: Fix and Verify

1. **Implement the fix** based on the findings
2. **Run the test** to verify it passes
3. **Run related tests** to ensure no regressions

### Phase 4: Clean Up

1. **Remove ALL debugging artifacts:**
    - Delete all `console.log('[DEBUG]', ...)` statements that were added
    - Remove any temporary variables or code added for debugging
    - Ensure the code is in a clean, production-ready state

2. **Verify the test still passes** after cleanup
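
One way to double-check step 1 above (a hypothetical helper, not part of the repo; running `grep -rn "\[DEBUG\]" src/` from a shell does the same job):

```typescript
// Hypothetical cleanup check: count leftover '[DEBUG]' lines in a source
// string. A non-zero result means Phase 4 is not finished.
function countDebugLines(source: string): number {
    return source.split('\n').filter((line) => line.includes('[DEBUG]')).length;
}
```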
### Phase 5: Document and Learn

1. **Provide a summary** to the user (1-3 sentences):
    - What was the bug?
    - What was the fix?

2. **Record the learning** by following the learning instructions (if you have them):
    - Extract a single, clear learning from this debugging session
    - Add it to the "Learnings" section of the most relevant instruction file
    - If a similar learning already exists, increment its counter instead

## Logging Conventions

When adding debug logging, use this format for easy identification and removal:

```typescript
console.log('[DEBUG] <location>:', <value>);
console.log('[DEBUG] before <operation>:', { input, state });
console.log('[DEBUG] after <operation>:', { result, state });
```

## Example Debug Session

```typescript
// Added logging example:
console.log('[DEBUG] getEnvironments input:', { workspaceFolder });
const envs = await manager.getEnvironments(workspaceFolder);
console.log('[DEBUG] getEnvironments result:', { count: envs.length, envs });
```

## Notes

- Prefer targeted logging over flooding the output
- Start with the failing assertion and work backwards
- Consider async timing issues, race conditions, and mock setup problems
- Check that mocks are returning expected values
- Verify test setup/teardown is correct
Lines changed: 125 additions & 0 deletions
@@ -0,0 +1,125 @@
---
name: run-e2e-tests
description: Run E2E tests to verify complete user workflows like environment discovery, creation, and selection. Use this before releases or after major changes.
---

Run E2E (end-to-end) tests to verify complete user workflows work correctly.

## When to Use This Skill

- Before submitting a PR with significant changes
- After modifying environment discovery, creation, or selection logic
- Before a release to validate full workflows
- When a user reports a workflow is broken

**Note:** Run smoke tests first. If smoke tests fail, E2E tests will also fail.

## Quick Reference

| Action            | Command                                                        |
| ----------------- | -------------------------------------------------------------- |
| Run all E2E tests | `npm run compile && npm run compile-tests && npm run e2e-test` |
| Run specific test | `npm run e2e-test -- --grep "discovers"`                       |
| Debug in VS Code  | Debug panel → "E2E Tests" → F5                                 |

## How E2E Tests Work

Unlike unit tests (mocked) and smoke tests (quick checks), E2E tests:

1. Launch a real VS Code instance with the extension
2. Exercise complete user workflows via the real API
3. Verify end-to-end behavior (discovery → selection → execution)

They take longer (1-3 minutes) but catch integration issues.
## Workflow

### Step 1: Compile and Run

```bash
npm run compile && npm run compile-tests && npm run e2e-test
```

### Step 2: Interpret Results

**Pass:**

```
E2E: Environment Discovery
  ✓ Can trigger environment refresh
  ✓ Discovers at least one environment
  ✓ Environments have required properties
  ✓ Can get global environments

4 passing (45s)
```

**Fail:** Check the error message and see the Debugging Failures section below.

## Debugging Failures

| Error                        | Cause                  | Fix                                         |
| ---------------------------- | ---------------------- | ------------------------------------------- |
| `No environments discovered` | Python not installed   | Install Python, verify it's on PATH         |
| `Extension not found`        | Build failed           | Run `npm run compile`                       |
| `API not available`          | Activation error       | Debug with F5, check Debug Console          |
| `Timeout exceeded`           | Slow operation or hang | Increase timeout or check for blocking code |

For detailed debugging: Debug panel → "E2E Tests" → F5

## Prerequisites

E2E tests have system requirements:

- **Python installed** - At least one Python interpreter must be discoverable
- **Extension built** - Run `npm run compile` before tests
- **CI needs webpack build** - Run `npm run compile` (webpack) before tests, not just `npm run compile-tests` (tsc)
## Adding New E2E Tests

Create files in `src/test/e2e/` with pattern `*.e2e.test.ts`:

```typescript
import * as assert from 'assert';
import * as vscode from 'vscode';
import { waitForCondition } from '../testUtils';
import { ENVS_EXTENSION_ID } from '../constants';

suite('E2E: [Workflow Name]', function () {
    this.timeout(120_000); // 2 minutes

    let api: ExtensionApi;

    suiteSetup(async function () {
        const extension = vscode.extensions.getExtension(ENVS_EXTENSION_ID);
        assert.ok(extension, 'Extension not found');
        if (!extension.isActive) await extension.activate();
        api = extension.exports;
    });

    test('[Test description]', async function () {
        // Use the real API (flat structure, not nested!)
        // api.getEnvironments(), not api.environments.getEnvironments()
        await waitForCondition(
            async () => (await api.getEnvironments('all')).length > 0,
            60_000,
            'No environments found',
        );
    });
});
```
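
The template leans on `waitForCondition`. A minimal sketch of such a polling helper, assuming the call shape used in the template (the real implementation in `src/test/testUtils.ts` may differ):

```typescript
// Sketch of a poll-until-true helper for async test conditions. It re-checks
// the condition every `pollMs` until it returns true or `timeoutMs` elapses,
// then fails with the supplied message.
async function waitForCondition(
    condition: () => Promise<boolean>,
    timeoutMs: number,
    failureMessage: string,
    pollMs = 100,
): Promise<void> {
    const deadline = Date.now() + timeoutMs;
    while (Date.now() < deadline) {
        if (await condition()) {
            return;
        }
        await new Promise((resolve) => setTimeout(resolve, pollMs));
    }
    throw new Error(failureMessage);
}
```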
## Test Files

| File                                            | Purpose                              |
| ----------------------------------------------- | ------------------------------------ |
| `src/test/e2e/environmentDiscovery.e2e.test.ts` | Discovery workflow tests             |
| `src/test/e2e/index.ts`                         | Test runner entry point              |
| `src/test/testUtils.ts`                         | Utilities (`waitForCondition`, etc.) |

## Notes

- E2E tests are slower than smoke tests (expect 1-3 minutes)
- They may create/modify files; cleanup happens in `suiteTeardown`
- First run downloads VS Code (~100MB, cached in `.vscode-test/`)
- For more details on E2E tests and how they compare to other test types, refer to the project's testing documentation.
Lines changed: 112 additions & 0 deletions
@@ -0,0 +1,112 @@
---
name: run-integration-tests
description: Run integration tests to verify that extension components work together correctly. Use this after modifying component interactions or event handling.
---

Run integration tests to verify that multiple components (managers, API, settings) work together correctly.

## When to Use This Skill

- After modifying how components communicate (events, state sharing)
- After changing the API surface
- After modifying managers or their interactions
- When components seem out of sync (UI shows stale data, events not firing)

## Quick Reference

| Action                    | Command                                                                |
| ------------------------- | ---------------------------------------------------------------------- |
| Run all integration tests | `npm run compile && npm run compile-tests && npm run integration-test` |
| Run specific test         | `npm run integration-test -- --grep "manager"`                         |
| Debug in VS Code          | Debug panel → "Integration Tests" → F5                                 |

## How Integration Tests Work

Integration tests run in a real VS Code instance but focus on **component interactions**:

- Does the API reflect manager state?
- Do events fire when state changes?
- Do different scopes return appropriate data?

They're faster than E2E tests (which exercise full workflows) but more thorough than smoke tests.
## Workflow

### Step 1: Compile and Run

```bash
npm run compile && npm run compile-tests && npm run integration-test
```

### Step 2: Interpret Results

**Pass:**

```
Integration: Environment Manager + API
  ✓ API reflects manager state after refresh
  ✓ Different scopes return appropriate environments
  ✓ Environment objects have consistent structure

3 passing (25s)
```

**Fail:** Check the error message and see the Debugging Failures section below.

## Debugging Failures

| Error               | Cause                       | Fix                             |
| ------------------- | --------------------------- | ------------------------------- |
| `API not available` | Extension activation failed | Check Debug Console             |
| `Event not fired`   | Event wiring issue          | Check event registration        |
| `State mismatch`    | Components out of sync      | Add logging, check update paths |
| `Timeout`           | Async operation stuck       | Check for deadlocks             |

For detailed debugging: Debug panel → "Integration Tests" → F5
## Adding New Integration Tests

Create files in `src/test/integration/` with pattern `*.integration.test.ts`:

```typescript
import * as assert from 'assert';
import * as vscode from 'vscode';
import { waitForCondition, TestEventHandler } from '../testUtils';
import { ENVS_EXTENSION_ID } from '../constants';

suite('Integration: [Component A] + [Component B]', function () {
    this.timeout(120_000);

    let api: ExtensionApi;

    suiteSetup(async function () {
        const extension = vscode.extensions.getExtension(ENVS_EXTENSION_ID);
        assert.ok(extension, 'Extension not found');
        if (!extension.isActive) await extension.activate();
        api = extension.exports;
    });

    test('[Interaction test]', async function () {
        // Test component interaction
    });
});
```
## Test Files

| File                                                     | Purpose                                            |
| -------------------------------------------------------- | -------------------------------------------------- |
| `src/test/integration/envManagerApi.integration.test.ts` | Manager + API tests                                |
| `src/test/integration/index.ts`                          | Test runner entry point                            |
| `src/test/testUtils.ts`                                  | Utilities (`waitForCondition`, `TestEventHandler`) |

## Prerequisites

- **Extension built** - Run `npm run compile` before tests
- **CI needs webpack build** - Run `npm run compile` (webpack) before tests, not just `npm run compile-tests` (tsc)

## Notes

- Integration tests are faster than E2E (30s-2min vs 1-3min)
- Focus on testing component boundaries, not full user workflows
- First run downloads VS Code (~100MB, cached in `.vscode-test/`)
