supports test-results panel and refactor output settings #1087

Merged
merged 13 commits
Nov 2, 2023
update dependency, adjust verbose message, and upgrade settings
connectdotz committed Nov 1, 2023
commit 39061d201d2df39c21d122c24c3316265891cfae
4 changes: 4 additions & 0 deletions .vscode/settings.json
@@ -18,4 +18,8 @@
"cSpell.words": [
"unmock"
],
"testing.openTesting": "neverOpen",
"jest.outputConfig": {
"clearOnRun": "terminal"
}
}
43 changes: 28 additions & 15 deletions README.md
@@ -358,7 +358,7 @@ for example:

#### outputConfig

The `outputConfig` controls the Jest output experience by specifying when and where to create, display, and clear the output content. It supports 2 output panels: `TEST RESULTS` and `TERMINAL`. The `TEST RESULTS` panel displays test results in the order they were run, while the `TERMINAL` panel organizes outputs by workspace folder. `TERMINAL` panel also contains the non test run outputs, such as [quick-fix link](quick-fix-chooser), extension auto-config info and tips.
The `outputConfig` controls the Jest output experience by specifying when and where to create, display, and clear the output content. It supports 2 output panels: `TEST RESULTS` and `TERMINAL`. The `TEST RESULTS` panel displays test results in the order they were run, while the `TERMINAL` panel organizes outputs by workspace folder. `TERMINAL` panel also contains the non-test run outputs, such as [quick-fix link](quick-fix-chooser), extension auto-config info and tips.

**Type Definitions**
```ts
@@ -399,14 +399,26 @@ This setting can be one of the predefined types or a custom object.

**Handling Conflicts with "TEST RESULTS" panel**
<a id="outputconfig-conflict"></a>
It's important to note that outputConfig settings can conflict with the default settings provided by the VSCode testing framework. The issue arises specifically with the `"testing.openTesting"` setting, which is set to "openOnTestStart" by default.

Unless this setting is set to "neverOpen," it will automatically focus on the "TEST RESULTS" panel during various phases of testing, as dictated by the `"testing.openTesting"` setting. This behavior could potentially clash with your `"jest.outputConfig"` settings, especially when "revealWithFocus" is NOT set to "test-results".
_The Problem_

To resolve this, the extension offers built-in validation with quick-fix actions:
The behavior of the "TEST RESULTS" panel is influenced by VSCode's native `"testing.openTesting"` setting. This can cause inconsistencies with your `"jest.outputConfig"` settings.

1. Option 1: Set `"testing.openTesting": "neverOpen"` to let the extension manage the panel using `"jest.outputConfig"`.
2. Option 2: Set `"jest.outputConfig": {..."revealWithFocus": "test-results"}` and let the VSCode testing framework manage the panel with `"testing.openTesting"` independently.
For instance, if you set `"jest.outputConfig": {"revealWithFocus": "none"}` to prevent automatic focus changes, but leave `"testing.openTesting"` at its default value of `"openOnTestStart"`, the "TEST RESULTS" panel will still automatically switch focus whenever tests run.
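To make the clash concrete, that combination corresponds to settings like the following (a sketch; `"openOnTestStart"` is the default and usually won't appear explicitly in your `settings.json`):

```json
// sketch: these two settings conflict -- VSCode still switches focus to the
// "TEST RESULTS" panel on every test run, despite "revealWithFocus": "none"
"testing.openTesting": "openOnTestStart",
"jest.outputConfig": { "revealWithFocus": "none" }
```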

_The Universal Solution_

For a consistent Jest output experience, the simplest solution is to set `"testing.openTesting": "neverOpen"`. This allows the extension to manage the "TEST RESULTS" and "TERMINAL" panels together using `"jest.outputConfig"` alone.
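A minimal sketch of this recommended setup (the `outputConfig` shown is just one possible choice):

```json
// "neverOpen" keeps VSCode from forcing focus, so jest.outputConfig alone
// decides when and how the panels are revealed
"testing.openTesting": "neverOpen",
"jest.outputConfig": { "revealWithFocus": "terminal" }
```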

_Further Customization_

However, if you prefer the "TEST RESULTS" and "TERMINAL" panels to behave differently and don't mind managing the two settings yourself, you can experiment with different combinations.

For instance, if `"testing.openTesting"` is set to `"openOnTestFailure"` and you want the terminal panel to still reveal whenever tests run, your setting would look like this: `"jest.outputConfig": {"revealWithFocus": "terminal"}`.
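Written out in `settings.json`, that combination might look like this (a sketch):

```json
// "TEST RESULTS" panel: managed by VSCode, opens only when tests fail
"testing.openTesting": "openOnTestFailure",
// "TERMINAL" panel: managed by the extension, revealed with focus on test runs
"jest.outputConfig": { "revealWithFocus": "terminal" }
```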

_Built-in Validation_

The extension also features built-in conflict detection and quick fixes to assist.

**Examples**
- Choose a passive output experience that is identical to the previous version.
@@ -432,7 +444,7 @@ To resolve this, the extension offers built-in validation with quick-fix actions
"testing.openTesting": "openOnTestFailure",
"jest.outputConfig": {
"revealOn": "error",
"revealWithFocus": "none"
"revealWithFocus": "test-results"
}
```
- Clear the terminal output on each run but do not automatically switch focus to any panel.
@@ -448,14 +460,14 @@ To resolve this, the extension offers built-in validation with quick-fix actions

Migrating to the new `"jest.outputConfig"` can require some manual adjustments, especially if you're working in a multi-root workspace. Here are some guidelines to help with the transition:

1. **Workspace Level vs Workspace-Folder Level**: The new `"jest.outputConfig"` is a workspace-level setting, unlike legacy settings like `"jest.autoClearTerminal"` and `"autoRevealOutput"`, which are workspace-folder level settings.
1. **Workspace Level vs Workspace-Folder Level**: The new `"jest.outputConfig"` is a workspace-level setting, unlike legacy settings like `"jest.autoClearTerminal"` and `"jest.autoRevealOutput"`, which are workspace-folder level settings.

2. **Backward Compatibility**: If no `"jest.outputConfig"` is defined in your settings.json, the extension will attempt to generate a backward-compatible outputConfig in memory. This uses the `"testing.openTesting"` setting and any legacy settings (`"jest.autoClearTerminal"`, `"jest.autoRevealOutput"`) you might have. Note that this is more straightforward for single-root workspaces.
2. **Backward Compatibility**: If no `"jest.outputConfig"` is defined in your settings.json, the extension will attempt to generate a backward-compatible outputConfig in memory. This uses the `"testing.openTesting"` setting and any legacy settings (`"jest.autoClearTerminal"`, `"jest.autoRevealOutput"`) you might have. Note that this might only work for single-root workspaces.

3. **Migration Steps**:
- Use the `"Jest: Save Current Output Config"` command from the command palette to update your settings.json.
- Review and adjust the generated outputConfig as needed.
- Finally, remove any deprecated settings from settings.json.
- (Optional) Fix the warning: the saved config does not include `"testing.openTesting"`, so you might still see the conflict warning message. Either use the "Quick Fix" action or adjust `settings.json` manually (see [handling conflict](#outputconfig-conflict)).
- Finally, remove any deprecated settings; a before/after sketch follows this list.
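For illustration, a single-root migration might look like the following before/after sketch (the legacy values are hypothetical; the config the save command actually generates depends on your current settings):

```json
// before: legacy workspace-folder settings (hypothetical values)
"jest.autoClearTerminal": true,
"jest.autoRevealOutput": "on-run",

// after "Jest: Save Current Output Config" and cleanup (sketch)
"testing.openTesting": "neverOpen",
"jest.outputConfig": {
  "revealWithFocus": "none",
  "clearOnRun": "terminal"
}
```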

By following these guidelines, you should be able to smoothly transition to using `"jest.outputConfig"`.

@@ -465,8 +477,9 @@ By following these guidelines, you should be able to smoothly transition to usin

The `runMode` controls test UX, determining when tests should run, and housing the common run-time toggles like coverage.

**runMode type**
**Type Definitions**
```ts
// typescript types
interface JestRunModeOptions {
runAllTestsOnStartup?: boolean;
coverage?: boolean;
@@ -510,7 +523,7 @@ The following are the predefined `runMode` configurations for convenience. They
|"on-demand"|run tests on-demand through UI | {type: "on-demand", revealOutput: "on-run"} |
|"deferred"|defer test run and discovery until the first on-demand run | {type: "on-demand", revealOutput: "on-run", deferred: true } |

**runMode Examples**
**Examples**
- Run jest with watch mode - the default runMode if none is specified.
```json
"jest.runMode": "watch"
@@ -554,13 +567,13 @@ While the concepts of performance and automation are generally clear, "completen

**Migration Guide**
<a id="runmode-migration"></a>
Starting from v6.1.0, if no runMode is defined in settings.json, the extension will automatically generate one using legacy settings (`autoRun`, `autoRevealOutput`, `showCoverageOnLoad`). To migrate, simply use the `"Jest: Save Current RunMode"` command from the command palette to update the setting, then remove the legacy settings.
Starting from v6.1.0, if no runMode is defined in settings.json, the extension will automatically generate one using legacy settings (`autoRun`, `autoRevealOutput`, `showCoverageOnLoad`). To migrate, simply use the `"Jest: Save Current RunMode"` command from the command palette to update the setting, then remove the deprecated settings.
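As an illustration, a legacy configuration and the runMode it might map to (hypothetical values; the command generates the equivalent of whatever your current settings are):

```json
// before: legacy settings (hypothetical values)
"jest.autoRun": "watch",
"jest.showCoverageOnLoad": true,

// after "Jest: Save Current RunMode" and removing the legacy settings (sketch)
"jest.runMode": {
  "type": "watch",
  "coverage": true
}
```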

---

#### autoRun
<div style="background-color: yellow; color: black; padding: 10px; border-radius: 5px;">
Note: As of v6.1.0, <a href="#runmode">runMode</a> has superseded autoRun. For transition details, please refer to the <a href="#runmode-migration">runMode migration</a>.
Note: As of v6.1.0, autoRun will be replaced by <a href="#runmode">runMode</a>. For transition details, please refer to the <a href="#runmode-migration">runMode migration</a>.
</div>

AutoRun controls when tests should be executed automatically.
2 changes: 1 addition & 1 deletion jest.config.js
@@ -23,7 +23,7 @@ module.exports = {
'debug',
'@babel/template',
'graceful-fs',
'@babel/core',
'@babel/types',
],
moduleNameMapper: {
'\\.(svg)$': '<rootDir>/tests/fileMock.ts',
2 changes: 1 addition & 1 deletion package.json
@@ -663,7 +663,7 @@
"dependencies": {
"istanbul-lib-coverage": "^3.2.0",
"istanbul-lib-source-maps": "^4.0.1",
"jest-editor-support": "^31.1.1"
"jest-editor-support": "^31.1.2"
},
"devDependencies": {
"@types/fs-extra": "^11.0.2",
2 changes: 1 addition & 1 deletion src/JestExt/process-listeners.ts
@@ -176,7 +176,7 @@ const CONTROL_MESSAGES =
*/
export const DEFAULT_LONG_RUN_THRESHOLD = 60000;
export class LongRunMonitor {
private timer: NodeJS.Timer | undefined;
private timer: NodeJS.Timeout | undefined;
public readonly thresholdMs: number;
constructor(private callback: () => void, private logging: Logging, option?: MonitorLongRun) {
if (option == null) {
2 changes: 1 addition & 1 deletion src/StatusBar.ts
@@ -50,7 +50,7 @@ class TypedStatusBarItem {
const { text, tooltip, backgroundColor } = options;
this.actual.text = text ?? this.actual.text;
this.actual.tooltip = tooltip ?? this.actual.tooltip;
this.actual.backgroundColor = backgroundColor ?? this.actual.backgroundColor;
this.actual.backgroundColor = backgroundColor;
}
dispose() {
this.actual.dispose();
18 changes: 6 additions & 12 deletions src/test-provider/jest-test-run.ts
@@ -62,13 +62,7 @@ export class JestTestRun implements JestExtOutput, TestRunProtocol {
const runName = `${this.name} (${this.runCount++})`;
this._run = this.createRun(runName);
if (this.verbose) {
if (this.runCount > 1) {
console.warn(
`JestTestRun "${this.name}": Recreating a TestRun for the same instance should not happen.`
);
} else {
console.log(`[${this.context.ext.workspace.name}] JestTestRun "${runName}": created.`);
}
console.log(`[${this.context.ext.workspace.name}] JestTestRun "${runName}": created.`);
}
}
return this._run;
@@ -118,24 +112,24 @@ export class JestTestRun implements JestExtOutput, TestRunProtocol {
if (!delay) {
this.processes.delete(pid);
if (this.verbose) {
console.log(`JestTestRun "${runName}": process "${pid}" ended because: ${reason}.`);
console.log(`JestTestRun "${runName}": process "${pid}" ended because: ${reason}`);
}
} else {
timeoutId = setTimeout(() => {
if (this.verbose) {
console.log(
`JestTestRun "${runName}": process "${pid}" ended after ${delay} msec delay because: ${reason}.`
`JestTestRun "${runName}": process "${pid}" ended after ${delay} msec delay because: ${reason}`
);
}
this.processes.delete(pid);
this.end({
reason: `last process "${pid}" ended after ${delay} msec delay because: ${reason}.`,
reason: `last process "${pid}" ended by ${reason}`,
});
}, delay);
this.processes.set(pid, timeoutId);
if (this.verbose) {
console.log(
`JestTestRun "${runName}": starting a ${delay} msec timer to end process "${pid}" because: ${reason}.`
`JestTestRun "${runName}": starting a ${delay} msec timer to end process "${pid}" because: ${reason}`
);
}
}
@@ -147,7 +141,7 @@ export class JestTestRun implements JestExtOutput, TestRunProtocol {
this._run.end();
this._run = undefined;
if (this.verbose) {
console.log(`JestTestRun "${runName}": ended because: ${options?.reason}.`);
console.log(`JestTestRun "${runName}": TestRun ended because: ${options?.reason}.`);
}
};
}
6 changes: 5 additions & 1 deletion src/test-provider/test-item-data.ts
@@ -279,7 +279,11 @@ export class WorkspaceRoot extends TestItemDataBase {
'debug',
`update status from run "${event.process.id}": ${event.files.length} files`
);
event.files.forEach((f) => this.addTestFile(f, (testRoot) => testRoot.discoverTest(run)));
if (event.files.length === 0) {
run.write(`No tests were run.`, `new-line`);
} else {
event.files.forEach((f) => this.addTestFile(f, (testRoot) => testRoot.discoverTest(run)));
}
run.end({ pid: event.process.id, delay: 1000, reason: 'assertions-updated' });
break;
}
10 changes: 8 additions & 2 deletions tests/StatusBar.test.ts
@@ -29,8 +29,6 @@ describe('StatusBar', () => {
let statusBar: StatusBar;
let updateSpy;
let renderSpy;
// let mockActiveSBItem;
// let mockSummarySBItem;
let createFolderItemSpy;
let createSummaryItemSpy;
let mockSummarySBItems;
@@ -234,6 +232,14 @@ describe('StatusBar', () => {
expect(mockActiveSBItems[0].tooltip).toContain('on-save');
expect(mockSummarySBItems[0].tooltip).toContain('success 1, fail 2, unknown 3');
});
it('error background will be reset upon next successful run', () => {
statusBar.bind(makeWorkspaceFolder('testSource1')).update({ state: 'exec-error' });
expect(mockActiveSBItems[0].backgroundColor.id).toEqual(
expect.stringContaining('statusBarItem.errorBackground')
);
statusBar.bind(makeWorkspaceFolder('testSource1')).update({ state: 'running' });
expect(mockActiveSBItems[0].backgroundColor).toEqual(undefined);
});
});
});
});