Dev: Testing, Continuous Integration
To execute the tests, use the karma start command.
A Jasmine spy can be used to check whether a method is called, how many times it has been called, which arguments were passed, and so on. It can even fake the response of a method call.
To create a spy on an existing object, call spyOn(object, 'method'). The spy can then be used in place of the original method on that object instance.
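As a minimal sketch (the issuesService instance and its getIssues method are hypothetical placeholders, not part of this project):
// replace getIssues with a spy and fake its return value
let getIssuesSpy = spyOn(issuesService, 'getIssues').and.returnValue([]);

// ...exercise the code under test, which calls issuesService.getIssues('all')...
issuesService.getIssues('all');

// verify how the spy was called
expect(getIssuesSpy).toHaveBeenCalled();
expect(getIssuesSpy).toHaveBeenCalledWith('all');
expect(getIssuesSpy.calls.count()).toBe(1);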
Jasmine can also be used to mock dependencies. It again uses spies, but when there is no existing object instance, a spy object needs to be created with jasmine.createSpyObj. The first argument is the name of the created object and the second argument is a string array of method names.
Usage might look like this:
let migrationIssuesSpy = jasmine.createSpyObj('MigrationIssuesService', [
'getAggregatedIssues',
'getIssuesPerFile'
])
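The spy methods then need return values before the test exercises them. A short sketch, assuming the project's RxJS setup (Observable.of requires the corresponding rxjs import):
// give the spy methods canned responses instead of real backend data
migrationIssuesSpy.getAggregatedIssues.and.returnValue(Observable.of([]));
migrationIssuesSpy.getIssuesPerFile.and.returnValue(Observable.of({}));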
When you want to mock the backend, you need to add the following provider to the providers array of TestBed.configureTestingModule:
{
provide: Http,
useFactory: (backend: ConnectionBackend, defaultOptions: BaseRequestOptions) => {
return new Http(backend, defaultOptions);
},
deps: [MockBackend, BaseRequestOptions]
}
This creates an Http object backed by MockBackend, which does not perform actual HTTP requests but instead emits each request as an event on an Observable.
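Put together, the testing module configuration might look roughly like this (a sketch; the exact provider list depends on the service under test, here FileService from the example below):
TestBed.configureTestingModule({
    providers: [
        FileService,
        MockBackend,
        BaseRequestOptions,
        {
            provide: Http,
            useFactory: (backend: ConnectionBackend, defaultOptions: BaseRequestOptions) => {
                return new Http(backend, defaultOptions);
            },
            deps: [MockBackend, BaseRequestOptions]
        }
    ]
});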
In the test method you can use inject() to get the MockBackend instance. MockBackend.connections.subscribe subscribes to an Observable that emits a new event for each HTTP request. To create a mock response, call the connection.mockRespond method. For most use cases providing JSON data is enough, so new Response(new ResponseOptions({ body: { /* any valid JSON */ } })) creates the response.
See example below:
it('Should make a POST request on backend with path', async(inject([FileService, MockBackend],
(service: FileService, mockBackend: MockBackend) => {
mockBackend.connections.subscribe((connection: MockConnection) => {
expect(connection.request.url).toEqual(Constants.REST_BASE + '/file/pathExists');
expect(connection.request.method).toEqual(RequestMethod.Post);
expect(connection.request.getBody()).toEqual('src/main/java');
connection.mockRespond(new Response(new ResponseOptions({
body: true
})));
});
service.pathExists("src/main/java").toPromise()
.then(result => {
expect(result).toEqual(true);
}, error => {
expect(false).toBeTruthy("Service call failed due to: " + error);
});
})));
Full example here: https://github.com/windup/windup-web/blob/master/ui/src/main/webapp/tests/app/file.service.spec.ts
If you need a mock backend that handles multiple requests with different responses, add if-else statements to the subscribe callback and branch on the request properties.
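For instance, a sketch that branches on the request URL and method (the specific paths and response bodies are made up for illustration):
mockBackend.connections.subscribe((connection: MockConnection) => {
    if (connection.request.url.endsWith('/file/pathExists')) {
        connection.mockRespond(new Response(new ResponseOptions({ body: true })));
    } else if (connection.request.method === RequestMethod.Get) {
        connection.mockRespond(new Response(new ResponseOptions({ body: { items: [] } })));
    } else {
        // fail fast on anything the test did not expect
        connection.mockError(new Error('Unexpected request: ' + connection.request.url));
    }
});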
The async() block is used to create a new zone.js environment for asynchronous code. For testing asynchronous code, instead of passing a () => { /* test code here */ } callback to the it() method, use the async(() => { /* test code here */ }) wrapper.
Or use the done parameter (this is arguably better, since it is universal). Example:
it('should increment x before the timeout fires', (done) => {
    let x = 0;
    setTimeout(() => {
        expect(x).toBe(1);
        done(); // call done after the async action finishes
    }, 100);
    x++;
});
For testing components, I recommend mocking the service dependencies, as in the sketch below.
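A hedged sketch of a component spec using the spy object from above (MyComponent is a hypothetical component; the real declarations and providers differ):
TestBed.configureTestingModule({
    declarations: [MyComponent],
    providers: [
        // provide the spy object instead of the real service
        { provide: MigrationIssuesService, useValue: migrationIssuesSpy }
    ]
});

let fixture = TestBed.createComponent(MyComponent);
fixture.detectChanges();
expect(migrationIssuesSpy.getAggregatedIssues).toHaveBeenCalled();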
https://angular.io/docs/ts/latest/guide/testing.html
https://jasmine.github.io/2.0/introduction.html
Screenshots of Jasmine test results: http://wonka.mw.lab.eng.bos.redhat.com/jenkins/job/windup-web-pr-builder/ws/ui/target/screenshots/
To run the e2e tests, execute mvn test -pl tests/e2e. If you want to see what is going on, edit the file tests/e2e/src/main/npm/protractor.conf.js and remove the --headless and --disable-gpu parameters from the exports.config.capabilities.chromeOptions.args array.
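The relevant part of protractor.conf.js looks roughly like this (a sketch; the surrounding options in the real file differ):
exports.config = {
    // ...
    capabilities: {
        browserName: 'chrome',
        chromeOptions: {
            // delete '--headless' and '--disable-gpu' here to watch the browser during the test run
            args: ['--headless', '--disable-gpu']
        }
    }
    // ...
};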
Visual regression tests are executed in a similar manner to the e2e tests; they just require the additional -DvisualRegression Maven parameter. So to execute them, run mvn test -pl tests/e2e -DvisualRegression.
Workflow:
- First, reference screenshots need to be made (or updated). To do that, run npm run e2e-capture-reference. You will probably have to update parameters: copy the file protractor-params.template.js to protractor-params.js and fill in the parameters (URL, username, password, and the path to a file that will be used for registering an application); see the sketch after this list.
- Then you can run the actual visual regression tests. If they fail, check the screenshots/diff directory for the diff images. I recommend running them manually before making a new publicly available build. I don't think it is a good idea to have them in the PR builder task, since they tend to produce a lot of false negatives and manual review of the screenshots is required.
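As a rough idea only, protractor-params.js could look like the sketch below; every property name here is a guess, so check protractor-params.template.js for the actual keys:
// hypothetical shape of protractor-params.js (see protractor-params.template.js for the real keys)
exports.params = {
    baseUrl: 'http://localhost:8080/',            // URL of the running application
    username: 'some-user',                        // login used by the tests
    password: 'some-password',                    // password for that login
    applicationPath: '/path/to/application.war'   // file used when registering an application
};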