Note
|
This section is a continuation of Chapter 2.
Before we get started: all of the code used in this Mongo section of the course is based on the code we created in "Chapter 2: Creating a REST API application".
We are doing this so you can see and easily compare how an application can leverage either NoSQL or SQL to implement similar features with NestJS.
By separating these NoSQL and SQL sections, you can easily choose and watch whichever section you need for your project.
If you saved your code from Chapter 2, make sure to switch back to that branch if you want to code along with us. |
Docker is a platform for developers to build, run, and share applications that live within containers. The use of containers to deploy applications is called containerization. Containerization has become increasingly popular over the years due to the many benefits it brings to the development process.
Let's go over a few key benefits. Containers are:
- Flexible,
- Lightweight,
- Portable, meaning you can build them locally, deploy to the cloud, and run anywhere.
- Loosely coupled, meaning containers are highly self-sufficient and encapsulated, allowing them to be replaced or upgraded without disrupting any other containers.
There are many other great benefits to containerization and Docker, but hopefully you get the idea so far.
Fundamentally, a container is just a running process with some added encapsulation features applied to it. This helps keep the container isolated from the host and from other containers.
In this course, we'll be using Docker to set up a database locally on your machine. If you do not have Docker installed already, go ahead and pause for a second and check out the installation instructions on the official Docker website.
In addition to Docker, we'll be utilizing Docker Compose.
"docker-compose" is a tool for defining and running multi-container Docker applications. With Docker Compose, you can use a YAML file to configure your application's services. If you're not familiar with YAML, don't worry, we'll briefly cover how we'll be setting up the file, but it's nothing too intimidating.
Once we have all of this set up, with just a single command we can create and start all the services from our configuration on any machine. Docker is going to make working with our application that much simpler in the future, not just for us, but also for other developers and different machines, and it can even help us quickly set up our application in the cloud.
For Mac and Windows users, Docker Compose is included out of the box with Docker. But if you're on Linux, there are a few additional steps you'll need to follow in order to get Docker Compose installed[1].
Make sure to check the Docker Compose documentation[2] for more information.
All right, so now that we have Docker installed, make sure that it is running on your machine before continuing.
Until now we’ve created a mock data resource for our "CoffeesService". Let’s take it up a notch and use a real database so we can really take our application to the next level.
In this chapter we are using MongoDB, which is a popular "document oriented NoSQL" database.
In the past we may have gone to the MongoDB website and installed the database locally on our machines. But let’s take advantage of Docker to handle all of this for us.
Let's use the "docker-compose" tool and its YAML file to set up everything our application needs.
First, let's create a "docker-compose.yml" file in the root directory. Then let's define a database container in YAML format.
YAML is an interesting configuration format in that space indentation and dashes all matter and do something. If you want to learn more about YAML, please check out yaml.org.
# docker-compose.yml
version: "3"
services:
  db:
    image: mongo
    restart: always
    ports:
      - "27017:27017"
    environment:
      # the official "mongo" image reads MONGO_INITDB_DATABASE (not MONGODB_DATABASE)
      MONGO_INITDB_DATABASE: nest-course
There is a lot here. If you look down below the course video, you will be able to copy what you need for this file.
The most important pieces here are that we have a "db" service that is using the Docker "image" of "mongo".
A Docker image is just a multilayered file that executes code within our Docker container. In this case it will create a Mongo database.
Next up we have "ports". We can see that we'll be using the default MongoDB port of "27017", but notice the colon ":" and the same port again.
This indicates to Docker that internally within the container the database should be set up on port "27017", and that we also have it accessible outside of Docker on the same port.
This lets us access the database that lives inside a Docker container from outside the container, on our current machine.
Lastly, we set up a "name" for our database, which we are calling "nest-course".
I know all of that might seem like a lot, but with all of that in place, we can use the "docker-compose" CLI to run the "db" service effortlessly with one command.
Let’s bring our container up using "docker-compose". In our terminal let’s enter:
$ docker-compose up -d
Creating network "iluvcoffe_default" with the default driver
Creating iluvcoffe_db_1 ... done
The "-d" flag means that we want to run the containers in detached mode, meaning they run in the background.
We only have one "service" listed in our "docker-compose.yml" file, but for future reference: if you have other services here and want to run a specific one, you can pass the name of the service you want to run.
docker-compose up -d db
Just remember that when you pass nothing, "docker-compose" will spawn all of the services defined.
Nest itself is database agnostic, allowing you to easily integrate with any SQL or NoSQL database of your choice.
There are a lot of different ways you can integrate Nest with databases, and they all depend on your personal preferences or your project's needs.
For this chapter we'll use the most popular MongoDB object modeling tool, called Mongoose.
To get started with Mongoose, let's get the necessary dependencies installed for our application. We'll need to install "mongoose" itself, as well as "@nestjs/mongoose", which helps integrate the two.
$ npm i mongoose @nestjs/mongoose
Also, for better Type safety, let's install the Mongoose TypeScript definitions as a "dev" dependency.
$ npm i -D @types/mongoose
We will continue ahead as we already have these installed. But if you are following along, just pause for a second and come back when it's finished.
All right, so you can see we have Mongoose itself, along with the TypeScript definitions for it, as well as the "@nestjs/mongoose" package, which helps simplify Nest's integration with Mongoose.
This package ships with a set of useful decorators and the "MongooseModule", which allows us to connect our application to Mongo effortlessly.
Once the installation process is complete, let's get Mongoose set up in our "iluvcoffe" app.
Let's open up our "AppModule" file and head over to the "imports: []" Array, adding "MongooseModule.forRoot()", which accepts a database "connection URI" String telling it where our Mongo database is located, and, as a second argument, a configuration Object that gets passed through to the Mongoose connect() method itself.
// app.module.ts
import { Module } from "@nestjs/common";
import { MongooseModule } from "@nestjs/mongoose";
import { AppController } from "./app.controller";
import { AppService } from "./app.service";
import { CoffeesModule } from "./coffees/coffees.module";
@Module({
imports: [CoffeesModule,
MongooseModule.forRoot("mongodb://localhost:27017/nest-course"),
],
controllers: [AppController],
providers: [AppService],
})
export class AppModule {}
You can see we are using the standard MongoDB URI scheme of "mongodb://localhost", followed by the port we set up in the "docker-compose.yml" file (which was "27017"), then a slash "/" and our MongoDB database name (which we set to "nest-course").
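The shape of that connection String can be seen more concretely in a quick sketch. This is plain TypeScript just for illustration; the host, port, and database name are the values from our "docker-compose.yml" setup.

```typescript
// Composing the Mongo connection URI from its parts (illustrative only):
const host = "localhost";        // where our Docker container is exposed
const port = 27017;              // the port we mapped in docker-compose.yml
const database = "nest-course";  // the database name we configured

const uri = `mongodb://${host}:${port}/${database}`;

console.log(uri); // "mongodb://localhost:27017/nest-course"
```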
We don't need to pass any configuration options to Mongoose, so we won't pass the second parameter.
We are all set.
Now let’s open up a terminal and start the application in development - mode.
$ npm run start:dev
[Nest] 3712922 - 04/18/2021, 7:33:48 PM [NestFactory] Starting Nest application...
[Nest] 3712922 - 04/18/2021, 7:33:48 PM [InstanceLoader] MongooseModule dependencies initialized +54ms
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[Nest] 3712922 - 04/18/2021, 7:33:48 PM [InstanceLoader] AppModule dependencies initialized +0ms
...
...
If we look carefully, we should see "MongooseModule dependencies initialized", meaning we have successfully established a connection to our Docker Mongo database.
If you are having any issues setting up the "MongooseModule" here, make sure that Docker is running with "docker-compose up -d".
Also make sure the database name inside your "MongooseModule.forRoot()" matches what you have in your "docker-compose.yml" file.
One of the most vital concepts in MongoDB is the idea of data Models.
These Models are responsible for creating, reading, updating, and deleting documents in the Mongo database.
If you are coming from a SQL background, one thing to remember about Mongo databases is that documents are stored in "collections", NOT "tables".
To create a Mongoose Model, we first have to define a Schema definition for it. Every Schema we create maps to a MongoDB collection and defines the shape of the documents within that collection.
These Schemas can be created with NestJS decorators, or manually with Mongoose itself. Whatever works best for you.
In this course we'll be using Nest decorators to create our Schemas, since they are much easier to write, greatly reduce boilerplate, and improve overall code readability.
So let's get started and create our first MongoDB Model and Schema.
Let's head over to our mock "Coffee" Entity and set it up with NestJS Mongoose.
// coffee.entity.ts
import { Schema } from "@nestjs/mongoose";
@Schema()
export class Coffee {
id: number;
name: string;
brand: string;
flavors: string[];
}
First, let's add the new "@Schema()" decorator on top, making sure to import it from "@nestjs/mongoose".
This "@Schema()" decorator maps our Coffee class to a MongoDB collection of the same name, with an additional "s" at the end. So the final Mongo collection name will be "coffees".
Mongo automatically makes all collection names plural and lowercase by default, so keep that in mind when naming your Schema classes.
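As a rough sketch of that default naming rule (the real Mongoose pluralizer also handles irregular nouns, so treat this as an approximation):

```typescript
// Naive approximation of Mongoose's default collection naming:
// lowercase the model name and append "s".
function toCollectionName(modelName: string): string {
  return modelName.toLowerCase() + "s";
}

console.log(toCollectionName("Coffee")); // "coffees"
```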
With that set up, we can now start defining the properties of our Coffee Schema.
To define properties, all we need to do is use the "@Prop()" decorator. Before we use this new decorator, let's remove the "id" property from our class, since we won't need it anymore.
// coffee.entity.ts
import { Schema, Prop, SchemaFactory } from "@nestjs/mongoose";
import { Document } from "mongoose";
@Schema()
export class Coffee extends Document {
@Prop()
name: string;
@Prop()
brand: string;
@Prop([String])
flavors: string[];
}
export const CoffeeSchema = SchemaFactory.createForClass(Coffee);
Mongoose adds an underscore-prefixed "_id" property to Schemas by default, and we want to stick with that standard best practice.
With "id" removed from our Schema, let's add the "@Prop()" decorator to the rest of our properties: "name:", "brand:", and "flavors:".
Since "flavors" is an Array of Strings, we need to do something a little different, because NestJS is not capable of reflecting the expected Type here.
Inside this "@Prop()" decorator, let's pass an Array with "String" inside it, to indicate that this property represents an Array of Strings.
Last but not least, we need to make sure we extend our Coffee Schema definition with the "Document" class from the "mongoose" package.
With all of that in place, we've created our first Schema definition.
Now that we have our definition set up, let's actually create the real Mongoose Schema.
To create the actual Mongoose Schema, let's export a "const" variable called "CoffeeSchema" (uppercase) and set it equal to "SchemaFactory.createForClass()", passing in the Coffee Schema definition we just created.
Note
|
You can also generate a raw Schema definition using the "DefinitionsFactory" class from "@nestjs/mongoose". |
This allows you to manually modify the Schema definition generated based on the metadata you provided.
This is useful for certain edge cases where it might be hard to represent everything with decorators.
All right, the last thing we need to do is wire everything up.
Since our application is now a bit more modularized, and we are dealing with this Schema inside of the "CoffeesModule", we need to make Mongoose aware of the Schema inside of THIS child Module.
To do this, let's open up our "CoffeesModule" and create an "imports: []" Array.
// coffees.module.ts
import { Module } from "@nestjs/common";
import { MongooseModule } from "@nestjs/mongoose";
import { CoffeesController } from "./coffees.controller";
import { CoffeesService } from "./coffees.service";
import {Coffee, CoffeeSchema } from "./entities/coffee.entity";
@Module({
imports: [MongooseModule.forFeature([
{
name: Coffee.name,
schema: CoffeeSchema
}
])
],
controllers: [CoffeesController],
providers: [CoffeesService],
})
export class CoffeesModule {}
Adding "MongooseModule.forFeature()", passing in an Array with an Object inside: "[{ name: Coffee.name, schema: CoffeeSchema }]".
We used "forFeature()" to register Mongoose within our child Module. Previously we used "forRoot()" in our main "AppModule", but we only do that once. Every other Module in our application will use "forFeature()" when registering Schemas and the Models based on them.
Inside of "forFeature()", we passed in an Array of Objects that each consist of two properties: "name:" (the name of the Model) and "schema:" (the Schema used to compile the Model).
Note that "Coffee.name" is just a way to get the class name from a JavaScript class, which in this case gives us the String "Coffee".
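We can see this behavior with plain TypeScript, no Nest required:

```typescript
// Every JavaScript class has a static "name" property holding the class name.
class Coffee {}

console.log(Coffee.name); // "Coffee"
```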
That's all we need to do!
Now we can fully interact with our new Coffees - "MongoDB - collection".
Our Mongoose Models let us interact with MongoDB, with each Model representing a separate collection.
In Mongo, an instance of a Model is called a "Document".
If you're familiar with SQL databases, it may help to think of a "Document" as something similar to a row.
Mongoose's "Model" class acts as an abstraction over our data source, exposing a variety of useful methods for interacting with the Documents stored in our database.
Since we've already registered "Coffee" in the scope of our "CoffeesModule", "@nestjs/mongoose" automatically generates this Model class for us, which we can use simply by injecting it into the "CoffeesService".
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
@Injectable()
export class CoffeesService {
...
...
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
...
...
}
Using the "@InjectModel()" decorator exported from the "@nestjs/mongoose" package, we can pass in "Coffee.name" to start utilizing it.
Let's make sure to give our "coffeeModel" the type "Model<Coffee>".
Previously in our "CoffeesService", we used a Coffees Array ("Coffee[]") as our mock in-memory data source. Since we are using a real database now, we no longer need this mock implementation, so let's remove it.
Now let's go through each method one by one and use our new "coffeeModel".
Note
|
Before we dive into everything here, we'll be using a JavaScript feature called "async/await" again, which makes Promises easier to manage and much more readable. |
We'll need to update most of our methods to be async in order for this to work.
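If you need a quick refresher, here is a tiny standalone sketch of async/await (the function name and value are just for illustration):

```typescript
// An async function always returns a Promise; "await" pauses until it resolves,
// letting Promise-based code read top-to-bottom.
async function loadCoffeeCount(): Promise<number> {
  return 3; // pretend this came from a database query
}

async function main() {
  const count = await loadCoffeeCount();
  console.log(count); // 3
}

main();
```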
Let's start things off by looking at the "findAll()" method.
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
findAll(){
return this.coffeeModel.find().exec();
}
...
...
}
First, let's remove the current code inside the method and update it to use our new "coffeeModel".
If we enter "this.coffeeModel." we can see all of the available methods we now have to interact with our Coffee Documents.
Since we are trying to find ALL Coffees, we are going to use the "find()" method without any arguments, and then we have to execute this Mongoose Query by calling the "exec()" method.
Next, let's update the code in our "findOne()" method.
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
findAll() {
...
...
}
async findOne(id: string) {
const coffee = await this.coffeeModel.findOne({ _id: id }).exec()
if (!coffee) {
// throw new HttpException(`Coffee #${id} not found`, HttpStatus.NOT_FOUND);
throw new NotFoundException(`Coffee #${id} not found`);
}
return coffee;
// return this.coffees.find(item => item.id === +id);
}
...
...
}
This time we're using the "findOne()" method from Mongoose; and since we are searching by "id", remember that Mongoose automatically created an "_id" property in our Schemas, so let's search by that. Again, let's make sure to execute this query by calling the "exec()" method.
For our "create()" method, let's replace the old code with the following.
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
findAll() {
...
...
}
async findOne(id: string) {
...
...
}
create(createCoffeeDto: any) {
const coffee = new this.coffeeModel(createCoffeeDto);
return coffee.save();
}
...
...
}
The first thing we need to do is create a Coffee class instance based on our partial DTO and save it to the variable "coffee".
Now all we need to do is call the "save()" method, which returns a "Promise".
Just like that, our new Document will be saved to the database!
An additional improvement we can make here is to replace "createCoffeeDto: any" with the appropriate Type, "CreateCoffeeDto", which gives us full Type safety here.
// coffees.service.ts
import { CreateCoffeeDto } from "./dto/create-coffee.dto";
...
...
create(createCoffeeDto: CreateCoffeeDto) {
// ~~~~~~~~~~~~~~
const coffee = new this.coffeeModel(createCoffeeDto);
return coffee.save();
}
...
...
Now, on to our "update()" method; let's replace our existing code with the following.
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
import { CreateCoffeeDto } from "./dto/create-coffee.dto";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
findAll() {
...
...
}
async findOne(id: string) {
...
...
}
create(createCoffeeDto: CreateCoffeeDto) {
...
...
}
async update(id: string, updateCoffeeDto: any) {
const existingCoffee = await this.coffeeModel
.findOneAndUpdate({ _id: id}, { $set: updateCoffeeDto }, { new: true })
.exec()
if (!existingCoffee) {
throw new NotFoundException(`Coffee with #${id} not found`);
}
return existingCoffee;
}
...
...
}
So let's break this down so we understand what's happening here.
First, we are going to use a new method on our "coffeeModel" called "findOneAndUpdate()". This method accepts a few different parameters:
- The first parameter is the "find" query. We want to find our Coffee based on its "_id".
- The second parameter is the update query Object (which can do a lot of different things). In our case, we're trying to update our Coffee, so let's use "$set:" and pass in our "updateCoffeeDto".
- The third parameter lets us set options for how Mongoose returns the result after the update has been executed. In our case, we want to get back the NEW, recently updated Coffee, so we pass in an Object with "{ new: true }". Had we not done this, we would receive the ORIGINAL Coffee Object (prior to our update).
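Conceptually, "$set" only overwrites the fields you provide and leaves the rest of the Document untouched, roughly a shallow merge. A plain-TypeScript sketch of that idea (not Mongoose itself; the sample data is hypothetical):

```typescript
// Simulating what "$set" does to a stored document: only the fields present
// in the update are replaced; everything else stays as it was.
interface CoffeeDoc {
  _id: string;
  name: string;
  brand: string;
}

const existing: CoffeeDoc = { _id: "abc123", name: "Old Roast", brand: "Salemba Brand" };
const updateCoffeeDto = { name: "UPDATED Roast" }; // a partial DTO

const updated: CoffeeDoc = { ...existing, ...updateCoffeeDto };

console.log(updated.name);  // "UPDATED Roast"
console.log(updated.brand); // "Salemba Brand" (untouched)
```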
After all of that, once again we need to make sure to execute our command, so let's call the "exec()" method, and that's it.
There are a lot of MongoDB nuances here, so make sure to look at the function definitions and IntelliSense provided by Mongoose to get an idea of what your options are and to help you quickly find what you are looking for.
We save the existing Coffee into the variable "existingCoffee" so we can test that we got something back from the database. If NOT, we need to make sure to throw a "NotFoundException" here.
Just like we did in the "create()" method a moment ago, we can also replace the "any" Type here with our "UpdateCoffeeDto".
// coffees.service.ts
...
...
import { UpdateCoffeeDto } from "./dto/update-coffee.dto";
...
...
async update(id: string, updateCoffeeDto: UpdateCoffeeDto) {
// ~~~~~~~~~~~~~~~
const existingCoffee = await this.coffeeModel
.findOneAndUpdate({ _id: id}, { $set: updateCoffeeDto }, { new: true })
.exec()
if (!existingCoffee) {
throw new NotFoundException(`Coffee with #${id} not found`);
}
return existingCoffee;
}
...
...
So we are fully Type safe here as well.
That's it for our "update()" method.
Next we have the "remove()" method; let's replace our existing code with the following.
// coffees.service.ts
import { Injectable, NotFoundException } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Coffee } from "./entities/coffee.entity";
import { CreateCoffeeDto } from "./dto/create-coffee.dto";
import { UpdateCoffeeDto } from "./dto/update-coffee.dto";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>){}
findAll() {
...
...
}
async findOne(id: string) {
...
...
}
create(createCoffeeDto: CreateCoffeeDto) {
...
...
}
async update(id: string, updateCoffeeDto: UpdateCoffeeDto) {
...
...
}
async remove(id: string) {
const coffee = await this.findOne(id)
return coffee.remove();
}
}
This method is much simpler, since all we need to do is call our existing "findOne()" method here in "CoffeesService" to retrieve the Document from the database.
Afterwards, we can call the "remove()" method on it, which returns a "Promise".
We don't need to throw any errors manually here if the Coffee does not exist, because the "findOne()" method automatically handles that for us.
Now that we are all set, having made all the necessary changes to our "CoffeesService", let's head over to our "CoffeesController" and change a few quick things.
Most importantly, we need to change the "id" Types from Number to String. We need to do this because "_id"s in Mongo Documents are Strings, not Numbers.
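To see why the Number Type no longer fits, note what a Mongo "_id" actually looks like: an ObjectId, serialized as a 24-character hex String (the sample id below is just an example value):

```typescript
// A Mongo ObjectId serialized as a String: 24 hexadecimal characters.
const id = "608051b70900c88889c42c2f";

const looksLikeObjectId = /^[0-9a-f]{24}$/.test(id);

console.log(typeof id);                 // "string"
console.log(looksLikeObjectId);         // true
console.log(Number.isNaN(Number(id)));  // true: it cannot be cast to a Number
```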
// coffees.controller.ts
...
...
@Controller("coffees")
export class CoffeesController {
constructor(private readonly coffeesService: CoffeesService) {}
...
...
@Get(":id")
findOne(@Param("id") id: string) {
// ~~~~~~
console.log("GET ===>", typeof id);
return this.coffeesService.findOne(id);
}
...
...
}
If we didn't change the method signature here, our "ValidationPipe" would try to automatically cast any incoming "id" to a Number, leading to lots of random errors.
With all of that in place, we've managed to set up an entire CRUD workflow for our "coffeeModel", in just a few lines of code.
Let's fire up Insomnia and test some of these Routes to make sure everything still works.
Let's make a few POST Requests to create some entries in our Coffee collection.
// request: 'POST - http://localhost:3002/coffees'
// Body - raw: JSON
{
"name": "Salemba Roast#1",
"brand": "Salemba Brand",
"flavors": ["chocolate", "vanilla"]
}
// response, 201 - CREATED
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "608051b70900c88889c42c2f",
"name": "Salemba Roast#1",
"brand": "Salemba Brand",
"__v": 0
}
/* -------------------- */
// request: 'POST - http://localhost:3002/coffees'
// Body - raw: JSON
{
"name": "Salemba Roast#2",
"brand": "Salemba Brand",
"flavors": ["chocolate", "vanilla"]
}
// response, 201 - CREATED
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "6080529494b7b58dbc6d233f",
"name": "Salemba Roast#2",
"brand": "Salemba Brand",
"__v": 0
}
/* -------------------- */
// request: 'POST - http://localhost:3002/coffees'
// Body - raw: JSON
{
"name": "Salemba Roast#3",
"brand": "Salemba Brand",
"flavors": ["robusta", "tubruk"]
}
// response, 201 - CREATED
{
"flavors": [
"robusta",
"tubruk"
],
"_id": "608052e494b7b58dbc6d2340",
"name": "Salemba Roast#3",
"brand": "Salemba Brand",
"__v": 0
}
Now that we have a few entries in our collection, let's make a GET Request to "/coffees" and see if we get data back from the database.
// request: 'GET - http://localhost:3002/coffees'
// Body - raw: JSON
{}
// response, 200 - OK
[
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "608051b70900c88889c42c2f",
"name": "Salemba Roast#1",
"brand": "Salemba Brand",
"__v": 0
},
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "6080529494b7b58dbc6d233f",
"name": "Salemba Roast#2",
"brand": "Salemba Brand",
"__v": 0
},
{
"flavors": [
"robusta",
"tubruk"
],
"_id": "608052e494b7b58dbc6d2340",
"name": "Salemba Roast#3",
"brand": "Salemba Brand",
"__v": 0
}
]
Great. Everything worked so far!.
Next, let's test the PATCH endpoint and update the first Coffee by changing the endpoint we want to hit to "/coffees/{object_id}", and let's just pass an Object with a different "name:" in the Body Payload.
// request: 'PATCH - http://localhost:3002/coffees/608051b70900c88889c42c2f'
// Body - raw: JSON
{
"name": "UPDATED COFEE Roast#1"
}
// response, 200 - OK
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "608051b70900c88889c42c2f",
"name": "UPDATED COFEE Roast#1",
"brand": "Salemba Brand",
"__v": 0
}
It worked!
Our PATCH Request actually returned the newly updated entity back from the database for us.
Just to make sure it really did make those changes, let's do a GET Request for "/coffees/{object_id}" to double check.
// request: 'GET - http://localhost:3002/coffees/608051b70900c88889c42c2f'
// Body - raw: JSON
{}
// response, 200 - OK
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "608051b70900c88889c42c2f",
"name": "UPDATED COFEE Roast#1",
"brand": "Salemba Brand",
"__v": 0
}
Perfect! It really did update the "name:" of that Coffee!
Lastly, let’s make sure DELETE works and delete this item.
// request: 'DELETE - http://localhost:3002/coffees/608051b70900c88889c42c2f'
// Body - raw: JSON
{}
// response, 200 - OK
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "608051b70900c88889c42c2f",
"name": "UPDATED COFEE Roast#1",
"brand": "Salemba Brand",
"__v": 0
}
If we switch Insomnia to DELETE for the same "_id" and push send, everything is OK.
Let’s do one more GET - Request to make sure it’s gone.
// request: 'GET - http://localhost:3002/coffees/608051b70900c88889c42c2f'
// Body - raw: JSON
{}
// response, 404 - NOT FOUND
{
"statusCode": 404,
"message": "Coffee #608051b70900c88889c42c2f not found",
"error": "Not Found"
}
It worked. We get a "404 - NOT FOUND".
The item is really gone now.
Just like that, we have everything working perfectly. We are using a real database now, and we've managed to hit a lot of the fundamental aspects of Mongoose and integrating it with our Nest application.
In a previous Chapter 2 lesson, we set up our GET endpoint to take in pagination parameters in our "CoffeesController", but we didn't actually implement them, especially within the context of Mongoose.
// coffees.controller.ts
...
...
@Controller("coffees")
export class CoffeesController {
constructor(private readonly coffeesService: CoffeesService) {}
@Get()
findAll(@Query() paginationQuery) {
// ~~~~~~~~~~~~~~~
const { limit, offset } = paginationQuery;
return this.coffeesService.findAll();
}
...
...
}
Let's take a look at how we can apply "limit" and "offset" to our "CoffeesService" "findAll()" method, and retrieve THAT selection of Coffees from the database.
First, let's generate a class for our pagination "Query DTO". We haven't done this before because the "paginationQuery" parameter was merely used for demonstration purposes.
So using the CLI, let’s create it by entering
$ nest g class common/dto/pagination-query.dto --no-spec
CREATE src/common/dto/pagination-query.dto.ts (35 bytes)
Note
|
We generated this class in a new directory named "/common", where we can keep things that are not tied to any specific domain and can be reused by multiple Controllers. |
Let’s open up this new "PaginationQueryDto".
// pagination-query.dto.ts
import { Type } from "class-transformer";
import { IsOptional, IsPositive } from "class-validator";

export class PaginationQueryDto {
  @IsOptional()
  @IsPositive()
  @Type(() => Number)
  limit: number;

  @IsOptional()
  @IsPositive()
  @Type(() => Number)
  offset: number;
}
First, let's set two properties, "limit:" and "offset:", both of which are of Type Number.
Now let's add a few decorators to both of these properties. The first is the "@Type()" decorator, which comes from the "class-transformer" package.
On this line we are making sure that the value coming in is parsed to a Number.
You might remember from previous lessons that "queryParams" are sent over the network as Strings. This helps solve that automatically for us.
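A quick plain-TypeScript sketch of the problem and of the conversion "@Type(() => Number)" performs for us (the "rawQuery" Object is hypothetical; in a real Request it is parsed from the URL):

```typescript
// Query parameters like "?limit=20&offset=10" arrive as Strings.
const rawQuery: Record<string, string> = { limit: "20", offset: "10" };

// Without conversion, arithmetic goes wrong: "20" + "10" concatenates.
console.log(rawQuery.limit + rawQuery.offset); // "2010"

// The conversion class-transformer performs for us, conceptually:
const limit = Number(rawQuery.limit);
const offset = Number(rawQuery.offset);

console.log(limit + offset); // 30
```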
The next decorator we'll use is "@IsOptional()", from "class-validator".
As the name implies, this decorator marks the property as optional, meaning that no errors will be thrown if it is missing or undefined.
Lastly, we've added the "@IsPositive()" decorator, also from "class-validator", which checks that the value is a positive number greater than zero.
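The check "@IsPositive()" performs boils down to something like this sketch (the real validator has more nuance, so treat this as an approximation):

```typescript
// A positive number is strictly greater than zero; 0 and negatives fail.
function isPositive(value: unknown): boolean {
  return typeof value === "number" && value > 0;
}

console.log(isPositive(5));  // true
console.log(isPositive(0));  // false
console.log(isPositive(-3)); // false
```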
One thing to remember about that first "@Type()" decorator we added: we could alternatively enable implicit Type conversion on a global level, instead of doing this manually.
We do this by adding the "transformOptions: {}" Object to our "ValidationPipe", with "enableImplicitConversion" set to "true".
// main.ts
...
...
async function bootstrap() {
const app = await NestFactory.create(AppModule);
app.useGlobalPipes(
new ValidationPipe({
whitelist: true,
transform: true,
forbidNonWhitelisted: true,
transformOptions: { // <<<
enableImplicitConversion: true, // <<<
}
}),
);
await app.listen(3002);
// console.log("app is run on port: 3002");
}
bootstrap();
With these options set in our "ValidationPipe", we NO longer have to explicitly specify Types with the "@Type()" decorator.
So let's use this, and remove the "@Type()" decorator from our "PaginationQueryDto".
// pagination-query.dto.ts
import { IsOptional, IsPositive } from "class-validator";

export class PaginationQueryDto {
  @IsOptional()
  @IsPositive()
  limit: number;

  @IsOptional()
  @IsPositive()
  offset: number;
}
Now that we have all of that in place, let's navigate to the "CoffeesController" and give that "paginationQuery" parameter our new DTO Type: "paginationQuery: PaginationQueryDto" in the "findAll()" method signature.
// coffees.controller.ts
...
...
import { PaginationQueryDto } from "../common/dto/pagination-query.dto";
@Controller("coffees")
export class CoffeesController {
constructor(private readonly coffeesService: CoffeesService) {}
@Get()
findAll(@Query() paginationQuery: PaginationQueryDto) {
// ~~~~~~~~~~~~~~~~~
const { limit, offset } = paginationQuery;
return this.coffeesService.findAll(paginationQuery);
}
...
...
}
Then, let’s make sure we are passing this "paginationQuery"
- argument to the
"CoffeesService.findAll(/here/)"
- method. We’ll need it for Mongoose.
Let’s jump into this "findAll()" - method in our "CoffeesService" and update the input arguments there as well.
// coffees.service.ts
...
...
import { PaginationQueryDto } from "../common/dto/pagination-query.dto";
@Injectable()
export class CoffeesService {
constructor(@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>) {}
findAll(paginationQuery: PaginationQueryDto) {
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
const { limit, offset } = paginationQuery;
return this.coffeeModel.find().skip(offset).limit(limit).exec();
// ~~~~~~~~~~~ ~~~~~~~~~~~
}
...
...
}
Great, so we are using real DTOs now, and we are passing those "limit" and "offset" - values down.
Now let’s update our Mongoose "coffeeModel.find()" - method to utilize them.
Mongoose allows you to chain method calls, so right after our "find()", we can still manipulate our "Query" before it’s actually executed.
If you type " . " after "find()", you will see that IntelliSense offers a variety of other methods we could use. But in our case we are looking for "skip()" and "limit()".
"skip()" of course means how many items should be offset, and "limit()" means how many items should be returned.
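To see the semantics of "skip()" and "limit()" in isolation, here is a small, purely illustrative helper (the function and sample data are made up, not part of our application) that mirrors them on a plain in-memory array:

```typescript
// Mirrors Mongoose's skip()/limit() semantics on a plain array:
// skip "offset" items first, then return at most "limit" items.
function paginate<T>(items: T[], offset = 0, limit = items.length): T[] {
  return items.slice(offset, offset + limit);
}

const coffees = ["Roast#1", "Roast#2", "Roast#3"];
paginate(coffees, 0, 1); // -> ["Roast#1"]
paginate(coffees, 1); // -> ["Roast#2", "Roast#3"]
```

This is exactly what we will observe from the real endpoint below: "limit=1" returns one item, and "offset=1" skips the first item and returns the rest.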
Just like that, we’ve set up REAL pagination with our Mongo - database.
So let’s open up Insomnia and test this out, and see if it really works.
Let’s call our /coffees
- GET - endpoint, but this time let’s make sure we
pass in some "Query - parameters".
First let’s pass in ?limit=1
in our URL - Query. This should only return
1 result if everything goes well. Let’s push SEND, and try it out.
// request: 'GET - http://localhost:3002/coffees?limit=1'
// Body - raw: JSON
{}
// response, 200 - OK
[
{
"flavors": [
"chocolate",
"vanilla"
],
"_id": "6080529494b7b58dbc6d233f",
"name": "Salemba Roast#2",
"brand": "Salemba Brand",
"__v": 0
}
]
We received just ONE - Coffee back from our database.
Now let’s remove "limit=1"
and try "offset=1"
instead, and push SEND again.
// request: 'GET - http://localhost:3002/coffees?offset=1'
// Body - raw: JSON
{}
// response, 200 - OK
[
{
"flavors": [
"robusta",
"tubruk"
],
"_id": "608052e494b7b58dbc6d2340",
"name": "Salemba Roast#3",
"brand": "Salemba Brand",
"__v": 0
}
]
Just like that, we skipped the first Coffee we just saw and got ALL the rest back from the database!
Now let’s try setting "offset=0", and push SEND again.
// request: 'GET - http://localhost:3002/coffees?offset=0'
// Body - raw: JSON
{}
// response, 400 - BAD REQUEST
{
"statusCode": 400,
"message": [
"offset must be a positive number"
],
"error": "Bad Request"
}
Great, our validation works! Since "0" is not a positive number, the "@IsPositive()" - check rejected it.
Let’s say that a new business requirement comes in for our application, and the Product team want users to have the ability to "recommend" different Coffees, and whenever that occurs - we need to add a new Event to the database that can be used later for data analytics purposes.
So we are going to need two things here.
-
First, we have to provide a "new - endpoint" that allows users to recommend coffees.
-
Second, we are going to need to store the Event -AFTER- the previous call finishes.
In order for this whole process to succeed, we need BOTH operations to be successful! Otherwise we may have inconsistencies in our database. This is where "Transactions" come in.
A "Database - transaction" symbolizes a unit of work performed within a "database management system" (DBMS). Transactions are a reliable way to accomplish multiple tasks independently of other transactions.
Note
|
There are other advanced techniques which leverage Nest scoped Providers and Interceptors to automatically wrap every "write" - Query in a transaction. We’ll cover those techniques in later, more advanced Nest - courses. |
For now let’s keep it simple so we can conceptually understand everything first.
Before we get started, let’s create a new "Event - entity" that we’ll use to signify Events that occur in our application.
Let’s create this new Entity with the CLI and put it in the "events/" - folder. In a real-world application we would want to create and wrap this entire domain in a Nest - Module, so that everything pertaining to Events: such as Services, Controllers, etc, is isolated.
But for now we’ll skip that as we don’t need it at the moment.
Just a reminder that we are passing the "--no-spec" - flag again, so that the CLI does not generate a test file, as we won’t need one for Entities.
$ nest g class events/entities/event.entity --no-spec
CREATE src/events/entities/event.entity.ts (28 bytes)
Now, let’s open our newly generated file and define corresponding properties.
Also - and this one is important - let’s make sure that the name of our class is "Event" WITHOUT the "entity" - suffix, as we don’t want the word "entity" in our database "collection" name.
// event.entity.ts
import { Prop, Schema, SchemaFactory } from "@nestjs/mongoose";
import * as mongoose from "mongoose";

@Schema()
export class Event extends mongoose.Document {
  @Prop()
  type: string;

  @Prop()
  name: string;

  @Prop(mongoose.SchemaTypes.Mixed)
  payload: Record<string, any>;
}

export const EventSchema = SchemaFactory.createForClass(Event);
For our Event - Schema, let’s keep it simple and add a few basic properties: "type", "name", and "payload".
"payload" is a generic property where we will store Event - Payloads. These payloads will essentially be a Dictionary filled with values of "any" - Type.
For our Schema, we are going to use the "@Prop()" - decorator we’ve already seen, but with one new addition here for "payload:".
Since this "payload:" has values of "any" - Type, Mongoose offers a "Mixed" ("mongoose.SchemaTypes.Mixed") Type we can use here. That essentially means "anything goes".
Use this technique as a last resort, and whenever possible try to have your Types always be known.
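To illustrate why known Types are preferable, here is a small hypothetical comparison ("RecommendPayload" is a made-up interface, not part of our application yet): with "any"-typed values the compiler silently accepts a typo’d key, while a concrete interface would catch it at compile time.

```typescript
// Hypothetical concrete type for a recommendation payload.
interface RecommendPayload {
  coffeeId: string;
}

// With "any"-typed values, a typo like "cofeeId" compiles silently...
const loose: Record<string, any> = { cofeeId: "abc" };

// ...while with a concrete type, the same typo would be a compile-time error.
const strict: RecommendPayload = { coffeeId: "abc" };
```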
Lastly, we need to make sure we extend our Event - Schema Definition with the "Document" - class from the "mongoose" - package.
Now that we have our class set up, let’s create a real Mongo - Schema. For this, let’s import the "SchemaFactory" - class from "@nestjs/mongoose" and call the "createForClass()" - method, as we did in previous lessons, to generate a "Schema" based on our Event - class.
Just like when we made our CoffeeSchema, let’s not forget to add "Event" to the "MongooseModule.forFeature([])" - array in the "CoffeesModule".
// coffees.module.ts
...
...
import { Event, EventSchema } from "../events/entities/event.entity";
@Module({
imports: [
MongooseModule.forFeature([
{
name: Coffee.name,
schema: CoffeeSchema,
},
{
name: Event.name, // <<<
schema: EventSchema, // <<<
},
]),
],
controllers: [CoffeesController],
providers: [CoffeesService],
})
export class CoffeesModule {}
Next, let’s make sure we add this new "recommendations" - property we talked about to our "CoffeeSchema" - Definition.
// coffee.entity.ts
import { Schema, Prop, SchemaFactory } from "@nestjs/mongoose";
import { Document } from "mongoose";
@Schema()
export class Coffee extends Document {
@Prop()
name: string;
@Prop()
brand: string;
@Prop({ default: 0 }) // <<<
recommendations: number; // <<<
@Prop([String])
flavors: string[];
}
export const CoffeeSchema = SchemaFactory.createForClass(Coffee);
Great, now that we have our "Event" and "Coffee" - classes set up, let’s get back to our "CoffeesService".
In order to create "Transactions", we are going to use the "Connection" - Object from "mongoose". Let’s inject the "connection" into our "CoffeesService" - constructor using the "@InjectConnection()" - decorator exported from the "@nestjs/mongoose" - package.
// coffees.service.ts
...
...
import { InjectModel, InjectConnection } from "@nestjs/mongoose";
import { Model, Connection } from "mongoose";
import { Event } from "../events/entities/event.entity";
...
...
@Injectable()
export class CoffeesService {
constructor(
@InjectModel(Coffee.name) private readonly coffeeModel: Model<Coffee>,
@InjectConnection() private readonly connection: Connection, // <<<
@InjectModel(Event.name) private readonly eventModel: Model<Event>, // <<<
) {}
...
...
}
For our second dependency, let’s inject the "Event" - Model we just made, again using "@InjectModel()" as we did in previous lessons.
Now, let’s use the "Connection" - Object to create our first Transaction!
Let’s start by creating a new method in our service: "async recommendCoffee(coffee: Coffee)", with one parameter, a Coffee - instance.
// coffees.service.ts
...
...
import { InjectModel, InjectConnection } from "@nestjs/mongoose";
import { Model, Connection } from "mongoose";
import { Event } from "../events/entities/event.entity";
...
...
@Injectable()
export class CoffeesService {
constructor(
...
...
) {}
...
...
  async recommendCoffee(coffee: Coffee) {
const session = await this.connection.startSession();
session.startTransaction();
try {
coffee.recommendations++;
const recommendEvent = new this.eventModel({
name: "recommend_coffee",
type: "coffee",
payload: { coffeeId: coffee.id },
});
await recommendEvent.save({ session });
await coffee.save({ session });
await session.commitTransaction();
} catch (err) {
await session.abortTransaction();
} finally {
session.endSession();
}
}
}
Inside our new method, we can see we are first creating a Mongo - "Session". Once we have it established, we can start the Transaction process.
We want to wrap our entire Transaction - code in a "try/catch/finally" - block, just to make sure that if anything goes wrong, our "catch" - statement can roll back the entire Transaction!
Inside our "try {}" - block, you can see we are increasing the "recommendations" - property on our Coffee, creating a new "recommend_coffee" - Event, and then using the "save()" - method on BOTH documents, to save -all- of our changes to the database.
In our "catch {}" - block, if anything goes wrong saving -EITHER- Document, we "catch" the error and prevent inconsistencies in our database by rolling back the entire Transaction.
Lastly, the "finally {}" - block is used to make sure we end or close the "session" once everything has finished.
That’s it! Just like that, we can create Transaction - statements that help us perform multiple actions on our database, ensuring they only happen if everything is successful!
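The all-or-nothing behavior described above can be sketched without Mongoose at all. In this hypothetical, framework-free example (the "Store" type and "recommend" function are made up for illustration), a draft copy plays the role of the session: it only "commits" when every step succeeds, and is discarded otherwise.

```typescript
// In-memory stand-in for our two writes: the Coffee counter and the
// Event log must change together, or not at all.
type Store = { recommendations: number; events: string[] };

function recommend(store: Store, failEventWrite = false): Store {
  // Work on a draft (our stand-in for the Mongo session)...
  const draft: Store = {
    recommendations: store.recommendations + 1,
    events: [...store.events],
  };
  try {
    if (failEventWrite) throw new Error("event write failed");
    draft.events.push("recommend_coffee");
    return draft; // commit: both changes become visible together
  } catch {
    return store; // abort: the original state is left untouched
  }
}
```

If the Event write fails, the recommendation count is not increased either — exactly the consistency guarantee the real Transaction gives us.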
Indexes are special lookup tables that our database search engine can use to speed up data retrieval. Without Indexes, Mongo must perform a "Collection - scan", meaning it must scan every Document in a Collection to select those Documents that match the Query - statement.
With Mongoose, we can define these Indexes within our Schema at either the "FIELD" - level or the "SCHEMA" - level.
Let’s say in our application that a very common search request is retrieving an Event based on its "name". To help speed up this search, we can define "{ index: true }" on the "name" - property.
// event.entity.ts
import { Prop, Schema, SchemaFactory } from "@nestjs/mongoose";
import * as mongoose from "mongoose";
@Schema()
export class Event extends mongoose.Document {
@Prop()
type: string;
@Prop({ index: true }) // <<<
name: string;
@Prop(mongoose.SchemaTypes.Mixed)
payload: Record<string, any>;
}
export const EventSchema = SchemaFactory.createForClass(Event);
EventSchema.index({ name: 1, type: -1 });
In more advanced cases, we may want to define "compound indexes", where a single Index references "multiple properties".
We can do this by calling the "index()" - method on the "EventSchema" - Object ("EventSchema.index()"), and passing an Object that consists of the fields that will form this Index.
In this example, we passed a value of "1"
to "name:"
which specifies that
the "index" should order these items in an "Ascending" - order.
We passed "type:"
a value of "-1"
, which specifies that the index should
order these items in "Descending" - order.
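As a plain-TypeScript illustration (this is NOT Mongo’s internals — the "Ev" type, sample data, and comparator are made up), the ordering such a "{ name: 1, type: -1 }" compound index maintains corresponds to this comparator:

```typescript
type Ev = { name: string; type: string };

// Ascending by "name" (1); for equal names, descending by "type" (-1).
function compoundOrder(a: Ev, b: Ev): number {
  return a.name.localeCompare(b.name) || b.type.localeCompare(a.type);
}

const events: Ev[] = [
  { name: "recommend_coffee", type: "coffee" },
  { name: "add_coffee", type: "coffee" },
  { name: "recommend_coffee", type: "tea" },
];
events.sort(compoundOrder);
// "add_coffee" sorts first; for the equal names, "tea" sorts before "coffee"
```

Queries that filter or sort by "name" (and optionally "type") can walk this precomputed order instead of scanning the whole Collection.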