
Continuing the GoBarber API

Let's continue the GoBarber application and learn some advanced techniques and best development practices.


These are my class notes from the Rocketseat Bootcamp.

Topics:

  • Lesson 18 - Configuring Multer
  • Lesson 19 - User Avatar
  • Lesson 20 - Listing service providers
  • Lesson 21 - Appointment migration and model
  • Lesson 22 - Booking a service
  • Lesson 23 - Appointment validations
  • Lesson 24 - Listing the user's appointments
  • Lesson 25 - Applying pagination
  • Lesson 26 - Listing the provider's schedule
  • Lesson 27 - Configuring MongoDB
  • Lesson 28 - Notifying new appointments
  • Lesson 29 - Listing user notifications
  • Lesson 30 - Marking notifications as read
  • Lesson 31 - Cancelling an appointment
  • Lesson 32 - Configuring Nodemailer
  • Lesson 33 - Configuring email templates
  • Lesson 34 - Configuring a queue with Redis
  • Lesson 35 - Monitoring queue failures
  • Lesson 36 - Listing available time slots
  • Lesson 37 - Virtual fields in appointments

Lesson 18 - Configuring Multer

Image upload

The user picks an image, the upload happens immediately, and the server returns the image ID. In the user registration JSON, for example, you send the image ID.

Using multer for file uploads.

When you need to send an image to the server, the request has to be multipart/form-data (a multipart form) rather than JSON.

Installing multer:

yarn add multer

Create a folder outside src to store the images: tmp/uploads — inside the tmp folder create another folder uploads, where the physical uploaded files will live.

Create a multer.js config file inside config.

import multer from 'multer';
import crypto from 'crypto';
import { extname, resolve } from 'path';

export default {
  storage: multer.diskStorage({
	// Path where the file will be saved on the server machine
    destination: resolve(__dirname, '..', '..', 'tmp', 'uploads'),
    // Generating the image name as a hash using node's native crypto lib
    filename: (req, file, cb) => {
      crypto.randomBytes(16, (err, res) => {
        if (err) return cb(err);
        return cb(null, res.toString('hex') + extname(file.originalname));
      });
    },
  }),
};
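The hashed-name generation relies only on Node's built-in crypto and path modules, so it can be tried outside multer. A minimal sketch (makeFilename is a hypothetical helper, using the synchronous variant of randomBytes):

```javascript
import crypto from 'crypto';
import { extname } from 'path';

// Same naming scheme as the multer config above:
// 16 random bytes rendered as hex, plus the original file's extension
function makeFilename(originalname) {
  return crypto.randomBytes(16).toString('hex') + extname(originalname);
}

console.log(makeFilename('avatar.png')); // e.g. '1d05508938b533ef539026149c597ed5.png'
```

The callback form in the config does the same thing without blocking the event loop.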

Then create a route:

import multer from 'multer';
import multerConfig from './config/multer';

const upload = multer(multerConfig);

routes.post('/files', upload.single('file'), (req, res) => {
  return res.json({ ok: true });
});

The route has to use the POST method, and the request body must be multipart/form-data instead of JSON.

Then add a file field to the request body and attach the file to that field.

upload.single('file') means I'll send a single file inside the file property.

This multer lib also supports sending multiple files.

End: https://github.com/tgmarinho/gobarber/tree/aula18

Lesson 19 - User Avatar

Saving file information in the database

The req attribute exposes the file variable from file uploads, which stores some information about the uploaded file:

{
  "fieldname": "file",
  "originalname": "code-hoc.png",
  "encoding": "7bit",
  "mimetype": "image/png",
  "destination": "/Users/xxx/Developer/bootcamp_rocketseat_studies/gobarber/tmp/uploads",
  "filename": "1d05508938b533ef539026149c597ed5.png",
  "path": "/Users/xxx/Developer/bootcamp_rocketseat_studies/gobarber/tmp/uploads/1d05508938b533ef539026149c597ed5.png",
  "size": 115050
}

originalname: the original name of the file on the client machine that uploaded it. filename: the new name of the image that will be saved on the server.
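The renaming destructure that maps this payload onto the files table columns can be seen in isolation (sample values taken from the payload above):

```javascript
// Trimmed stand-in for what multer exposes as req.file
const file = {
  originalname: 'code-hoc.png',
  filename: '1d05508938b533ef539026149c597ed5.png',
};

// Destructure with renaming: originalname becomes name, filename becomes path
const { originalname: name, filename: path } = file;

console.log(name); // 'code-hoc.png'
console.log(path); // '1d05508938b533ef539026149c597ed5.png'
```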

To better handle file uploads, I'll create FileController.js, which contains the logic to save file upload references in the database.

To save the file data, let's create the files table in the database by creating a migration file.

yarn sequelize migration:create --name=create-files

And finish configuring:

module.exports = {
  up: (queryInterface, Sequelize) => {
    return queryInterface.createTable('files', {
      id: {
        type: Sequelize.INTEGER,
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
      },
      name: {
        type: Sequelize.STRING,
        allowNull: false,
      },
      path: {
        type: Sequelize.STRING,
        allowNull: false,
        unique: true,
      },
      created_at: {
        type: Sequelize.DATE,
        allowNull: false,
      },
      updated_at: {
        type: Sequelize.DATE,
        allowNull: false,
      },
    });
  },

  down: queryInterface => {
    return queryInterface.dropTable('files');
  },
};

And to generate the files table in the database according to the migration, just run in the terminal:

yarn sequelize db:migrate  

Then create the File Model:

import Sequelize, { Model } from 'sequelize';

class File extends Model {
  static init(sequelize) {
    super.init(
      {
        name: Sequelize.STRING,
        path: Sequelize.STRING,
      },
      {
        sequelize,
      }
    );

    return this;
  }
}

export default File;

And register the File Model in database/index.js:

...
import File from  '../app/models/File';
...
const models = [User, File];
...

Now update FileController.js to receive the upload request data and save the file references to the database:

import File from '../models/File';

class FileController {
  async store(req, res) {
    const { originalname: name, filename: path } = req.file;

    const file = await File.create({ name, path });

    return res.json(file);
  }
}

export default new FileController();

Now, the next time the file is uploaded, the table will be populated.

Linking the user to the avatar image (user <-> files)

To create the relationship we need to add the primary key from files to users.

For this, we'll create a migration to update these tables:

yarn sequelize migration:create --name=add-avatar-field-to-users

We add the avatar_id column to the users table, referenced by the files table on the ID attribute, which is the primary key of files. When undoing the migration, just drop the avatar_id attribute from users.

  • onUpdate: 'CASCADE': when the referenced file record is updated, the change cascades to users
  • onDelete: 'SET NULL': when the avatar file is deleted, avatar_id is set to null

module.exports = {
  up: (queryInterface, Sequelize) => {
    return queryInterface.addColumn('users', 'avatar_id', {
      type: Sequelize.INTEGER,
      references: { model: 'files', key: 'id' },
      onUpdate: 'CASCADE',
      onDelete: 'SET NULL',
      allowNull: true,
    });
  },

  down: queryInterface => {
    return queryInterface.removeColumn('users', 'avatar_id');
  },
};

Just run: yarn sequelize db:migrate to apply the change to the users table.

Next, you need to relate Users with Files inside the Users Model in the code.

Adding a method to associate the two entities:

Users.js:

...
static associate(models) {
	this.belongsTo(models.File, { foreignKey: 'avatar_id' });
}
...

And inside database/index.js, add another map to run the association (only on classes that have the associate method), passing the models to associate:

models
  .map(model => model.init(this.connection))
  .map(model => model.associate && model.associate(this.connection.models));

Done — now, when saving the user, the association will happen and the ID provided in files will be set on users.

End: https://github.com/tgmarinho/gobarber/tree/aula19

Lesson 20 - Listing service providers

Let's create the listing of provider users — though it's the same User entity, we'll create a specific controller for the provider user type.

  • Created a new route for /providers
  • Created a new controller ProviderController.js
  • Fetched all users who are providers, including the relationship with Avatar, excluding unnecessary fields
  • Created a virtual url field to assemble the image URL
  • Aliased File as avatar in the response object
  • Allowed the server to serve files statically

End: https://github.com/tgmarinho/gobarber/tree/aula20

Lesson 21 - Appointment migration and model

  • Create the model and migration for the appointments table
  • Every time a user books an appointment with some service provider, a record will be generated in the appointments table
  • Relate the client user and the provider user in the appointments table
  • Register the Appointment model in the models array in database/index.js

End: https://github.com/tgmarinho/gobarber/tree/aula21

Lesson 22 - Booking a service

  • Create the booking route and controller
  • The client can only select a user who is a provider
  • Validate input data with Yup

End: https://github.com/tgmarinho/gobarber/tree/aula22

Lesson 23 - Appointment validations

  • Validate that the appointment date is in the future
  • Validate that the appointment date is not already taken for the same service provider
  • Allow scheduling only on the hour
  • Use date-fns to deal with dates
    • To install: yarn add date-fns@next
    • import { startOfHour, parseISO } from 'date-fns';
    • parseISO turns "2019-10-01T18:00:00-04:00" into a JS Date object
    • startOfHour drops minutes and seconds, returning only the hour. 18:35 becomes 18:00.
    • isBefore(x,y) checks whether the first argument's date is before the second's
  • Don't allow duplicate bookings for the same provider at the same hour.
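What the three date-fns helpers do can be mimicked with the native Date API; a sketch for intuition only (the project keeps using date-fns):

```javascript
// parseISO: an ISO-8601 string becomes a Date object
const date = new Date('2019-10-01T18:35:00-04:00');

// startOfHour: zero out minutes, seconds, and milliseconds
const hourStart = new Date(date);
hourStart.setMinutes(0, 0, 0);

// isBefore(x, y): is x chronologically before y?
const isBefore = (x, y) => x.getTime() < y.getTime();

console.log(hourStart.getMinutes()); // 0
console.log(isBefore(hourStart, date)); // true: 18:00 comes before 18:35
```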

End: https://github.com/tgmarinho/gobarber/tree/aula23

Lesson 24 - Listing the user's appointments

Show all appointments for the logged-in user, along with their service providers.

  • Create a new GET route in routes.js for the AppointmentController's index method.
  • Fetch all appointments of the logged-in user that aren't canceled, bringing in the provider user with their avatar. Ordered by date, returning only the id and date attributes of the appointment.
import Appointment from '../models/Appointment';
import User from '../models/User';
import File from '../models/File';

class AppointmentController {
  async index(req, res) {
    const appointments = await Appointment.findAll({
      where: {
        user_id: req.userId,
        canceled_at: null,
      },
      order: ['date'],
      attributes: ['id', 'date'],
      include: [
        {
          model: User,
          as: 'provider',
          attributes: ['id', 'name'],
          include: [
            {
              model: File,
              as: 'avatar',
              attributes: ['id', 'path', 'url'],
            },
          ],
        },
      ],
    });
    return res.json(appointments);
  }
}

export default new AppointmentController();

End: https://github.com/tgmarinho/gobarber/tree/aula24

Lesson 25 - Applying pagination

  • In Insomnia, use the query tab to pass parameters: page = 1, 2, n...
  • Read the page: const { page = 1 } = req.query;
  • Define a limit for how many to bring per page, and an offset for the cut:

AppointmentController.js:

...
limit:  20,
offset: (page -  1) *  20,
...

Done — since I have few records, if I pass page: 1, it returns all records; with page: 2 it returns an empty array.
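The limit/offset arithmetic generalizes to any page size; a tiny helper (hypothetical, the controller inlines the math):

```javascript
// Translate a 1-based page number into Sequelize's limit/offset pair
function paginate(page, perPage = 20) {
  return {
    limit: perPage,               // how many records per page
    offset: (page - 1) * perPage, // how many records to skip
  };
}

console.log(paginate(1)); // { limit: 20, offset: 0 }  -> records 1..20
console.log(paginate(3)); // { limit: 20, offset: 40 } -> records 41..60
```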

End: https://github.com/tgmarinho/gobarber/tree/aula25

Lesson 26 - Listing the provider's schedule

  • Show the provider's schedule in their dashboard
  • Create a new route for the provider's schedule
  • Create a new controller: ScheduleController.js
  • Check whether the logged-in user is a provider
  • Look up the schedule by date and do a parseISO
  • Fetch the logged-in provider's appointments that aren't canceled, where the date is between the start and end of the requested day.
import { startOfDay, endOfDay, parseISO } from 'date-fns';
import { Op } from 'sequelize';
import Appointment from '../models/Appointment';
import User from '../models/User';

class ScheduleController {
  async index(req, res) {
    const checkUserProvider = await User.findOne({
      where: {
        id: req.userId,
        provider: true,
      },
    });

    if (!checkUserProvider) {
      return res.status(401).json({ error: 'User is not a provider' });
    }

    const { date } = req.query;
    const parsedDate = parseISO(date);

    const appointments = await Appointment.findAll({
      where: {
        provider_id: req.userId,
        canceled_at: null,
        date: {
          [Op.between]: [startOfDay(parsedDate), endOfDay(parsedDate)],
        },
      },
      order: ['date'],
    });
    return res.json(appointments);
  }
}

export default new ScheduleController();

End: https://github.com/tgmarinho/gobarber/tree/aula26

Lesson 27 - Configuring MongoDB

  • Connect the application to a non-relational (NoSQL) database, since we'll store some unstructured data.

  • Creating a Mongo container with Docker to download and configure:

docker run --name mongobarber -p 27017:27017 -d -t mongo

To check whether Mongo is running, open http://localhost:27017/ in the browser, or run docker ps to list the running containers.

  • Install Mongoose as the ORM, similar to Sequelize for SQL:
yarn add mongoose
  • Using Mongoose

Let's initialize mongo inside database/index.js, just like postgres was initialized.

We create the mongo() function with the mongodb connection configuration. Since no username/password was created when building the container, we don't need them in the connection string — just the host. We pass the collection name, which is created as soon as the connection is established (no need to exist beforehand, unlike postgres/SQL).

import Sequelize from 'sequelize';
import mongoose from 'mongoose';
import User from '../app/models/User';
import File from '../app/models/File';
import Appointment from '../app/models/Appointment';
import databaseConfig from '../config/database';

const models = [User, File, Appointment];

class Database {
  constructor() {
    this.init();
    this.mongo();
  }

  init() {
    this.connection = new Sequelize(databaseConfig);

    models
      .map(model => model.init(this.connection))
      .map(model => model.associate && model.associate(this.connection.models));
  }

  mongo() {
    this.mongoConnection = mongoose.connect(
      'mongodb://localhost:27017/gobarber',
      {
        useNewUrlParser: true, // using a new format in the connection string
        useFindAndModify: true, // to be able to find and update records
        useUnifiedTopology: true, // DeprecationWarning showed up in the console so I'm using it, as recommended by mongo
      }
    );
  }
}

export default new Database();

End: https://github.com/tgmarinho/gobarber/tree/aula27

Lesson 28 - Notifying new appointments

  • We'll send a notification to the service provider every time they receive a new appointment, and for that we'll use MongoDB — adding notifications inside mongo.

  • Create mongo schemas, similar to table models.

  • MongoDB is schema-free: one record in a Collection may have a field that another record doesn't. This differs from SQL tables, where a record missing a certain attribute still carries that column set to null; in Mongo, the field doesn't even need to exist on the record. That's why it's schema-free and called NoSQL: no fixed structure. Mongo also doesn't have migrations.

  • Notifications don't have much structure: they store the user's ID, and notifications don't need to relate to each other, nor do we need queries on this collection (entity).

  • Creating the Notification Schema:

app/schema/Notification.js:

import mongoose from 'mongoose';

const NotificationSchema = new mongoose.Schema(
  {
    content: {
      type: String,
      required: true,
    },
    user: {
      type: Number,
      required: true,
    },
    read: {
      type: Boolean,
      required: true,
      default: false,
    },
  },
  {
    timestamps: true, // Creates created_at and update_at fields automagically
  }
);

export default mongoose.model('Notification', NotificationSchema);

And in the controller we use the Notification schema to create a record in mongo:

...
import { startOfHour, parseISO, isBefore, format } from  'date-fns';
import pt from  'date-fns/locale/pt';
import Notification from  '../schemas/Notification';
...
...
 /**
     * Notify appointment provider
     */
    const user = await User.findByPk(req.userId);
    const formattedDate = format(
      hourStart,
      "'dia' dd 'de' MMMM', às' H:mm'h'",
      {
        locale: pt,
      }
    );
    await Notification.create({
      content: `Novo agendamento de ${user.name} para o ${formattedDate}`,
      user: provider_id,
    });
...

After creating the appointment, I create a notification and store it in the database. This data is unstructured: I dump the user name and date directly into content. I don't need to relate Notification to Users or Appointments with lots of SQL joins, because the notification reflects the user's state at that moment; if they change their name later, that doesn't matter for a notification already sent. With MongoDB you gain a lot of performance and convenience precisely by avoiding huge SQL queries, plus another advantage: queries against MongoDB are written in plain JS.

End: https://github.com/tgmarinho/gobarber/tree/aula28

Lesson 29 - Listing user notifications

  • Create a GET notifications route
  • Create the controller NotificationController.js:
import Notification from '../schemas/Notification';
import User from '../models/User';

class NotificationController {
  async index(req, res) {
    /**
     * Check if loggedUser is a provider
     */
    const checkIsProvider = await User.findOne({
      where: {
        id: req.userId,
        provider: true,
      },
    });

    if (!checkIsProvider) {
      return res
        .status(401)
        .json({ error: 'Only provider can load notifications' });
    }

    const notifications = await Notification.find({
      user: req.userId,
    })
      .sort({ createdAt: 'desc' })
      .limit(20);

    return res.json(notifications);
  }
}

export default new NotificationController();

End: https://github.com/tgmarinho/gobarber/tree/aula29

Lesson 30 - Marking notifications as read

  • Create a new PUT notifications route

Using mongo's findByIdAndUpdate to look up and update records. To use it without deprecation warnings, the mongodb connection must have useFindAndModify set to false:

mongo() {
    this.mongoConnection = mongoose.connect(
      'mongodb://localhost:27017/gobarber',
      {
        useNewUrlParser: true,
        useFindAndModify: false, // Now I can use findByIdAndUpdate
        useUnifiedTopology: true,
      }
    );
  }

Understanding the query:

 const notification = await Notification.findByIdAndUpdate(
      req.params.id,
      {
        read: true,
      },
      { new: true }
    );
  • findByIdAndUpdate = Look up and update the record
  • req.params.id = the mongo record id passed as a parameter in the frontend query
  • { read: true } is the value I want to change — by default it's false, and now I want to set it to true since it's been read.
  • { new: true } returns the updated notification into the const notification

A note from Leandro Vieira in the course discussion: he hit the same deprecation warning, and his fix was to use updateOne, passing the ID as the filter (https://mongoosejs.com/docs/documents.html#updating). In a personal app he used: await post.updateOne(req.params.id, { date: updatedDate, hidden: !req.body.hidden });

End: https://github.com/tgmarinho/gobarber/tree/aula30

Lesson 31 - Cancelling an appointment

  • The user can only cancel an appointment if it's two hours before the event
  • Create a DELETE appointment route taking the id.
    • routes.delete('/appointments/:id', AppointmentController.delete);
  • Create the delete method in AppointmentController.js:
...
import { startOfHour, parseISO, isBefore, format, subHours } from 'date-fns';
...
...
async delete(req, res) {
    const appointment = await Appointment.findByPk(req.params.id);

    if (appointment.user_id !== req.userId) {
      return res.status(401).json({
        error: "You don't have permission to cancel this appointment.",
      });
    }

    // subtract two hours from the scheduled date
    const dateWithSub = subHours(appointment.date, 2);
    const NOW = new Date();
    if (isBefore(dateWithSub, NOW)) {
      return res.status(401).json({
        error: 'You can only cancel appointment 2 hours in advance.',
      });
    }

    appointment.canceled_at = NOW;

    await appointment.save();

    return res.json(appointment);
  }
  • Look up the appointment using the id passed in the query params
  • Check that the logged-in user owns the appointment
  • Subtract two hours from the scheduled date, since you can only cancel two hours before the event
  • Check whether that adjusted time is before the current time — if it is, return an error
  • Otherwise continue, setting canceled_at to the current date
  • Save the appointment and return it to the user
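The whole check reduces to one comparison; here it is with the native Date API instead of subHours/isBefore (isCancelable is an illustrative helper, not the controller code):

```javascript
// An appointment can only be cancelled until 2 hours before it starts
function isCancelable(appointmentDate, now = new Date()) {
  const TWO_HOURS_MS = 2 * 60 * 60 * 1000;
  // equivalent to date-fns' subHours(appointmentDate, 2)
  const limit = appointmentDate.getTime() - TWO_HOURS_MS;
  // equivalent to !isBefore(limit, now), up to the exact-equality edge case
  return now.getTime() < limit;
}

const now = new Date('2019-09-18T10:00:00Z');
console.log(isCancelable(new Date('2019-09-18T13:00:00Z'), now)); // true: 3h ahead
console.log(isCancelable(new Date('2019-09-18T11:00:00Z'), now)); // false: only 1h ahead
```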

End: https://github.com/tgmarinho/gobarber/tree/aula31

Lesson 32 - Configuring Nodemailer

  • We'll send an email to the service provider when an appointment is cancelled.
  • Use the nodemailer lib to send emails.
yarn add nodemailer

Email delivery services:

  • SendGrid
  • Mailgun
  • Amazon SES (Rocketseat's choice)
  • Sparkpost
  • Mandrill (Mailchimp)
  • Mailtrap (development only)

We'll use Mailtrap to send the email to a fake inbox — the email can even be a real address, but it never reaches the person's inbox. Ideal for development environments.

First step is to create an account at https://mailtrap.io/ and grab the configuration.

Create an email config file:

src/config/mail.js:

export default {
  host: 'smtp.mailtrap.io',
  port: 2525,
  secure: false,
  auth: {
    user: '109b42360028f1',
    pass: '907f3523d2a604',
  },
  default: {
    from: 'Equipe GoBarber <noreply@gobarber.com>',
  },
};

Then configure email sending using the data from mail.js to connect to the provider:

import nodemailer from 'nodemailer';
import mailConfig from '../config/mail';

class Mail {
  constructor() {
    const { host, port, secure, auth } = mailConfig;

    this.transporter = nodemailer.createTransport({
      host,
      port,
      secure,
      auth: auth.user ? auth : null,
    });
  }

  sendMail(message) {
    return this.transporter.sendMail({
      ...mailConfig.default,
      ...message,
    });
  }
}

export default new Mail();
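The spread in sendMail layers the per-message fields over the defaults from mail.js, with later keys winning; a standalone sketch of that merge:

```javascript
// Defaults, as in config/mail.js
const defaults = { from: 'Equipe GoBarber <noreply@gobarber.com>' };

// A per-message payload, as passed to Mail.sendMail (example values)
const message = {
  to: 'Provider <provider@example.com>',
  subject: 'Agendamento cancelado',
};

// Later spreads win, so a message could also override `from` if it wanted to
const finalMessage = { ...defaults, ...message };

console.log(finalMessage.from);    // 'Equipe GoBarber <noreply@gobarber.com>'
console.log(finalMessage.subject); // 'Agendamento cancelado'
```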

Finally, use it in the controller:

...
import Mail from  '../../lib/Mail';
...

// Adjust the appointment lookup to include the user:
const appointment = await Appointment.findByPk(req.params.id, {
      include: [
        {
          model: User,
          as: 'provider',
          attributes: ['name', 'email'],
        },
      ],
    });
...
// Sending email
 await Mail.sendMail({
      to: `${appointment.provider.name} <${appointment.provider.email}>`,
      subject: 'Agendamento cancelado',
      text: 'Você tem um novo cancelamento',
   });

End: https://github.com/tgmarinho/gobarber/tree/aula32

Lesson 33 - Configuring email templates

  • Create email templates with html, css
  • Use a template engine: https://handlebarsjs.com/
  • Install libs to use handlebars:
    • yarn add express-handlebars nodemailer-express-handlebars

Basically, handlebars was integrated into Mail.js to declare where the template folders are and the file extension. And in AppointmentController.js, the cancellation.hbs template is used when sending the email.

For more details see the code.

End: https://github.com/tgmarinho/gobarber/tree/aula33

Lesson 34 - Configuring a queue with Redis

When a route sends email to the user, the request takes longer to finish because it awaits the email delivery before responding. That's slow: it depends on an external email delivery service, on the network, and so on.

I could drop the await and fire the email asynchronously, making the request return faster, but then, if sending the email failed, I'd have no way to inform the user.

   Mail.sendMail({
    ...
    });

The best way to control this is with QUEUES — background jobs, running services in the background while still being able to send messages to the user.

We need a non-relational database that only stores key/value — no schemas or models. It's much more performant. We'll use Redis with Docker.

To configure https://redis.io/ on docker:

docker run --name redisbarber -p 6379:6379 -d -t redis:alpine

The alpine version is very lightweight, coming with only the most essential Linux features.

Now let's install bee-queue, a background-jobs tool for Node. It's simpler than alternatives like kue and lacks some of their features, but it's enough for this app: kue is less performant but more robust, while bee-queue handles job scheduling and retrying failed email sends, which is necessary and sufficient here, so we chose this lib.

To install bee-queue:

yarn add bee-queue

Step-by-step coding for the email-sending queue configuration:

  • First configure Redis running on the machine alongside the application:

src/config/redis.js:

export default {
  host: '127.0.0.1',
  port: 6379,
};

I set the address of the server running Redis and the port for incoming requests.

  • Then I create the JOB: CancellationMail.js, responsible for sending the email when the user cancels an appointment, and I remove the email sending from AppointmentController.js:
import { format, parseISO } from 'date-fns';
import pt from 'date-fns/locale/pt';
import Mail from '../../lib/Mail';

class CancellationMail {
  get key() {
    return 'CancellationMail';
  }

  async handle({ data }) {
    const { appointment } = data;

    await Mail.sendMail({
      to: `${appointment.provider.name} <${appointment.provider.email}>`,
      subject: 'Agendamento cancelado',
      template: 'cancellation',
      context: {
        provider: appointment.provider.name,
        user: appointment.user.name,
        date: format(
          parseISO(appointment.date),
          "'dia' dd 'de' MMMM', às' H:mm'h'",
          {
            locale: pt,
          }
        ),
      },
    });
  }
}

export default new CancellationMail();

The Job needs a key (the job's name) and the Job itself — a function we call handle, which takes several params. In this case we destructure to grab only the data for the appointment that was sent when creating the JOB inside the Queue, which we'll see in the next snippet below. The rest of the code is what was previously in AppointmentController.js's delete method.

Then we create the Queue.js class, which represents the Queue — the class that manages initializing, adding to, and processing the queue.

import Bee from 'bee-queue';
import CancellationMail from '../app/jobs/CancellationMail';
import redisConfig from '../config/redis';

const jobs = [CancellationMail];

class Queue {
  constructor() {
    this.queues = {};
    this.init();
  }

  init() {
    jobs.forEach(({ key, handle }) => {
      this.queues[key] = {
        bee: new Bee(key, {
          redis: redisConfig,
        }),
        handle,
      };
    });
  }

  add(queue, job) {
    return this.queues[queue].bee.createJob(job).save();
  }

  processQueue() {
    jobs.forEach(job => {
      const { bee, handle } = this.queues[job.key];
      bee.process(handle);
    });
  }
}

export default new Queue();

We define a jobs const that holds an array of jobs — in our case only one job, but it could have more.

In the constructor we create a queues variable, an object that stores jobs in the queue, and we call init() to iterate the jobs array and add each one to the queue: key 'CancellationMail' with a value that is a Bee instance of the JOB (also receiving the same key and Redis config), plus the handle method inside the job, which in this case sends the email.

The add method takes the queue and the job. Queue is the process name in the queue and job is the object holding the data to be processed. The method creates inside the initialized queue the job itself, passing the values to be used during the job's execution (the appointment data in this case).

The processQueue method iterates the jobs and calls the queue for processing, passing the handle method that should be executed from inside the Job.
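The wiring can be illustrated with an in-memory stand-in that keeps the same key/handle shape but swaps Bee and Redis for a plain array (a sketch only: no persistence, no retries):

```javascript
// One job, same contract as CancellationMail: a unique key plus a handle function
const jobs = [
  {
    key: 'CancellationMail',
    handle({ data }) {
      // stand-in for Mail.sendMail(...)
      return `mail sent to ${data.appointment.provider}`;
    },
  },
];

class Queue {
  constructor() {
    this.queues = {};
    // init(): one named queue per job key, as in the Bee version
    jobs.forEach(({ key, handle }) => {
      this.queues[key] = { handle, pending: [] };
    });
  }

  add(queue, data) {
    // the real version does this.queues[queue].bee.createJob(job).save()
    this.queues[queue].pending.push({ data });
  }

  processQueue(queue) {
    // the real version wires bee.process(handle); here we just drain the array
    const { handle, pending } = this.queues[queue];
    return pending.splice(0).map(job => handle(job));
  }
}

const q = new Queue();
q.add('CancellationMail', { appointment: { provider: 'John' } });
console.log(q.processQueue('CancellationMail')); // [ 'mail sent to John' ]
```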

Now we use the Job and the Queue in AppointmentController.js:

Importing the code and replacing the email sending with the Queue instance, adding to the queue the job name and the data the job should process:

...
import CancellationMail from  '../jobs/CancellationMail';
import Queue from  '../../lib/Queue';
...
await Queue.add(CancellationMail.key, { appointment });
...

We can keep the job running in a separate Node instance on the server: Create a src/queue.js file:

import Queue from  './lib/Queue';
Queue.processQueue();

Create a script in package.json to run with sucrase since we're using import/export:

"queue":  "nodemon src/queue.js"

And to run it just call: yarn queue

To test, just try to cancel an appointment again.

End: https://github.com/tgmarinho/gobarber/tree/aula34

Lesson 35 - Monitoring queue failures

To listen to job processing errors and get a log, just change the code:

processQueue() {
    jobs.forEach(job => {
      const { bee, handle } = this.queues[job.key];
      bee.on('failed', this.handleFailure).process(handle);
    });
}

handleFailure(job, err) {
 console.log(`Queue ${job.queue.name}: FAILED`, err);
}

With that, every error that happens shows up in the server log; later we'll display it in a friendlier way.
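Pulling the message formatting into its own function makes the failure log easy to exercise (formatFailure is a hypothetical helper; the lesson logs inline):

```javascript
// Build the log line for a failed bee-queue job; job.queue.name is the
// queue key and err is the error thrown by the job's handle
function formatFailure(job, err) {
  return `Queue ${job.queue.name}: FAILED (${err.message})`;
}

const fakeJob = { queue: { name: 'CancellationMail' } };
console.log(formatFailure(fakeJob, new Error('SMTP timeout')));
// Queue CancellationMail: FAILED (SMTP timeout)
```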

The bee-queue docs also have other listeners — success, etc.

End: https://github.com/tgmarinho/gobarber/tree/aula35

Lesson 36 - Listing available time slots

Create a controller to show the provider's available time slots for a given day. I want to know all the provider's available slots for a given day.

  • Create a route:
routes.get('/providers/:providerId/available', AvailableController.index);

Create a controller AvailableController.js:

import {
  startOfDay,
  endOfDay,
  setHours,
  setMinutes,
  setSeconds,
  format,
  isAfter,
} from 'date-fns';
import { Op } from 'sequelize';
import Appointment from '../models/Appointment';

class AvailableController {
  async index(req, res) {
    const { date } = req.query;

    if (!date) {
      return res.status(400).json({ error: 'Invalid date' });
    }

    const searchDate = Number(date);

    // 2019-09-18 10:49:44

    const appointments = await Appointment.findAll({
      where: {
        provider_id: req.params.providerId,
        canceled_at: null,
        date: {
          [Op.between]: [startOfDay(searchDate), endOfDay(searchDate)],
        },
      },
    });

    const schedule = [
      '08:00', // 2019-09-18 08:00:00
      '09:00', // 2019-09-18 09:00:00
      '10:00', // 2019-09-18 10:00:00
      '11:00', // ...
      '12:00',
      '13:00',
      '14:00',
      '15:00',
      '16:00',
      '17:00',
      '18:00',
      '19:00',
    ];

    const available = schedule.map(time => {
      const [hour, minute] = time.split(':');
      const value = setSeconds(
        setMinutes(setHours(searchDate, hour), minute),
        0
      );

      return {
        time,
        // format to: 2019-09-18T15:40:44-04:00
        value: format(value, "yyyy-MM-dd'T'HH:mm:ssxxx"),
        available:
          isAfter(value, new Date()) &&
          !appointments.find(a => format(a.date, 'HH:mm') === time),
      };
    });

    return res.json(available);
  }
}

export default new AvailableController();

It will only return slots that don't have an appointment and where the desired value is after the current date (i.e., you can't book a slot that's already in the past).

I read from the request the provider ID and the day the user wants to see the available slots for.

This date comes as a timestamp string from the frontend datepicker component, so it needs to be converted to a Number.

Then I fetch all appointments for the given provider that aren't cancelled and whose date is between the first and last hour of the requested day.

I create a static table of fixed time slots.

And I do the rest of the logic, returning to the user an array of slot objects like:

[
  {
    "time": "15:00",
    "value": "2019-09-18T15:00:00-04:00",
    "available": false
  },
  {
    "time": "16:00",
    "value": "2019-09-18T16:00:00-04:00",
    "available": true
  },
  {
    "time": "17:00",
    "value": "2019-09-18T17:00:00-04:00",
    "available": true
  }
]
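The availability rule can also be sketched in isolation. This is a simplified version using plain Date arithmetic instead of date-fns, with illustrative names (`isSlotAvailable` is not part of the project):

```javascript
// Simplified sketch of the availability rule: a slot is available only
// if it's in the future and no appointment is already booked at that time.
function isSlotAvailable(slotDate, bookedTimes, now = new Date()) {
  const hhmm = slotDate.toISOString().slice(11, 16); // 'HH:mm' in UTC
  return slotDate > now && !bookedTimes.includes(hhmm);
}

const now = new Date('2019-09-18T10:49:44Z');
const slot = new Date('2019-09-18T15:00:00Z');

console.log(isSlotAvailable(slot, ['15:00'], now)); // → false (already booked)
console.log(isSlotAvailable(slot, ['16:00'], now)); // → true
```

The controller does the same check with date-fns (`isAfter` plus a `find` over the day's appointments); this sketch just makes the boolean logic explicit.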

End: https://github.com/tgmarinho/gobarber/tree/aula36

Lesson 37 - Virtual fields in appointments

List extra fields so the frontend can show whether an appointment has already happened. For that we'll create the past and cancelable virtual fields — columns that don't exist in the table, only in the Model.

In Appointment.js:

import Sequelize, { Model } from 'sequelize';
import { isBefore, subHours } from 'date-fns';

class Appointment extends Model {
  static init(sequelize) {
    super.init(
      {
        date: Sequelize.DATE,
        canceled_at: Sequelize.DATE,
        past: {
          type: Sequelize.VIRTUAL,
          get() {
            return isBefore(this.date, new Date());
          },
        },
        cancelable: {
          type: Sequelize.VIRTUAL,
          get() {
            return isBefore(new Date(), subHours(this.date, 2));
          },
        },
      },
      {
        sequelize,
      }
    );

    return this;
  }

  static associate(models) {
    this.belongsTo(models.User, { foreignKey: 'user_id', as: 'user' });
    this.belongsTo(models.User, { foreignKey: 'provider_id', as: 'provider' });
  }
}

export default Appointment;

And in the index method of AppointmentController.js, pass the past and cancelable attributes:

...
attributes: ['id', 'date', 'past', 'cancelable'],
...

Now in Insomnia just run the lookup and check that the fields are listed as true or false.
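The two virtual getters can be mirrored as standalone functions — a sketch using plain Date arithmetic instead of date-fns's isBefore/subHours (the function names here are illustrative, not from the model):

```javascript
// 'past': the appointment date is before now.
function past(date, now = new Date()) {
  return date < now;
}

// 'cancelable': we're still more than 2 hours before the appointment.
function cancelable(date, now = new Date()) {
  const limit = new Date(date.getTime() - 2 * 60 * 60 * 1000); // date minus 2h
  return now < limit;
}

const now = new Date('2019-09-18T10:00:00Z');
console.log(past(new Date('2019-09-18T08:00:00Z'), now));       // → true
console.log(cancelable(new Date('2019-09-18T11:00:00Z'), now)); // → false (less than 2h away)
console.log(cancelable(new Date('2019-09-18T13:00:00Z'), now)); // → true
```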

End: https://github.com/tgmarinho/gobarber/tree/aula37

Lesson 38 - Exception Handling

Let's handle the exceptions that happen in production — in the email queue, database queries, etc.

Looking at error logs on the server is messy, painful, and time-consuming. We'll use a friendlier tool with a nice UI that makes it easy to discover the error — instead of chasing the error, the error comes to us.

There are a few tools that can help with this.

We'll use Sentry, since it has very good integration with Node.js. The advantage of this tool is that on every exception in the app we get a message on sentry.io and an email, and we can even integrate with Slack to receive a message in a project channel — or even auto-create an issue on GitHub.

  • First step: create an account at https://sentry.io/.
  • Configure the project's platform — Sentry offers both Express and Node options; since we're using Express, pick the Express option.
  • Then install the dependency in the project:
yarn add @sentry/node

Also install express-async-errors (https://www.npmjs.com/package/express-async-errors). It's needed because Express on its own can't capture exceptions thrown inside async controller methods and forward them to Sentry; installing and importing this extension fixes that.

yarn add express-async-errors
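To see why the package is needed, here's a minimal sketch (plain Node, no Express — the names are illustrative) of how a synchronous try/catch misses an error thrown inside an async handler:

```javascript
// A synchronous try/catch, like the one Express wraps handlers in,
// never sees an error thrown inside an async function: the throw
// surfaces as a rejected promise instead of a synchronous exception.
function callHandler(handler) {
  try {
    const result = handler();
    // swallow the rejection so Node doesn't abort on an unhandled rejection
    if (result && typeof result.catch === 'function') result.catch(() => {});
    return 'no error caught';
  } catch (err) {
    return `caught: ${err.message}`;
  }
}

const asyncHandler = async () => {
  throw new Error('boom');
};

console.log(callHandler(asyncHandler)); // → 'no error caught'
```

express-async-errors patches Express so that these rejected promises are routed to the error-handling middleware as well.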

And finally youch, a great tool for displaying error messages in a friendly, pretty way, either as json or html. As it describes itself: Pretty error reporting for Node.js 🚀

yarn add youch

Then create a config file, src/config/sentry.js, storing the Sentry DSN:

export default {
  dsn: 'https://xxxx_aqui_eh_meu_pega_o_seu_no@sentry.io/999999',
};

And in app.js we have to integrate Sentry and Youch into the app.

Note that the Sentry import has to be exactly as in the code below, and the express-async-errors import must come before the routes.

We have to initialize Sentry before the middlewares and routes, and we have to invoke exceptionHandler after the middlewares and routes.

And we have to call this code this.server.use(Sentry.Handlers.requestHandler()); before the routes and other middlewares.

All of this is in the Sentry docs.

exceptionHandler is an error-handling middleware — you can tell by the four parameters, the first being err. So express understands this middleware is an error handler. If any error happens in the app, this middleware is called and returns a 500 status using Youch to provide a richer message with a good UI.

import express from 'express';
import path from 'path';
import * as Sentry from '@sentry/node';
import Youch from 'youch';
import 'express-async-errors';
import routes from './routes';
import sentryConfig from './config/sentry';

import './database';

class App {
  constructor() {
    this.server = express();

    Sentry.init(sentryConfig);

    this.middlewares();
    this.routes();
    this.exceptionHandler();
  }

  middlewares() {
    this.server.use(Sentry.Handlers.requestHandler());
    this.server.use(express.json());
    this.server.use(
      '/files',
      express.static(path.resolve(__dirname, '..', 'tmp', 'uploads'))
    );
  }

  routes() {
    this.server.use(routes);
    this.server.use(Sentry.Handlers.errorHandler());
  }

  exceptionHandler() {
    this.server.use(async (err, req, res, next) => {
      const errors = await new Youch(err, req).toJSON();
      return res.status(500).json(errors);
    });
  }
}

export default new App().server;

End: https://github.com/tgmarinho/gobarber/tree/aula38

Lesson 39 - Environment variables

Create environment variables to protect sensitive data and let variables be configured per environment the app runs in.

We'll create a .env file and a .env.example. The .env must never be committed — it's specific to your environment; .env.example, as the name suggests, is an example of the variables that should be filled in. They're used in several files across the application. Non-sensitive data can stay in .env.example.

To use them we need to install dotenv, which loads variables from .env into nodejs's process.env.

For it to work we need to import the lib in the project's main file, app.js:

import 'dotenv/config';
...

and inside queue.js too.

require('dotenv/config');
...

.env.example file:

# create a .env and configure it for your environment

APP_URL=http://localhost:3333
NODE_ENV=development

# Auth

APP_SECRET=

# Database

DB_HOST=
DB_USER=
DB_PASS=
DB_NAME=

# Mongo

MONGO_URL=

# Redis

REDIS_HOST=127.0.0.1
REDIS_PORT=6379

# Mail

MAIL_HOST=
MAIL_PORT=
MAIL_SECURE=false
MAIL_USER=
MAIL_PASS=
MAIL_FROM=

# Sentry

SENTRY_DSN=

Done — now just replace where these variables are used:

Example: src/config/database.js:

require('dotenv').config();

module.exports = {
  dialect: 'postgres',
  host: process.env.DB_HOST,
  username: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: process.env.DB_NAME,
  define: {
    timestamps: true,
    underscored: true,
    underscoredAll: true,
  },
};
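With NODE_ENV now available, a common pattern (an assumption on my part, not shown in the lesson) is to return the verbose Youch payload only in development and a generic message in production. The decision can be sketched as a pure function:

```javascript
// Hypothetical helper (not from the lesson): decide what the error
// handler should return based on NODE_ENV, so detailed stack traces
// never leak outside of development.
function errorResponse(env, youchPayload) {
  return env === 'development'
    ? youchPayload
    : { error: 'Internal server error' };
}

console.log(errorResponse('production', { stack: '...' }));
// → { error: 'Internal server error' }
```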

End: https://github.com/tgmarinho/gobarber/tree/aula39

We've reached the end of the application =)

Thiago Marinho

September 18, 2019 · Brazil