Initialize the

Realworld front-end

vue3-realworld-example-app (github.com)

nuxt-realworld: Nuxt & Composition API (github.com)

Realworld backend

RealWorld-API-in-Express (github.com)

Pull the Vue 3 or Nuxt project locally

These two RealWorld projects demonstrate a full-fledged full-stack application built with Vue 3, including CRUD operations, authentication, routing, pagination, and more.

It is a blog-like website that provides basic features such as login, registration, publishing, editing and displaying articles, data pagination, and a tag list. The next step is to implement the back-end API using GraphQL.

Next, make changes based on the Apollo Server learning project to get into development faster, and start the server:

nodemon index.js

Delete the original business code in index.js and add a foo query for testing

const { ApolloServer, gql } = require('apollo-server-express')
const express = require('express')
const http = require('http')


const typeDefs = gql` type Query { foo: String } `

const resolvers = {
  Query: {
    foo() {
      return 'hello'
    }
  }
}

const app = express()

async function startApolloServer(typeDefs, resolvers) {
  const httpServer = http.createServer(app)
  const server = new ApolloServer({
    typeDefs,
    resolvers
  })
  await server.start()
  server.applyMiddleware({ app })
  await new Promise(resolve => httpServer.listen({ port: 4000 }, resolve))
  console.log(`🚀 Server ready at http://localhost:4000${server.graphqlPath}`)
}
startApolloServer(typeDefs, resolvers)

Modular project structure

The next step is a simple change to the project structure: modularize the business code so the structure stays clear as the project grows in complexity.

type-defs

Create a new type-defs/index.js, move the typeDefs code from index.js here, and export the module:

const { gql } = require('apollo-server-express')

const typeDefs = gql` type Query { foo: String } `
module.exports = typeDefs

resolvers

Create a new resolvers/index.js file, move the resolvers code from index.js here, and export it:

const resolvers = {
  Query: {
    foo() {
      return 'hello'
    }
  }
}

module.exports = resolvers

schema.js

Create a new schema.js in the root directory and import the makeExecutableSchema method from @graphql-tools/schema, which merges typeDefs and resolvers into one schema.

Next, import type-defs/index.js and resolvers/index.js, call makeExecutableSchema to get the schema, and export it:

const { makeExecutableSchema } = require('@graphql-tools/schema')

const typeDefs = require('./type-defs')
const resolvers = require('./resolvers')

const schema = makeExecutableSchema({
  typeDefs,
  resolvers
})

module.exports = schema

index.js

Go back to index.js, import this schema, remove typeDefs and resolvers from ApolloServer, and replace them with the schema. index.js now looks like this:

const { ApolloServer } = require('apollo-server-express')
const express = require('express')
const http = require('http')
const schema = require('./schema.js')


const app = express()

async function startApolloServer(schema) {
  const httpServer = http.createServer(app)
  const server = new ApolloServer({
    schema
  })
  await server.start()
  server.applyMiddleware({ app })
  await new Promise(resolve => httpServer.listen({ port: 4000 }, resolve))
  console.log(`🚀 Server ready at http://localhost:4000${server.graphqlPath}`)
}
startApolloServer(schema)

If the server starts, the project is running properly

Query statements display and return results normally
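For reference, the test query entered in the playground and the standard GraphQL response envelope would look roughly like this (a sketch of the expected shape, not a screenshot transcription):

```graphql
query {
  foo
}
```

which should come back as `{ "data": { "foo": "hello" } }`.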

config

Put the Mongoose connection configuration into config/config.default.js and export it:

module.exports = {
  dbConfig: 'mongodb://Max:[email protected]:27017/realworld?authSource=admin&readPreference=.....',
  jwtSecret: '9f10a257-5fc8-4fb9-a4bd-f32d40f7918d'
}

models

Take the Model from the RealWorld backend project and import it here

models/index.js connects to the database and exports the database models:

const mongoose = require('mongoose')
const { dbConfig } = require('../config/config.default')

mongoose.connect(dbConfig)
const db = mongoose.connection

db.on('error', console.error.bind(console, 'connection error:'))
db.once('open', async () => {
  console.log('Database connection successful!')
})

module.exports = {
  User: mongoose.model('User', require('./user.js')),
  Article: mongoose.model('Article', require('./article.js'))
}

models/article.js

const mongoose = require('mongoose')
const baseModel = require('./base-model')
const Schema = mongoose.Schema

const articleSchema = new mongoose.Schema({
  title: {
    type: String,
    required: true
  },
  description: {
    type: String,
    required: true
  },
  body: {
    type: String,
    required: true
  },
  tagList: {
    type: [String],
    default: null
  },
  favoritesCount: {
    type: Number,
    default: 0
  },
  author: {
    type: Schema.Types.ObjectId,
    ref: 'User',
    required: true
  },
  ...baseModel
})

module.exports = articleSchema

models/user.js

const mongoose = require('mongoose')
const baseModel = require('./base-model')
const md5 = require('../util/md5')

const userSchema = new mongoose.Schema({
  username: {
    type: String,
    required: true
  },
  email: {
    type: String,
    required: true
  },
  password: {
    type: String,
    required: true,
    set: value => md5(value), // at the Model level, password is md5-hashed on set
    // select: false // would stop queries from returning the password field; left
    // disabled because the GraphQL return types already control which fields reach
    // the client, and making password unreadable would break password verification
  },
  bio: {
    type: String,
    default: null
  },
  image: {
    type: String,
    default: null
  },
  ...baseModel
})

module.exports = userSchema

And base-model.js, imported by both user.js and article.js:

// Public model fields
module.exports = {
  createdAt: {
    type: Date,
    default: Date.now
  },
  updatedAt: {
    type: Date,
    default: Date.now
  }
}

util

models/user.js imports util/md5; create that file. It uses Node's built-in crypto module for hashing, so there is nothing extra to install.

const crypto = require('crypto')

module.exports = str => {
  return crypto.createHash('md5')
    .update(str)
    .digest('hex')
}

util/jwt.js (cross-domain authentication): install and import the jsonwebtoken module for later account authentication.

const jwt = require('jsonwebtoken')
const { promisify } = require('util')

exports.sign = promisify(jwt.sign)
exports.verify = promisify(jwt.verify)
exports.decode = jwt.decode // decode is synchronous and takes no callback, so promisify is not needed

data-sources

These files define the methods that manipulate the database; resolvers call them to retrieve data and return it to the client.

Create user.js, article.js, and tag.js, and fill them in as you develop:

const { MongoDataSource } = require('apollo-datasource-mongodb')

class Users extends MongoDataSource {
  getUser(userId) {
    return this.findOneById(userId)
  }
  getUsers() {
    return this.model.find()
  }
}

module.exports = Users
const { MongoDataSource } = require('apollo-datasource-mongodb')

class Articles extends MongoDataSource {}

module.exports = Articles

Create data-sources/index.js and import the database models and the data-source classes that manipulate them.

Instantiate the data sources with the database models passed in, and export the dataSources factory:

const dbModel = require('../models')
const Users = require('./user')
const Articles = require('./article')

module.exports = () => {
  return {
    users: new Users(dbModel.User),
    articles: new Articles(dbModel.Article)
  }
}

Import dataSources into index.js and pass it to ApolloServer:

const dataSources = require('./data-sources')

//---------------------------------
const server = new ApolloServer({
  schema,
  dataSources
})

The current directory structure

Start the project

Test connection success

The connection completes and the database management tool shows that the database and collection are also created

Design the Schema for login registration

Start with user login and registration to implement the GraphQL interface; refer to the RESTful API specification officially provided by RealWorld and convert it to GraphQL.

Endpoints | RealWorld (gothinkster.github.io)

Here are the login and registration APIs. Click "returns a User" to see the data structure returned to the client after login and registration.

Open type-defs/index.js and add the Mutation for login and createUser. The API spec shows the returned data is wrapped in a user object, so define a UserPayload type that wraps the User:

const { gql } = require('apollo-server-express')

const typeDefs = gql`
  type Query {
    foo: String
  }

  type User {
    email: String!
    username: String!
    bio: String
    image: String
    token: String
  }

  type UserPayload {
    user: User
  }

  input LoginInput {
    email: String!
    password: String!
  }

  input CreateUserInput {
    username: String!
    email: String!
    password: String!
  }

  type Mutation {
    login(user: LoginInput): UserPayload
    createUser(user: CreateUserInput): UserPayload
  }
`
module.exports = typeDefs

The next step is to implement the two mutations according to these schema requirements

Basic user registration process

Declare the creation of a user handler

Find resolvers/index.js.

Create the createUser Mutation, receive and print args, and return a UserPayload matching typeDefs:

const resolvers = {
  Query: {
    foo() {
      return 'hello'
    }
  },
  Mutation: {
    createUser (parent, args, context) {
      console.log(args)
      return {
        user: {
          email: '[email protected]',
          username: 'Max',
          bio: 'xxx',
          image: 'yyy',
          token: 'zzz'
        }
      }
    }
  }
}
module.exports = resolvers

Select the createUser mutation, enter the parameters, execute the statement, and the stubbed data is returned.
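The statement entered in the playground might look like this (the field values are just sample input):

```graphql
mutation {
  createUser(user: {
    username: "Max"
    email: "[email protected]"
    password: "123456"
  }) {
    user {
      email
      username
      token
    }
  }
}
```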

The server also prints the request information

Verify the user name and email address

Destructure the user from args, then add the checks and processing:

createUser (parent, { user }, context) {
  // Check whether the email exists
  // Check whether the username exists
  // Save the user
  // Generate the token and send it to the client
  return {
    user: {
      email: '[email protected]',
      username: 'Max',
      bio: 'xxx',
      image: 'yyy',
      token: 'zzz'
    }
  }
}

Declare the method of manipulating the database in data-sources/user.js

const { MongoDataSource } = require('apollo-datasource-mongodb')

class Users extends MongoDataSource {
  findByEmail (email) {
    return this.model.findOne({
      email
    })
  }
  findByUsername (username) {
    return this.model.findOne({
      username
    })
  }
}

module.exports = Users

Go back to resolvers/index.js and call the methods on an instance of this class.

Note that Apollo Server does not report errors via status codes; it provides UserInputError for throwing user-input errors.

const { UserInputError } = require('apollo-server-core')

const resolvers = {
  Query: {
    foo() {
      return 'hello'
    }
  },
  Mutation: {
    async createUser (parent, { user }, { dataSources }) {
      // Check whether the email exists
      // dataSources.users is the Users class exported by data-sources/user.js;
      // call its methods to return data retrieved from the database
      const users = dataSources.users
      const hasEmail = await users.findByEmail(user.email)
      if (hasEmail) {
        // Use the user-input error provided by Apollo Server to return an error message
        throw new UserInputError('Mailbox already exists')
      }
      // Check whether the username exists
      const hasUser = await users.findByUsername(user.username)
      if (hasUser) {
        throw new UserInputError('Username already exists')
      }
      // Save the user
      // Generate the token and send it to the client
      return {
        user: {
          email: '[email protected]',
          username: 'Max',
          bio: 'xxx',
          image: 'yyy',
          token: 'zzz'
        }
      }
    }
  }
}
module.exports = resolvers

Query page test, no error thrown, validation passed

Add user information to the database

Once the user name and email are verified, the next step is to save the user information to the database

data-sources/user.js: add a method that saves a user to the database.

saveUser (user) {
  return new this.model(user).save()
}

After the username and email checks pass, execute this method and print the result to test:

// Check whether the mailbox exists
// Check whether the user name exists
// Save the user
const userData = await users.saveUser(user)
console.log(userData);

Add the registration information on the test page, execute the statement, and the expected result is returned. The database adds the record successfully, and the server terminal prints the database's response.

When the statement is executed again, an error message is returned, showing the checks have also taken effect.

Returns data to the client for testing

Since the database does not return a plain JS object, call its toObject method to convert it. There is no need to strip fields that must not reach the client, because the GraphQL return type already defines exactly what the client can retrieve.

// Generate the token and send it to the client
return {
  user: {
    ...userData.toObject(),
    token: 'zzz'
  }
}

A new user is added to the database, returning the newly entered mailbox and username

Generating user tokens

At the top of resolvers/index.js, import jwt.js and the JWT secret.

Then call the sign method:

const jwt = require('../util/jwt')
const { jwtSecret } = require('../config/config.default')

// Save the user
const userData = await users.saveUser(user)
// Generate the token and send it to the client
const token = await jwt.sign({ userId: userData._id }, jwtSecret, { expiresIn: 60 * 60 * 24 })

return {
  user: {
    ...userData.toObject(),
    token
  }
}

Returns the token after the test creates the user

User login

resolvers/index.js imports md5:

const md5 = require('../util/md5')

async login (parent, { user }, { dataSources }) {
  // Check whether the user exists
  const userData = await dataSources.users.findByEmail(user.email)
  if (!userData) {
    throw new UserInputError('Mailbox does not exist')
  }
  // Verify the password is correct
  if (md5(user.password) !== userData.password) {
    throw new UserInputError('Password error')
  }
  // Generate the token
  const token = await jwt.sign({ userId: userData._id }, jwtSecret, { expiresIn: 60 * 60 * 24 })
  // Respond with the result
  return {
    user: {
      ...userData.toObject(),
      token
    }
  }
}

Statements test

Get the current logged-in user

Get the user token in the global context

The type-defs schema

In the REST spec this is a GET request, so it maps to a Query, and it returns a User, which is already typed in type-defs/index.js. Add a currentUser field to Query:

type Query {
  # foo: String
  currentUser: User
}

Resolvers statement

Back to resolvers/index.js:

Query: {
  currentUser (parent, args, context, info) {
    // Obtain information about the current logged-in user
    // Return the user's information
  }
}

The client sends a token

To request the current logged-in user's information, the client puts the token obtained after login into the request headers: add an Authorization field to the request header so the token is sent back to the server.

Initiates a login request to obtain the token

Put the token in the request header

The context receiving token

Note that the request headers are not available inside currentUser(), so we need context: pass the token to the resolvers through context and read it in currentUser().

index.js adds the context function, destructures the request object, and prints headers to check that the request headers were received:

const server = new ApolloServer({
  schema,
  dataSources,
  // All GraphQL queries go through here
  context({ req }) {
    console.log(req.headers)
  }
})

The terminal prints the request header information

Now that it can be read, the next thing is to return the token through the context:

const server = new ApolloServer({
  schema,
  dataSources,
  context({req}) {
    const token = req.headers['authorization']
    return {
      token
    }
  }
})
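This tutorial's client sends the raw token, but many clients prefix it with "Bearer ". If you want the context to accept both, a small normalizing helper could be used when reading the header (extractToken is an illustrative addition, not part of the original code):

```javascript
// Accept either a raw token or the common "Bearer <token>" form
function extractToken (authorization) {
  if (!authorization) return null
  return authorization.startsWith('Bearer ')
    ? authorization.slice('Bearer '.length)
    : authorization
}

console.log(extractToken('Bearer abc.def.ghi')) // abc.def.ghi
```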

Resolvers receives the token

resolvers prints the token to check that it was passed through:

Query: {
  currentUser (parent, args, context, info) {
    console.log(context.token)
  }
}

Statements test

The terminal prints successfully.

After obtaining the token, parse the token, get the user ID, and check the database to obtain the user data

This section describes the identity authentication modes

Why tokens need to be validated

Before querying the database, the token's login state must be verified. If there is no token, the client has not logged in at all, and an unauthorized error must be returned; if the authorization is invalid, an invalid-authorization error must be returned as well. The simplest approach is to do this check inside the request handler, but the validation is needed beyond the current request: every other protected interface has the same login-verification logic.

Therefore, the login-status verification should be encapsulated: write a function inside the resolvers and call it within each request that requires verification.

But calling the verification function this frequently and selectively is still too troublesome.
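The "helper function called inside each resolver" approach described above might look like this sketch (ensureLogin is an illustrative name, not from the project):

```javascript
// Throws when no user was attached to the context, otherwise returns the user
function ensureLogin (context) {
  if (!context.user) {
    throw new Error('you must be logged in')
  }
  return context.user
}

// each protected resolver would start with a call like:
// const user = ensureLogin(context)
const user = ensureLogin({ user: { _id: '1', username: 'Max' } })
console.log(user.username) // Max
```

The repetition of that first line across every protected resolver is exactly the nuisance the directive approach below removes.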

Methods provided by official documentation

The Apollo documentation gives solutions for verifying permissions in GraphQL. The page below describes how; the first step is to put the token information into context.

Authentication and authorization – Apollo Server – Apollo GraphQL Docs

The following are the methods provided for permission validation

Authorization methods

  • API-wide authorization

    If only logged-in users may access any data, e.g. a back-office admin system or an internal app where every operation requires login, unified authentication can be carried out in the context:

    context: ({ req }) => {
      // get the user token from the headers
      const token = req.headers.authorization || '';

      // try to retrieve a user with the token
      const user = getUser(token);

      // optionally block the user
      // we could also check user roles/permissions here
      if (!user) throw new AuthenticationError('you must be logged in');

      // add the user to the context
      return { user };
    }

    However, this does not fit when some permissions are open and certain interfaces can be accessed without login.

  • In resolvers

    Performing judgments in each resolver is cumbersome

  • In data models

    Authentication in the data models is likewise not very useful when requirements differ.

  • With custom directives

    A custom directive in a Schema adds a behavior when accessing a property, similar to middleware, to achieve uniform processing

    • Define a directive @auth(requires: Role) whose argument specifies the authorized role, taking effect on a location (OBJECT | FIELD_DEFINITION)

    • Enum Enumerates the permissions of roles

    • Add this directive after the type you want to validate and pass in parameters specifying role permissions

    const typeDefs = `
      directive @auth(requires: Role = ADMIN) on OBJECT | FIELD_DEFINITION

      enum Role {
        ADMIN
        REVIEWER
        USER
      }

      type User @auth(requires: USER) {
        name: String
        banned: Boolean @auth(requires: ADMIN)
        canPost: Boolean @auth(requires: REVIEWER)
      }
    `

    Defining and invoking a custom directive here does not make it take effect by itself.

    You also need to implement the behavior of the custom directive, writing the logic yourself.

  • Outside of GraphQL

Default directives

The GraphQL specification defines the following default directives:

DIRECTIVE DESCRIPTION
@deprecated(reason: String) Marks a field as deprecated: after a schema update you may not want to delete the original field outright (clients would report errors), so add this directive to indicate the field will be removed in the future
@skip(if: Boolean!) If true is given to if, the field is skipped and not returned to the client, which is used in the client query syntax
@include(if: Boolean!) If the argument given to if is true, the field is included and allowed to be returned to the client for use in the client query syntax

@include and @skip

As noted above, @include and @skip are directives provided to the client, with arguments passed in the query statement deciding whether a field is returned. When the back end receives the query, a field with skip(if: true) is skipped, and a field with include(if: false) is not included in the returned data.
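A client query using these directives might look like the following, where $withBio is a Boolean variable supplied alongside the query (the operation name and variable are illustrative):

```graphql
query CurrentUser($withBio: Boolean!) {
  currentUser {
    username
    bio @include(if: $withBio)
    image @skip(if: $withBio)
  }
}
```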

@deprecated

Add directives in typeDefs and specify the information to return

const typeDefs = gql`
  type User {
    email: String!
    username: String! @deprecated(reason: "Please use newUserName")
    newUserName: String
    bio: String
    image: String
    token: String
  }
`

When a statement query test is performed, the query results are returned correctly, but with a yellow warning

Custom instruction

Define an instruction

Custom directives are written in typeDefs, regardless of order, but it is best to write them at the top for clarity

Here, directive defines a custom @upper directive, and on FIELD_DEFINITION means it applies to fields, i.e. single properties.

Add the @upper directive to username so the directive converts the user name returned by the database to uppercase:

const typeDefs = gql`
  directive @upper on FIELD_DEFINITION

  type User {
    email: String!
    username: String! @upper
    bio: String
    image: String
    token: String
  }
`

Creating schema directives – Apollo Server – Apollo GraphQL Docs

Supported locations

The table below lists the locations where a directive can be used in typeDefs; a custom directive can specify any free combination of locations, separated by `|`.

NAME / MAPPERKIND: DESCRIPTION
SCALAR / SCALAR_TYPE: The definition of a custom scalar
OBJECT / OBJECT_TYPE: The definition of an object type
FIELD_DEFINITION / OBJECT_FIELD: The definition of a field within any defined type except an input type (see INPUT_FIELD_DEFINITION)
ARGUMENT_DEFINITION / ARGUMENT: The definition of a field argument
INTERFACE / INTERFACE_TYPE: The definition of an interface
UNION / UNION_TYPE: The definition of a union
ENUM / ENUM_TYPE: The definition of an enum
ENUM_VALUE / ENUM_VALUE: The definition of one value within an enum
INPUT_OBJECT / INPUT_OBJECT_TYPE: The definition of an input type
INPUT_FIELD_DEFINITION / INPUT_OBJECT_FIELD: The definition of a field within an input type
SCHEMA / ROOT_OBJECT: The top-level schema object declaration with query, mutation, and/or subscription fields (this declaration is usually omitted)

Defining instruction logic

Defining and using the directive this way does nothing without a logical implementation, so next define the operation applied to the schema. Two dependencies are needed; verify they are installed:

npm i @graphql-tools/utils

Add the instruction converter function in the root directory schema.js

const { makeExecutableSchema } = require('@graphql-tools/schema')

const typeDefs = require('./type-defs')
const resolvers = require('./resolvers')

// Import these dependencies
const { mapSchema, getDirective, MapperKind } = require('@graphql-tools/utils')
const { defaultFieldResolver } = require('graphql')

// Add a custom directive transformer
function upperDirectiveTransformer(schema, directiveName) {
  return mapSchema(schema, {

    // Executed once for each field of each object in the schema
    [MapperKind.OBJECT_FIELD]: (fieldConfig) => {

      // Check whether this field carries the directive that needs to be executed
      let upperDirective = getDirective(schema, fieldConfig, directiveName)
      upperDirective = upperDirective && upperDirective[0]
      if (upperDirective) {

        // The original resolver that would get this field without the directive
        const { resolve = defaultFieldResolver } = fieldConfig

        // Replace the original resolver with a function that *first* calls the original resolver
        fieldConfig.resolve = async function (parent, args, context, info) {
          // Get the data returned by the database
          const result = await resolve(parent, args, context, info)
          // If the returned data is a string, convert it to uppercase; otherwise return it directly
          if (typeof result === 'string') {
            return result.toUpperCase()
          }
          return result
        }
        return fieldConfig
      }
    }
  })
}

let schema = makeExecutableSchema({
  typeDefs,
  resolvers
})

// Transform the schema to apply the directive logic
schema = upperDirectiveTransformer(schema, 'upper');

module.exports = schema
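Stripped of the GraphQL plumbing, the transformer's core trick is just decorating a resolver function. A standalone sketch of that wrapping pattern (the function names here are illustrative):

```javascript
// Wrap an existing resolver so that string results are uppercased,
// leaving every other result type untouched
function upperWrap (resolve) {
  return async function (parent, args, context, info) {
    const result = await resolve(parent, args, context, info)
    return typeof result === 'string' ? result.toUpperCase() : result
  }
}

// stand-in for defaultFieldResolver reading a field off the parent object
const usernameResolver = (parent) => parent.username

upperWrap(usernameResolver)({ username: 'max' }).then(v => console.log(v)) // MAX
```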

One thing to note is that the code in the official example reports an error when copied over.

The reason is that the environment does not support the optional chaining (?.) syntax, which would need extra configuration.


According to the description of this syntax, a?.b can be replaced with the a && a.b form, so the source was changed to the following, and both the JS syntax and the GQL statement passed the test:

let upperDirective = getDirective(schema, fieldConfig, directiveName)
upperDirective = upperDirective && upperDirective[0]

In addition, this line in the official document also caused an error later:

fieldConfig.resolve = async function (source, args, context, info) {}

It writes source where the enclosing code here uses parent.

Other instruction examples

Schema directives (graphql-tools.com)

The encapsulation Auth directive handles authentication

Declare directives and use them

As in the custom-directive example above, first define an enum type Role to supply to the @auth directive, with ADMIN as the default. @auth can appear on an OBJECT type such as User, setting default access for all of its fields, or on a single field such as banned, restricting access to that specific field.

const typeDefs = gql`
  directive @auth on FIELD_DEFINITION

  type User {
    email: String!
    name: String!
    bio: String @auth
    image: String
    token: String
  }
`

Encapsulate custom instruction converter

Encapsulate the directive in schema-directives/auth.js and export it:

const { mapSchema, getDirective, MapperKind } = require('@graphql-tools/utils')
const { defaultFieldResolver } = require('graphql')

const { AuthenticationError } = require('apollo-server-core')

const jwt = require('../util/jwt')
const { jwtSecret } = require('../config/config.default')

// The directive transformer function as a whole does not need to change
function authDirectiveTransformer(schema, directiveName) {
  return mapSchema(schema, {
    [MapperKind.OBJECT_FIELD]: (fieldConfig) => {
      // Renamed to authDirective
      let authDirective = getDirective(schema, fieldConfig, directiveName)
      authDirective = authDirective && authDirective[0]
      if (authDirective) {
        const { resolve = defaultFieldResolver } = fieldConfig
        // Verify the token here before the field is resolved
        fieldConfig.resolve = async function (parent, args, { token }, info) {
          if (!token) {
            throw new AuthenticationError('Unauthorized')
          }
          try {
            const decodedData = await jwt.verify(token, jwtSecret)
            // Print it to see if it works
            console.log(decodedData)
          } catch (error) {
            throw new AuthenticationError('Unauthorized')
          }
          // Fall through to the original resolver so the field still returns its data
          return resolve(parent, args, { token }, info)
        }
        return fieldConfig
      }
    }
  })
}

module.exports = authDirectiveTransformer

Application instruction logic

Import it into schema.js:

const { makeExecutableSchema } = require('@graphql-tools/schema')

const typeDefs = require('./type-defs')
const resolvers = require('./resolvers')
// Import the directive transformer module
const authDirectiveTransformer = require('./schema-directives/auth')

let schema = makeExecutableSchema({
  typeDefs,
  resolvers
})
// Transform the schema to apply the directive logic
schema = authDirectiveTransformer(schema, 'auth');

module.exports = schema

Test whether it takes effect

Without passing a token, the statement test returns the "not authorized" error message.

After re-logging in to obtain the token, the client re-initiates the request for bio, and the server successfully returns the bio data.

The request has gone to the @auth directive

The server terminal displays the userId and the token's issue and expiration times.

Get user data by userID

Go to data-sources/user.js and add a method to find data by userId

findById (userId) {
  return this.findOneById(userId)
}

Call the findById method on dataSources.users, passing in the userId parsed from the token.

Make a request with token and print the data obtained from the database

fieldConfig.resolve = async function (parent, args, { token, dataSources }, info) {
  if (!token) {
    throw new AuthenticationError('Unauthorized')
  }
  try {
    const decodedData = await jwt.verify(token, jwtSecret)
    const user = await dataSources.users.findById(decodedData.userId)
    console.log(user)
  } catch (error) {
    throw new AuthenticationError('Not authorized!')
  }
}

Note: if you destructure from the context parameter, you cannot attach the user back onto the context. Instead, keep the context object, destructure what you need inside the function, and set context.user to update it.

As long as the code in the try/catch executes correctly, return resolve(parent, args, context, info) is always reached and its result returned.

fieldConfig.resolve = async function (parent, args, context, info) {
  const { token, dataSources } = context
  if (!token) {
    throw new AuthenticationError('Unauthorized')
  }
  try {
    const decodedData = await jwt.verify(token, jwtSecret)
    const user = await dataSources.users.findById(decodedData.userId)
    // Mount the current logged-in user on the context object for use by resolvers
    context.user = user
  } catch (error) {
    throw new AuthenticationError('Not authorized!')
  }
  return await resolve(parent, args, context, info)
}
return fieldConfig

Applying it to currentUser

Add the @auth directive after currentUser in type-defs:

type Query {
  currentUser: User @auth
}

In resolvers/index.js, return the user:

const resolvers = {
  Query: {
    currentUser (parent, args, context, info) {
      return context.user
    }
  }
}

Test without @auth added

After adding @auth

Update the login user information

typeDefs

Add an updataUser mutation that receives a user of type updataUserInput and returns a UserPayload. Note that updataUserInput differs from the API docs: since email is used as the account it is removed here, replaced with username, and password is added. Also note that this is an input type.

input updataUserInput {
  username: String
  password: String
  image: String
  bio: String
}
type Mutation {
  login(user: LoginInput): UserPayload
  createUser(user: CreateUserInput): UserPayload
  updataUser(user: updataUserInput): UserPayload @auth
}

resolvers

The next step is to write the handler: add an updataUser method, destructuring user from the request args, user (the logged-in user returned by the database) from the context, and dataSources for calling the database operation methods. Since the two share the name user, rename the request parameter to userInput.

Check whether the password has been changed and encrypt it

Call the database operation method that data-Sources will declare below and pass in the parameters

Since @auth is added, every request first goes through the directive flow: @auth gets the token, decrypts it, uses the userId to fetch the full user data containing user._id from the database, and puts it into the context, so subsequent processing can read user._id.

async updataUser (parent, { user: userInput }, { user, dataSources }) {
  // If the modified data contains a password, encrypt it and store it back in userInput
  if (userInput.password) {
    userInput.password = md5(userInput.password)
  }
  // Update the database and receive the returned data
  const res = await dataSources.users.updataUser(user._id, userInput)
  // Return the result wrapped in a user object
  return {
    user: res
  }
}

data-sources

Open data-sources/user.js to add the operation database method

updataUser (userId, data) {
  // Update the database and return the result
  return this.model.findByIdAndUpdate(
    userId, // The _id to query by (findByIdAndUpdate takes the id itself, not a condition object)
    data,   // The changed data
    {
      new: true // false (default) returns the document before the update; true returns the updated document
    }
  )
}

The test interface

In the lower panel, pass in the data to be modified; in the upper part, send the request to the server with the token attached. The data returned is the updated data

Create the article design schema

Endpoints | RealWorld (realworld-docs.netlify.app)

Next, build article management: create, read, update, and delete articles

typeDefs

For createArticle, first declare the input type it receives; every field is required except tagList. The mutation returns the article

const typeDefs = gql`
  input CreateArticleInput {
    title: String!
    description: String!
    body: String!
    tagList: [String!]
  }

  type Article { }

  type Mutation {
    # Article
    createArticle(article: CreateArticleInput): Article
  }
`

Information required for the article

The return value is an article object wrapped in a payload (the spec names it with the word Payload), so declare a CreateArticlePayload. The author inside the article is a User object with an extra following field, so add following to the User type and author: User to Article

Creating an article requires the author field, so the request must be authenticated and the identity saved to the database along with the article; hence the mutation is followed by @auth

const typeDefs = gql`
  type User {
    email: String!
    username: String!
    bio: String
    image: String
    token: String
    following: Boolean
  }

  type Article {
    _id: String!
    slug: String!
    title: String!
    description: String!
    body: String!
    tagList: [String!]
    createdAt: String!
    updatedAt: String!
    favorited: Boolean
    favoritesCount: Int
    author: User
  }

  type CreateArticlePayload {
    article: Article
  }

  type Mutation {
    # Article
    createArticle(article: CreateArticleInput): CreateArticlePayload @auth
  }
`

resolvers

The resolvers file is now full of User handlers; mixing Article handlers into it would make the two hard to tell apart

Rename resolvers/index.js to user.js and create an article.js

const resolvers = {
  Query: {

  },
  Mutation: {
    createArticle () {
      console.log(111);
    }
  }
}

module.exports = resolvers

schema.js

Open schema.js in the root directory, import user.js and article.js, and pass the resolvers in as an array

const { makeExecutableSchema } = require('@graphql-tools/schema')

const typeDefs = require('./type-defs')
// Modularized resolvers
const userResolvers = require('./resolvers/user')
const articleResolvers = require('./resolvers/article')

const authDirectiveTransformer = require('./schema-directives/auth')

let schema = makeExecutableSchema({
  typeDefs,
  resolvers: [userResolvers, articleResolvers] // passed in as an array
})

schema = authDirectiveTransformer(schema, 'auth')

module.exports = schema

Alternatively, compose the array in resolvers/index.js and import that single module in schema.js

Run a test query; it returns successfully

Test the newly added createArticle() again

The server terminal successfully printed 111

Implement the createArticle() function

Add the database operation method in data-sources/article.js

const { MongoDataSource } = require('apollo-datasource-mongodb')

class Articles extends MongoDataSource {
  createArticle (data) {
    const article = new this.model(data)
    return article.save()
  }
}

module.exports = Articles

To use article.js, also confirm that the module is registered in data-sources/index.js

const dbModel = require('../models')
const Users = require('./user')
const Articles = require('./article')

module.exports = () => {
  return {
    users: new Users(dbModel.User),
    articles: new Articles(dbModel.Article)
  }
}

The dataSources can then be processed in resolvers

async createArticle (parent, { article }, { dataSources }) {
  const res = await dataSources.articles.createArticle(article)
  return {
    article: res
  }
}

Testing the statement reports an error: the article's author field is required

The error message points to Mongoose's model validation. Indeed, writing to the database requires an author id to record which user created the article

Back in resolvers/article.js: the @auth directive has already put the authenticated user (fetched from the database) on the context, so destructure user there and assign its _id to article.author

async createArticle (parent, { article }, { user, dataSources }) {
  article.author = user._id
  const res = await dataSources.articles.createArticle(article)
  return {
    article: res
  }
}

The request is sent again, and the article information is returned as normal

The database also saved the article successfully

Handle authors in articles

As the database record above shows, author stores the user's _id, not the user object the client expects in the response. Requesting author now produces an error because the required fields cannot be read

Method 1

Assign the user that the @auth directive put on the context to the returned article's author. This only works when the author to return is the current user

async createArticle (parent, { article }, { user, dataSources }) {
  article.author = user._id
  const res = await dataSources.articles.createArticle(article)
  res.author = user
  return {
    article: res
  }
}

Method 2

Use Mongoose's populate() to map the stored _id to the referenced document

data-sources/article.js

createArticle (data) {
  const article = new this.model(data)
  article.populate('author')
  return article.save()
}

Here the author id stored in the article document is mapped to the document with that id in the users collection, so author is returned as an object and the request completes

Mongoose's populate() works across collections: a field references documents in another collection, and populate() automatically replaces the specified field in the document with the referenced content

When defining a model, set the ref option on any field that stores the _id of documents from another collection. The ref option tells Mongoose which model to use when populate() runs

author: {
  type: Schema.Types.ObjectId,
  ref: 'User',
  required: true
}

ObjectId, Number, String, and Buffer can all be used as refs, but ObjectId is preferred
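Conceptually, populate() swaps a stored id for the matching document from the referenced collection. The plain-JS sketch below illustrates that idea with made-up in-memory data; it is not Mongoose's implementation, just the mapping it performs.

```javascript
// Made-up in-memory "collections" to illustrate what populate() does
const users = [{ _id: 'u1', username: 'alice', bio: 'hi' }]
const article = { title: 'Hello GraphQL', author: 'u1' }

// Replace the stored author id with the matching user document
function populateAuthor (doc, userCollection) {
  return { ...doc, author: userCollection.find(u => u._id === doc.author) }
}

const populated = populateAuthor(article, users)
console.log(populated.author.username) // alice
```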

populate() also accepts several optional parameters

Query.populate(path, [select], [model], [match], [options])
  • path – the field to populate

  • select (optional) – the fields to select from the referenced document

  • model (optional) – the model to use for the associated field; if omitted, the schema's ref is used

  • match (optional) – an object specifying additional query conditions

  • options (optional) – an object specifying additional query options, such as sorting and limits

    Mongoose populate – SegmentFault

Method 3

Use the resolver chain: write a separate resolver for the author field. This approach offers the most flexibility

const resolvers = {
  Query: {},
  Mutation: {
    async createArticle (parent, { article }, { user, dataSources }) {
      article.author = user._id
      const res = await dataSources.articles.createArticle(article)
      return {
        article: res
      }
    }
  },
  Article: {
    async author (parent, args, { dataSources }) {
      console.log(parent)
      const user = await dataSources.users.findById(parent.author)
      return user
    }
  }
}

module.exports = resolvers

The author field of Article is resolved by its own resolver. Printing parent shows it is the article returned by the database; the resolver then uses parent.author (the stored id) to fetch the user from the users collection, and createArticle returns the complete data to the client

The above three methods all returned the correct data

Get all articles

typeDefs

The payload for fetching all articles contains an articles array and an articlesCount number. For now it returns everything; the filtered article list and paging come later

Adding a Query Type

type ArticlesPayload {
  articles: [Article!]
  articlesCount: Int!
}
type Query {
  articles: ArticlesPayload
}

data-sources

Declares methods for manipulating the database

const { MongoDataSource } = require('apollo-datasource-mongodb')

class Articles extends MongoDataSource {
  createArticle (data) {
    const article = new this.model(data)
    return article.save()
  }
  getArticles () {
    return this.model.find()
  }
  getCount () {
    return this.model.countDocuments()
  }
}

module.exports = Articles

resolvers

In resolvers, call the two data-sources methods. Since the two queries are independent of each other, run them concurrently with Promise.all, then return an object containing articles and articlesCount

const resolvers = {
  Query: {
    async articles (parent, args, { dataSources }) {
      const [articles, articlesCount] = await Promise.all([
        dataSources.articles.getArticles(),
        dataSources.articles.getCount()
      ])
      return {
        articles,
        articlesCount
      }
    }
  },
  // ...
}
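Why Promise.all here: the list query and the count query don't depend on each other, so running them concurrently takes roughly the time of the slower one rather than the sum of both. A standalone sketch with stubbed queries (the timings and data below are invented for illustration):

```javascript
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

// Stubbed versions of the two data-source calls, each taking ~50ms
async function getArticles () { await sleep(50); return [{ title: 'a1' }, { title: 'a2' }] }
async function getCount () { await sleep(50); return 2 }

async function articles () {
  const start = Date.now()
  // Both stubs run concurrently, so this takes ~50ms instead of ~100ms
  const [articleList, articlesCount] = await Promise.all([getArticles(), getCount()])
  console.log(`took ~${Date.now() - start}ms`)
  return { articles: articleList, articlesCount }
}

articles().then(res => console.log(res.articlesCount)) // 2
```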

Tests get all articles and the total number of articles

Page to get the list of articles

The paging feature starts with its parameters: two values, the offset and the page size, drive the paging

typeDefs

Add two parameters to the articles query: offset (Int, default 0) is how many records to skip, and limit (Int, default 6) is how many records to return from that position, i.e. the page size

type Query {
  currentUser: User @auth
  articles(offset: Int = 0, limit: Int = 6): ArticlesPayload
}

resolvers

Destructure the parameters

async articles (parent, { offset, limit }, { dataSources }) {
  console.log({ offset, limit })
  const [articles, articlesCount] = await Promise.all([
    dataSources.articles.getArticles(offset, limit),
    dataSources.articles.getCount()
  ])
  return {
    articles,
    articlesCount
  }
}

First pass the parameters in a test statement and check that they print

The server terminal successfully prints parameters

data-sources

Receive parameters and call methods to query the database

getArticles (offset, limit) {
	return this.model.find().skip(offset).limit(limit)
}
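skip(offset).limit(limit) amounts to slicing an ordered result set. A plain-array sketch of the same arithmetic (the 20-document dataset and helper names below are invented for illustration), including how a 1-based page number maps to an offset:

```javascript
// Invented in-memory dataset standing in for the articles collection
const docs = Array.from({ length: 20 }, (_, i) => ({ title: `Article ${i + 1}` }))

// Equivalent of this.model.find().skip(offset).limit(limit)
function getArticles (offset = 0, limit = 6) {
  return docs.slice(offset, offset + limit)
}

// A 1-based page number maps to an offset like this
const pageToOffset = (page, pageSize = 6) => (page - 1) * pageSize

const page2 = getArticles(pageToOffset(2), 6)
console.log(page2[0].title) // Article 7
console.log(page2.length)   // 6
```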

Use resolver chains to improve query performance

One of GraphQL's benefits is that a request can choose to return only part of the data. But with the implementation above, that selection only happens on the server side: the database queries still run in full, and only the response sent to the client is filtered

For the request above that fetches the article list and the total, if the client wants only one of the two fields, the server should not execute the other query, and that cannot be done with the previous structure

It needs to be separated by a resolver chain

The root articles resolver now only returns a plain object (carrying the arguments) so the chain can continue. This return cannot be omitted, otherwise the resolver chain stops

In typeDefs the articles query returns a payload with two fields; write the logic for each field in its own resolver, and log when each one executes

Note that the root resolver must put the request arguments into its return value, so the child resolvers can read offset and limit from parent

const resolvers = {
  Query: {
    async articles (parent, { offset, limit }) {
      return { offset, limit }
    }
  },
  ArticlesPayload: {
    async articles ({ offset, limit }, args, { dataSources }) {
      console.log('articles query')
      const articles = await dataSources.articles.getArticles(offset, limit)
      return articles
    },
    async articlesCount (parent, args, { dataSources }) {
      console.log('articlesCount query')
      const count = await dataSources.articles.getCount()
      return count
    }
  }
}

module.exports = resolvers
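The mechanism can be imitated in plain JS: resolve the root, then run only the child resolvers for the fields the client selected. The "executor" below is a toy built for this sketch, not how Apollo executes queries internally, and the data access is stubbed.

```javascript
// Simplified resolvers mirroring the structure above, with stubbed data access
const resolvers = {
  Query: {
    articles: (parent, { offset, limit }) => ({ offset, limit })
  },
  ArticlesPayload: {
    articles: ({ offset, limit }) =>
      Array.from({ length: limit }, (_, i) => `article ${offset + i + 1}`),
    articlesCount: () => 20
  }
}

// Toy executor: run the root resolver, then only the selected field resolvers
function execute (selectedFields, args) {
  const parent = resolvers.Query.articles(null, args)
  const result = {}
  for (const field of selectedFields) {
    result[field] = resolvers.ArticlesPayload[field](parent)
  }
  return result
}

console.log(execute(['articlesCount'], {}))                 // only articlesCount runs
console.log(execute(['articles'], { offset: 0, limit: 2 })) // only articles runs
```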

Both queries return the correct data

The server terminal prints two requests

Next, query only the total number of articles, passing no arguments

The server terminal printed only the articlesCount query log, showing that the server did not execute the articles query

With this resolver-chain optimization, the server avoids needless I/O when only one of the fields is requested, improving the program's performance