  • Elixir, Phoenix, Absinthe, GraphQL, React, and Apollo: An Absurdly Deep Dive – Part 2
  • Originally written by Zach Schneider
  • The Nuggets translation Project
  • Permanent link to this article: github.com/xitu/gold-m…
  • Translator: Fengziyin1234
  • Proofreader: Xuyuey, Portandbridge

If you haven’t read part 1 of this series, I suggest you start there:

  • Elixir, Phoenix, Absinthe, GraphQL, React, and Apollo: an absurdly deep dive – Part 1
  • Elixir, Phoenix, Absinthe, GraphQL, React, and Apollo: an absurdly deep dive – Part 2 (this article)

Testing – server side

Now that we’ve written all the code, how do we make sure it keeps working? We need tests at several different levels. First, we should unit test the model layer — do the models validate data correctly? Do the model helpers return the expected results? Second, we should unit test the resolver layer — do the resolvers handle the different success and failure scenarios, returning the correct result or making the correct database update in each case? Third, we should write some full integration tests — send a query to the server and expect the correct result back. This gives us end-to-end confidence in the application and ensures the tests cover things like authentication logic. Fourth, we want to test the subscription layer — does it correctly notify the socket when relevant changes occur?

Elixir ships with a simple built-in test framework called ExUnit. ExUnit provides basic assert/refute assertions and takes care of running your tests. It’s also common in Phoenix to create a set of “case” support files, which are referenced from tests to run common setup tasks such as connecting to the database. In my tests I also found the ex_spec and ex_machina libraries very helpful. ex_spec adds simple describe and it blocks, which (coming from a Ruby background) make the test syntax feel more familiar; ex_machina provides factories that make it easy to insert test data on the fly.

The factories I created look like this:

# test/support/factories.ex
defmodule Socializer.Factory do
  use ExMachina.Ecto, repo: Socializer.Repo

  def user_factory do
    %Socializer.User{
      name: Faker.Name.name(),
      email: Faker.Internet.email(),
      password: "password".password_hash: Bcrypt.hash_pwd_salt("password")}end

  def post_factory do
    %Socializer.Post{
      body: Faker.Lorem.paragraph(),
      user: build(:user)
    }
  end

  # ...factories for other models
end

After importing the factory in your test setup (a sketch of a case file that does this follows the examples below), you can use some very intuitive syntax in your test cases:

# Insert a user
user = insert(:user)

# Insert a user with a specific name
user_named = insert(:user, name: "John Smith")

# Insert a post for the user
post = insert(:post, user: user)
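For reference, here is a minimal sketch of what such a case file might look like — the article doesn’t show the actual one, so this assumes the standard ConnCase generated by Phoenix, extended with ExSpec and the factory import:

# test/support/conn_case.ex (illustrative sketch, not the project's actual file)
defmodule SocializerWeb.ConnCase do
  use ExUnit.CaseTemplate

  using do
    quote do
      use Phoenix.ConnTest
      use ExSpec                # describe/it syntax
      import Socializer.Factory # insert(:user), insert(:post), ...

      @endpoint SocializerWeb.Endpoint
    end
  end

  setup tags do
    # Check out a sandboxed database connection for each test.
    :ok = Ecto.Adapters.SQL.Sandbox.checkout(Socializer.Repo)

    unless tags[:async] do
      Ecto.Adapters.SQL.Sandbox.mode(Socializer.Repo, {:shared, self()})
    end

    {:ok, conn: Phoenix.ConnTest.build_conn()}
  end
end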

With that setup in place, the tests for the Post model look like this:

# test/socializer/post_test.exs
defmodule Socializer.PostTest do
  use SocializerWeb.ConnCase

  alias Socializer.Post

  describe "#all" do
    it "finds all posts" do
      post_a = insert(:post)
      post_b = insert(:post)
      results = Post.all()
      assert length(results) == 2
      assert List.first(results).id == post_b.id
      assert List.last(results).id == post_a.id
    end
  end

  describe "#find" do
    it "finds post" do
      post = insert(:post)
      found = Post.find(post.id)
      assert found.id == post.id
    end
  end

  describe "#create" do
    it "creates post" do
      user = insert(:user)
      valid_attrs = %{user_id: user.id, body: "New discussion"}
      {:ok, post} = Post.create(valid_attrs)
      assert post.body == "New discussion"
    end
  end

  describe "#changeset" do
    it "validates with correct attributes" do
      user = insert(:user)
      valid_attrs = %{user_id: user.id, body: "New discussion"}
      changeset = Post.changeset(%Post{}, valid_attrs)
      assert changeset.valid?
    end

    it "does not validate with missing attrs" do
      changeset =
        Post.changeset(
          %Post{},
          %{}
        )

      refute changeset.valid?
    end
  end
end

The test cases are straightforward: for each one, we insert the required test data, call the function under test, and assert on the results.

Next, let’s look at the resolver tests:

# test/socializer_web/resolvers/post_resolver_test.exs
defmodule SocializerWeb.PostResolverTest do
  use SocializerWeb.ConnCase

  alias SocializerWeb.Resolvers.PostResolver

  describe "#list" do
    it "returns posts" do
      post_a = insert(:post)
      post_b = insert(:post)
      {:ok, results} = PostResolver.list(nil, nil, nil)
      assert length(results) == 2
      assert List.first(results).id == post_b.id
      assert List.last(results).id == post_a.id
    end
  end

  describe "#show" do
    it "returns specific post" do
      post = insert(:post)
      {:ok, found} = PostResolver.show(nil, %{id: post.id}, nil)
      assert found.id == post.id
    end

    it "returns not found when post does not exist" do
      {:error, error} = PostResolver.show(nil, %{id: 1}, nil)
      assert error == "Not found"
    end
  end

  describe "#create" do
    it "creates valid post with authenticated user" do
      user = insert(:user)

      {:ok, post} =
        PostResolver.create(nil, %{body: "Hello"}, %{
          context: %{current_user: user}
        })

      assert post.body == "Hello"
      assert post.user_id == user.id
    end

    it "returns error for missing params" do
      user = insert(:user)

      {:error, error} =
        PostResolver.create(nil, %{}, %{
          context: %{current_user: user}
        })

      assert error == [[field: :body, message: "Can't be blank"]]
    end

    it "returns error for unauthenticated user" do
      {:error, error} = PostResolver.create(nil, %{body: "Hello"}, nil)

      assert error == "Unauthenticated"
    end
  end
end

The resolver tests are also fairly simple — they’re unit tests too, sitting one layer above the models. We insert the necessary test data, call the resolver under test, and assert that the correct result comes back.

Integration tests are a little more involved. We first need to build a connection to the server (possibly an authenticated one), then send a real query and make sure we get the correct result back. I found this post very helpful in learning how to structure integration tests for Absinthe.

First, we create a helper file that will contain some common functionality needed for integration testing:

# test/support/absinthe_helpers.ex
defmodule Socializer.AbsintheHelpers do
  alias Socializer.Guardian

  def authenticate_conn(conn, user) do
    {:ok, token, _claims} = Guardian.encode_and_sign(user)
    Plug.Conn.put_req_header(conn, "authorization"."Bearer #{token}")
  end

  def query_skeleton(query, query_name) do
    %{
      "operationName" => "#{query_name}",
      "query" => "query #{query_name} #{query}",
      "variables" => "{}"
    }
  end

  def mutation_skeleton(query) do
    %{
      "operationName" => "",
      "query" => "mutation #{query}",
      "variables" => ""
    }
  end
end

This file contains three helper functions. The first takes a connection and a user, and authenticates the connection by adding the user’s token to the authorization header. The second and third each take a query and wrap it in the JSON-style params structure that the GraphQL endpoint expects when we POST a query or mutation in the integration tests.
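For instance (illustrative only, based on the helper above), building the skeleton for a posts query yields the map we will POST to the endpoint:

# In a test or IEx session:
AbsintheHelpers.query_skeleton("{ posts { id body } }", "posts")
# => %{
#      "operationName" => "posts",
#      "query" => "query posts { posts { id body } }",
#      "variables" => "{}"
#    }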

Now, on to the test itself:

# test/socializer_web/integration/post_resolver_test.exs
defmodule SocializerWeb.Integration.PostResolverTest do
  use SocializerWeb.ConnCase
  alias Socializer.AbsintheHelpers

  describe "#list" do
    it "returns posts" do
      post_a = insert(:post)
      post_b = insert(:post)

      query = """ { posts { id body } } """

      res =
        build_conn()
        |> post("/graphiql", AbsintheHelpers.query_skeleton(query, "posts"))

      posts = json_response(res, 200)["data"]["posts"]
      assert List.first(posts)["id"] == to_string(post_b.id)
      assert List.last(posts)["id"] == to_string(post_a.id)
    end
  end

  #...
end

This test exercises our endpoint by querying for a list of posts. We first insert some post records into the database, write a query, POST it to the server, and finally check the server’s response to make sure the results are what we expect.

There’s a very similar test for querying an individual post, which we won’t go through here (if you want to see all the integration tests, you can find them in the repo). Instead, let’s look at the integration test for the mutation that creates a post.

# test/socializer_web/integration/post_resolver_test.exs
defmodule SocializerWeb.Integration.PostResolverTest do
  #...

  describe "#create" do
    it "creates post" do
      user = insert(:user)

      mutation = """ { createPost(body: "A few thoughts") { body user { id } } } """

      res =
        build_conn()
        |> AbsintheHelpers.authenticate_conn(user)
        |> post("/graphiql", AbsintheHelpers.mutation_skeleton(mutation))

      post = json_response(res, 200)["data"]["createPost"]
      assert post["body"] == "A few thoughts"
      assert post["user"]["id"] == to_string(user.id)
    end
  end
end

Very similar, with only two differences: this time we pipe the connection through AbsintheHelpers.authenticate_conn(user) to add the user’s token to the request headers, and we call mutation_skeleton instead of query_skeleton.

What about subscriptions? Subscription tests also need some setup to establish a socket connection, after which we can set up and exercise our subscriptions. I found this article very helpful in understanding how to build tests for subscriptions.

First, we create a new case file to provide basic scaffolding for the subscription test. The code looks like this:

# test/support/subscription_case.ex
defmodule SocializerWeb.SubscriptionCase do
  use ExUnit.CaseTemplate

  alias Socializer.Guardian

  using do
    quote do
      use SocializerWeb.ChannelCase
      use Absinthe.Phoenix.SubscriptionTest, schema: SocializerWeb.Schema
      use ExSpec
      import Socializer.Factory

      setup do
        user = insert(:user)

        # When connecting to a socket, if you pass a token we will set the context's `current_user`
        params = %{
          "token" => sign_auth_token(user)
        }

        {:ok, socket} = Phoenix.ChannelTest.connect(SocializerWeb.AbsintheSocket, params)
        {:ok, socket} = Absinthe.Phoenix.SubscriptionTest.join_absinthe(socket)

        {:ok, socket: socket, user: user}
      end

      defp sign_auth_token(user) do
        {:ok, token, _claims} = Guardian.encode_and_sign(user)
        token
      end
    end
  end
end

After some common imports, we define a setup step that inserts a new user and establishes a websocket connection authenticated with that user’s token. We return the socket and the user so each test can use them.

Next, let’s take a look at the test itself:

defmodule SocializerWeb.PostSubscriptionsTest do
  use SocializerWeb.SubscriptionCase

  describe "Post subscription" do
    it "updates on new post", % {socket: socket} do
      # Query to establish the subscription.
      subscription_query = """ subscription { postCreated { id body } } """

      # Push the query onto the socket.
      ref = push_doc(socket, subscription_query)

      # Assert that the subscription was successfully created.
      assert_reply(ref, :ok, %{subscriptionId: _subscription_id})

      # Query to create a new post to invoke the subscription.
      create_post_mutation = """ mutation CreatePost { createPost(body: "Big discussion") { id body } } """

      # Push the mutation onto the socket.
      ref =
        push_doc(
          socket,
          create_post_mutation
        )

      # Assert that the mutation successfully created the post.
      assert_reply(ref, :ok, reply)
      data = reply.data["createPost"]
      assert data["body"] = ="Big discussion"

      # Assert that the subscription notified us of the new post.
      assert_push("subscription:data", push)
      data = push.result.data["postCreated"]
      assert data["body"] = ="Big discussion"
    end
  end
end

First, we write a subscription query and push it onto the socket we established in the setup step. Next, we write a mutation that should trigger the subscription (creating a new post) and push that onto the socket as well. Finally, we check the pushed payload and assert that the newly created post was delivered to us through the subscription. There’s more upfront setup here, but it gives us a true integration test of the full subscription lifecycle.

The client

That’s a rough tour of the server side — it processes GraphQL queries by defining types, implementing resolvers, querying models, and persisting data. Now let’s look at how the client is set up.

We’ll start with create-react-app. It’s a great way to take a React project from zero to one — it scaffolds a “Hello World” React application with sensible default settings and structure, and hides a lot of configuration.

I use React Router to handle routing; it lets users move between the list of posts, a single post page, the chat page, and so on. The root component of our application looks like this:

// client/src/App.js
import React, { useRef } from "react";
import { ApolloProvider } from "react-apollo";
import { BrowserRouter, Switch, Route } from "react-router-dom";
import { createClient } from "util/apollo";
import { Meta, Nav } from "components";
import { Chat, Home, Login, Post, Signup } from "pages";

const App = () => {
  const client = useRef(createClient());

  return (
    <ApolloProvider client={client.current}>
      <BrowserRouter>
        <Meta />
        <Nav />

        <Switch>
          <Route path="/login" component={Login} />
          <Route path="/signup" component={Signup} />
          <Route path="/posts/:id" component={Post} />
          <Route path="/chat/:id?" component={Chat} />
          <Route component={Home} />
        </Switch>
      </BrowserRouter>
    </ApolloProvider>
  );
};

A few notable points. util/apollo exports a createClient function, which creates and returns an Apollo client instance (we’ll dig into it shortly). Wrapping createClient in useRef keeps the same instance around for the lifetime of the application (that is, across all re-renders). ApolloProvider is a wrapper component that makes the client available to queries in all of its child components. BrowserRouter uses the HTML5 History API to keep the URL in sync as we navigate the app.

Switch and Route deserve a separate discussion. React Router is built around the concept of dynamic routing. Most sites use static routing: the URL matches exactly one route, and that route renders an entire page. With dynamic routing, routes are scattered throughout the application and a URL can match several of them. That may sound confusing, but it’s actually quite powerful once you get used to it — it makes it easy for different components of a page to respond to different parts of the route. For example, imagine a Facebook Messenger-style page (the Socializer chat interface is very similar) with a list of conversations on the left and the selected conversation on the right. Dynamic routing lets me express that as:

const App = () => {
  return (
    // ...
    <Route path="/chat/:id?" component={Chat} />
    // ...
  );
};

const Chat = () => {
  return (
    <div>
      <ChatSidebar />

      <Switch>
        <Route path="/chat/:id" component={Conversation} />
        <Route component={EmptyState} />
      </Switch>
    </div>
  );
};

If the path starts with /chat (and may or may not end with an ID, for example /chat/123), the root-level App renders the Chat component. Chat always renders the conversation sidebar, then renders its own routes: a Conversation component if the path has an ID, and an EmptyState otherwise (note the ? in /chat/:id? — without it, the :id parameter would be required rather than optional). This is the power of dynamic routing — it lets different parts of the interface respond to different parts of the current URL, keeping path-based concerns localized to the relevant components.

Even with dynamic routing, sometimes you want only one route to render (much like traditional static routing). That’s where the Switch component comes in. Without a Switch, React Router renders every route that matches the current URL — in the Chat component above we’d get both the Conversation and the EmptyState. Switch tells React Router to render only the first route that matches and ignore the rest.

The Apollo client

Now let’s take a closer look at the Apollo client — specifically, the createClient function mentioned above. The util/apollo.js file looks like this:

// client/src/util/apollo.js
import ApolloClient from "apollo-client";
import { InMemoryCache } from "apollo-cache-inmemory";
import * as AbsintheSocket from "@absinthe/socket";
import { createAbsintheSocketLink } from "@absinthe/socket-apollo-link";
import { Socket as PhoenixSocket } from "phoenix";
import { createHttpLink } from "apollo-link-http";
import { hasSubscription } from "@jumpn/utils-graphql";
import { split } from "apollo-link";
import { setContext } from "apollo-link-context";
import Cookies from "js-cookie";

const HTTP_URI =
  process.env.NODE_ENV === "production"
    ? "https://brisk-hospitable-indianelephant.gigalixirapp.com"
    : "http://localhost:4000";

const WS_URI =
  process.env.NODE_ENV === "production"
    ? "wss://brisk-hospitable-indianelephant.gigalixirapp.com/socket"
    : "ws://localhost:4000/socket";

// ...

We start simply, importing the dependencies we’ll need and setting the HTTP and websocket URLs as constants based on the current environment — pointing at my Gigalixir instance in production and at localhost in development.

// client/src/util/apollo.js
// ...

export const createClient = () => {
  // Create the basic HTTP link.
  const httpLink = createHttpLink({ uri: HTTP_URI });

  // Create an Absinthe socket wrapped around a standard
  // Phoenix websocket connection.
  const absintheSocket = AbsintheSocket.create(
    new PhoenixSocket(WS_URI, {
      params: () => {
        if (Cookies.get("token")) {
          return { token: Cookies.get("token") };
        } else {
          return {};
        }
      },
    }),
  );

  // Use the Absinthe helper to create a websocket link around
  // the socket.
  const socketLink = createAbsintheSocketLink(absintheSocket);

  // ...
};

The Apollo client requires you to provide a link — essentially, the connection over which the client talks to your GraphQL server. There are two common types of link: HTTP links, which send requests to the GraphQL server over standard HTTP, and websocket links, which open a websocket connection and send requests over the socket. In our case we use both: queries and mutations in general go over the HTTP link, and subscriptions go over the websocket link.

// client/src/util/apollo.js
export const createClient = () => {
  // ...

  // Split traffic based on type -- queries and mutations go
  // through the HTTP link, subscriptions go through the
  // websocket link.
  const splitLink = split(
    (operation) => hasSubscription(operation.query),
    socketLink,
    httpLink,
  );

  // Add a wrapper to set the auth token (if any) to the
  // authorization header on HTTP requests.
  const authLink = setContext((_, { headers }) => {
    // Get the authentication token from the cookie if it exists.
    const token = Cookies.get("token");

    // Return the headers to the context so httpLink can read them.
    return {
      headers: {
        ...headers,
        authorization: token ? `Bearer ${token}` : "",
      },
    };
  });

  const link = authLink.concat(splitLink);

  // ...
};

Apollo provides a split function that lets you route different requests to different links based on a criterion of your choosing — think of it as a ternary: if the operation contains a subscription, it goes over the socket link; otherwise (a query or mutation) it goes over the HTTP link.

We also need to authenticate both links when the user is logged in. When the user logs in, we store their authentication token in a cookie named token (more on that later). When we open the websocket connection with Phoenix, we pass the token as a connection param; on the HTTP link, we use the setContext wrapper to set the token in the request’s authorization header.

// client/src/util/apollo.js
export const createClient = () => {
  // ...

  return new ApolloClient({
    cache: new InMemoryCache(),
    link,
  });
};

As shown above, the Apollo client needs a cache instance in addition to the link. Apollo caches the results of requests so it can avoid refetching the same data. The basic InMemoryCache is fine for most use cases — it simply stores the fetched data in memory in the browser.
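As a rough illustration (assuming the default fetch policy, and a client and GET_POSTS query like the ones defined elsewhere in this article), repeating a query is answered from the cache unless you explicitly opt out:

// Illustrative only -- default "cache-first" behavior.
client.query({ query: GET_POSTS });                                // hits the network
client.query({ query: GET_POSTS });                                // served from the cache
client.query({ query: GET_POSTS, fetchPolicy: "network-only" });   // forces a refetch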

Using the client – our first query

OK — we’ve set up an Apollo client instance and made it available throughout the application via the ApolloProvider wrapper. Now let’s look at running queries and mutations. We’ll start with the Posts component, which renders the list of posts on our home page.

// client/src/components/Posts.js
import React, { Fragment } from "react";
import { Query } from "react-apollo";
import gql from "graphql-tag";
import produce from "immer";
import { ErrorMessage, Feed, Loading } from "components";

export const GET_POSTS = gql`
  {
    posts {
      id
      body
      insertedAt
      user {
        id
        name
        gravatarMd5
      }
    }
  }
`;

export const POSTS_SUBSCRIPTION = gql`
  subscription onPostCreated {
    postCreated {
      id
      body
      insertedAt
      user {
        id
        name
        gravatarMd5
      }
    }
  }
`;

// ...

First come the imports, then the queries for the posts we want to render. There are two: a basic query that fetches the list of posts (along with each post’s author), and a subscription that notifies us of new posts so we can update the page in real time and keep the list current.

// client/src/components/Posts.js
// ...

const Posts = () => {
  return (
    <Fragment>
      <h4>Feed</h4>
      <Query query={GET_POSTS}>
        {({ loading, error, data, subscribeToMore }) => {
          if (loading) return <Loading />;
          if (error) return <ErrorMessage message={error.message} />;

          return (
            <Feed
              feedType="post"
              items={data.posts}
              subscribeToNew={() =>
                subscribeToMore({
                  document: POSTS_SUBSCRIPTION,
                  updateQuery: (prev, { subscriptionData }) => {
                    if (!subscriptionData.data) return prev;
                    const newPost = subscriptionData.data.postCreated;

                    return produce(prev, (next) => {
                      next.posts.unshift(newPost);
                    });
                  },
                })
              }
            />
          );
        }}
      </Query>
    </Fragment>
  );
};

Now for the component itself. To perform the basic query, we render Apollo’s Query component, which provides render props — loading, error, data, and subscribeToMore — to its child function. If the query is still loading, we render a simple loading spinner. If there was an error, we render a generic ErrorMessage component. Otherwise, we render a Feed component (data.posts contains the posts to render, in the same shape as the query).

subscribeToMore is an Apollo helper that sets up a subscription for new data beyond what the user is already viewing. It should be called once when the child component mounts, which is why it’s passed down to Feed as a prop — once Feed mounts, it is responsible for calling subscribeToNew. We give subscribeToMore our subscription query and an updateQuery callback that runs whenever Apollo is notified that a new post was created. When that happens, we simply unshift the new post onto the current posts array, using immer to return a new array so the component re-renders correctly.
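The Feed component itself isn’t shown in this article; conceptually it just needs to call subscribeToNew once on mount and then render its items. A minimal sketch — assuming it delegates the on-mount call to the Subscriber container covered in the testing section below, and using a hypothetical FeedItem child — might look like this:

// client/src/components/Feed.js (illustrative sketch, not the actual component)
import React from "react";
import { Subscriber } from "containers";
import { FeedItem } from "components"; // hypothetical item renderer

const Feed = ({ feedType, items, subscribeToNew }) => (
  <Subscriber subscribeToNew={subscribeToNew}>
    {items.map((item) => (
      <FeedItem key={item.id} feedType={feedType} item={item} />
    ))}
  </Subscriber>
);

export default Feed;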

Authentication (and mutation)

Now we have a front page with a list of posts that updates in real time as new posts arrive — but how do we create new posts? First, we need to let users log in to their accounts so we can associate posts with them. That calls for a mutation: we send the email and password to the server, and the server responds with a token that authenticates the user. Let’s start with the login page:

// client/src/pages/Login.js
import React, { Fragment, useContext, useState } from "react";
import { Mutation } from "react-apollo";
import { Button, Col, Container, Form, Row } from "react-bootstrap";
import Helmet from "react-helmet";
import gql from "graphql-tag";
import { Redirect } from "react-router-dom";
import renderIf from "render-if";
import { AuthContext } from "util/context";

export const LOGIN = gql`
  mutation Login($email: String!, $password: String!) {
    authenticate(email: $email, password: $password) {
      id
      token
    }
  }
`;

The first part is very similar to the Posts component — we import the required dependencies and define the mutation for logging in. The mutation takes an email and password as arguments, and we ask for the authenticated user’s ID and their authentication token in return.

// client/src/pages/Login.js
// ...

const Login = () => {
  const { token, setAuth } = useContext(AuthContext);
  const [isInvalid, setIsInvalid] = useState(false);
  const [email, setEmail] = useState("");
  const [password, setPassword] = useState("");

  if (token) {
    return <Redirect to="/" />;
  }

  // ...
};

In the component, we first pull the current token and a setAuth function out of context (more on these shortly). We also use useState for some local state to hold the in-progress email and password values, and whether the submitted credentials were invalid (so we can show an error state on the form). Finally, if the user already has an authentication token — meaning they’re logged in — we redirect them to the home page.

// client/src/pages/Login.js
// ...

const Login = () => {
  // ...

  return (
    <Fragment>
      <Helmet>
        <title>Socializer | Log in</title>
        <meta property="og:title" content="Socializer | Log in" />
      </Helmet>
      <Mutation mutation={LOGIN} onError={() => setIsInvalid(true)}>
        {(login, { data, loading, error }) => {
          if (data) {
            const {
              authenticate: { id, token },
            } = data;
            setAuth({ id, token });
          }

          return (
            <Container>
              <Row>
                <Col md={6} xs={12}>
                  <Form
                    data-testid="login-form"
                    onSubmit={(e) => {
                      e.preventDefault();
                      login({ variables: { email, password } });
                    }}
                  >
                    <Form.Group controlId="formEmail">
                      <Form.Label>Email address</Form.Label>
                      <Form.Control
                        type="email"
                        placeholder="[email protected]"
                        value={email}
                        onChange={(e) => {
                          setEmail(e.target.value);
                          setIsInvalid(false);
                        }}
                        isInvalid={isInvalid}
                      />
                      {renderIf(error)(
                        <Form.Control.Feedback type="invalid">
                          Email or password is invalid
                        </Form.Control.Feedback>,
                      )}
                    </Form.Group>

                    <Form.Group controlId="formPassword">
                      <Form.Label>Password</Form.Label>
                      <Form.Control
                        type="password"
                        placeholder="Password"
                        value={password}
                        onChange={(e) => {
                          setPassword(e.target.value);
                          setIsInvalid(false);
                        }}
                        isInvalid={isInvalid}
                      />
                    </Form.Group>

                    <Button variant="primary" type="submit" disabled={loading}>
                      {loading ? "Logging in..." : "Log in"}
                    </Button>
                  </Form>
                </Col>
              </Row>
            </Container>
          );
        }}
      </Mutation>
    </Fragment>
  );
};

export default Login;

This block looks long, but don’t be put off — most of it is just Bootstrap form markup. We start with a Helmet component (from react-helmet): this is a top-level page component (unlike Posts, which is just a child rendered by the Home page), so we want to set the browser title and some metadata. Next we render the Mutation component and pass it our mutation. If the mutation returns an error, the onError callback sets the invalid flag so the form can display an error. Mutation passes a function (here named login) to its child function, along with the same { data, loading, error } object we saw with the Query component. If data is present, the mutation succeeded, and we can store the authentication token and user ID via the setAuth function. The rest is fairly standard React — we render the inputs, update state as they change, and show an error message if the user tried to log in with an invalid email or password.

So what is AuthContext? When the user successfully authenticates, we need to store their token somewhere on the client. GraphQL doesn’t help us here — it’s a chicken-and-egg problem, since the token obtained from one request has to be attached to every subsequent request. We could reach for Redux and keep the token in a global store, but that feels like overkill for a single value. Instead, we can use the React Context API to hold the token at the root of our application and access it wherever it’s needed.

First, let’s create a helper function to help us create and export the context:

// client/src/util/context.js
import { createContext } from "react";

export const AuthContext = createContext(null);

Next, let’s create a StateProvider wrapper component to render at the root of the application — it will hold the authentication state and give us a way to update it.

// client/src/containers/StateProvider.js
import React, { useEffect, useState } from "react";
import { withApollo } from "react-apollo";
import Cookies from "js-cookie";
import { refreshSocket } from "util/apollo";
import { AuthContext } from "util/context";

const StateProvider = ({ client, socket, children }) => {
  const [token, setToken] = useState(Cookies.get("token"));
  const [userId, setUserId] = useState(Cookies.get("userId"));

  // If the token changed (i.e. the user logged in
  // or out), clear the Apollo store and refresh the
  // websocket connection.
  useEffect(() => {
    if (!token) client.clearStore();
    if (socket) refreshSocket(socket);
  }, [token]);

  const setAuth = (data) => {
    if (data) {
      const { id, token } = data;
      Cookies.set("token", token);
      Cookies.set("userId", id);
      setToken(token);
      setUserId(id);
    } else {
      Cookies.remove("token");
      Cookies.remove("userId");
      setToken(null);
      setUserId(null);
    }
  };

  return (
    <AuthContext.Provider value={{ token, userId, setAuth }}>
      {children}
    </AuthContext.Provider>
  );
};

export default withApollo(StateProvider);

There’s a lot going on here. First, we set up state for the authenticated user’s token and userId, initializing it from cookies so the user stays logged in across page refreshes. Next we define our setAuth function: calling it with null logs the user out; otherwise the provided token and userId log the user in. Either way, the function updates both the local state and the cookies.

There’s one big wrinkle in combining authentication with Apollo’s websocket link. When we first create the websocket, we use the token if the user is authenticated and omit it otherwise. But when the authentication state changes, we need to reset the websocket connection to match. If a logged-out user logs in, we need to reconnect with their new token so they receive real-time updates for things that require authentication, such as chat messages. If a logged-in user logs out, we need to reset the websocket to an unauthenticated state so they stop receiving real-time updates for the account they just left. This turned out to be surprisingly hard — there’s no well-documented solution, and it took me hours to work out. I ended up manually implementing a reset function for the socket:

// client/src/util/apollo.js
export const refreshSocket = (socket) => {
  socket.phoenixSocket.disconnect();
  socket.phoenixSocket.channels[0].leave();
  socket.channel = socket.phoenixSocket.channel("__absinthe__:control");
  socket.channelJoinCreated = false;
  socket.phoenixSocket.connect();
};

This disconnects the Phoenix socket, leaves the existing Phoenix channel used for GraphQL updates, creates a new channel (with the same name as the default one Absinthe creates), marks the channel as not yet joined (so Absinthe re-joins it on reconnect), and then reconnects the socket. In the createClient function above, the Phoenix socket is configured to read the token from the cookie dynamically before each connection, so whenever it reconnects it picks up the new authentication state. It’s a shame there’s no cleaner solution to such a seemingly common problem, but with a bit of manual work it functions fine.

Finally, the useEffect in our StateProvider is where refreshSocket gets called. The second argument, [token], tells React to re-run the effect whenever the token value changes. If the user just logged out, we also call client.clearStore() to make sure the Apollo client doesn’t keep cached query results containing data that requires authentication, such as the user’s conversations or messages.

That’s about it for the client side. You can browse the rest of the components for more examples of queries, mutations, and subscriptions — they all follow roughly the same patterns described above.

Testing – client side

Let’s write some tests to cover our React code. Jest comes built in (create-react-app includes it by default); it’s a simple, intuitive test runner for JavaScript that also offers nice extras such as snapshot testing, which we’ll use in our first test case.

I’m a big fan of react-testing-library for writing React tests — it provides a simple API that encourages rendering components and testing them from the user’s perspective, independent of their implementation. As a bonus, its helpers nudge you toward accessible markup, because you find and interact with DOM nodes the way a user would (by properly labeled text and so on) rather than reaching into component internals.

Let’s start by writing a simple test for the Loading component. This component just renders static HTML, so there is no logic to test; We just want to make sure the HTML is rendered as we expect.

// client/src/components/Loading.test.js
import React from "react";
import { render } from "react-testing-library";
import Loading from "./Loading";

describe("Loading", () => {
  it("renders correctly", () = > {const { container } = render(<Loading />);
    expect(container.firstChild).toMatchSnapshot();
  });
});

When you call.tomatchsnapshot (), jest will create a file in the relative path of __snapshots__/ loading.test.js.snap to record the current state. Subsequent tests compare the output to the snapshot we recorded and fail if it does not match the snapshot. The snapshot file looks like this:

// client/src/components/__snapshots__/Loading.test.js.snap
// Jest Snapshot v1, https://goo.gl/fbAQLP

exports[`Loading renders correctly 1`] = `
  <!-- the component's rendered markup (stripped in this copy), containing the text "Loading..." -->
`;

In this case, because the HTML never changes, the snapshot test isn’t hugely valuable — though it does verify that the component renders without errors. Snapshot tests shine in more involved cases, where they ensure a component’s output only changes when you intend it to — for example, if you refactor a component’s internals without wanting its output to change, a failing snapshot will tell you if you made a mistake.

Next, let’s test a component connected to Apollo. Here things get a little more complicated: the component expects an Apollo client in its context, and we need to mock the query so we can check that the component handles the response correctly.

// client/src/components/Posts.test.js
import React from "react";
import { render, wait } from "react-testing-library";
import { MockedProvider } from "react-apollo/test-utils";
import { MemoryRouter } from "react-router-dom";
import tk from "timekeeper";
import { Subscriber } from "containers";
import { AuthContext } from "util/context";
import Posts, { GET_POSTS, POSTS_SUBSCRIPTION } from "./Posts";

jest.mock("containers/Subscriber", () =>
  jest.fn().mockImplementation(({ children }) => children),
);

describe("Posts", () => {
  beforeEach(() => {
    tk.freeze("2019-04-20");
  });

  afterEach(() => {
    tk.reset();
  });

  // ...
});

First, some imports and mocks. The mock here prevents the Posts component’s subscription from being registered in tests where we don’t want it. This was a point of frustration — Apollo has documentation for mocking queries and mutations, but very little for mocking subscriptions, and I kept running into cryptic internal errors. I couldn’t find a reliable way to mock the query when I only wanted the component to perform its initial query (rather than simulate receiving an update from its subscription).

It does, however, give us a nice opportunity to show off Jest’s mocking — a case where it works very well. I have a Subscriber component that normally calls subscribeToNew on mount and then just returns its children:

// client/src/containers/Subscriber.js
import { useEffect } from "react";

const Subscriber = ({ subscribeToNew, children }) => {
  useEffect(() => {
    subscribeToNew();
  }, []);

  return children;
};

export default Subscriber;

So in my tests, I simply mock this component’s implementation to return its children without actually calling subscribeToNew.

Finally, I use timekeeper to freeze time for each test case — Posts renders some text based on each post’s publication time relative to now (for example, “two days ago”), so the tests need to always run at the “same” time, otherwise the snapshots would start failing as time passes.

// client/src/components/Posts.test.js
// ...

describe("Posts", () = > {// ...

  it("renders correctly when loading", () = > {const { container } = render(
      <MemoryRouter>
        <AuthContext.Provider value={{}}>
          <MockedProvider mocks={} [] addTypename={false}>
            <Posts />
          </MockedProvider>
        </AuthContext.Provider>
      </MemoryRouter>,); expect(container).toMatchSnapshot(); }); / /... });Copy the code

Our first test checks the loading state. We have to wrap the component in several providers: MemoryRouter, which provides a simulated routing context for React Router’s Link and Route components; AuthContext.Provider, which provides the authentication state; and Apollo’s MockedProvider. Because we take a snapshot immediately, we don’t actually need to mock anything — the immediate snapshot captures the loading state before Apollo has a chance to execute the query.

// client/src/components/Posts.test.js
// ...

describe("Posts", () = > {// ...

  it("renders correctly when loaded".async() = > {const mocks = [
      {
        request: {
          query: GET_POSTS,
        },
        result: {
          data: {
            posts: [{id: 1.body: "Thoughts".insertedAt: "2019-04-18T00:00:00".user: {
                  id: 1.name: "John Smith".gravatarMd5: "abc",},},],},},},];const { container, getByText } = render(
      <MemoryRouter>
        <AuthContext.Provider value={{}}>
          <MockedProvider mocks={mocks} addTypename={false}>
            <Posts />
          </MockedProvider>
        </AuthContext.Provider>
      </MemoryRouter>,); await wait(() => getByText("Thoughts")); expect(container).toMatchSnapshot(); }); / /... });Copy the code

For this test, we want to snapshot the component once the posts have loaded and rendered. To do that, we make the test async and await the end of the loading state using react-testing-library’s wait helper. wait(() => ...) simply retries the function until it no longer throws — usually for no more than a fraction of a second. Once the post’s text appears, we snapshot the whole component to make sure it looks as expected.

// client/src/components/Posts.test.js
// ...

describe("Posts", () = > {// ...

  it("renders correctly after created post".async () => {
    Subscriber.mockImplementation((props) = > {
      const { default: ActualSubscriber } = jest.requireActual(
        "containers/Subscriber",);return<ActualSubscriber {... props} />; }); const mocks = [ { request: { query: GET_POSTS, }, result: { data: { posts: [ { id: 1, body: "Thoughts", insertedAt: "2019-04-18T00:00:00", user: { id: 1, name: "John Smith", gravatarMd5: "abc", }, }, ], }, }, }, { request: { query: POSTS_SUBSCRIPTION, }, result: { data: { postCreated: { id: 2, body: "Opinions", insertedAt: "2019-04-19T00:00:00", user: { id: 2, name: "Jane Thompson", gravatarMd5: "def", }, }, }, }, }, ]; const { container, getByText } = render( <MemoryRouter> <AuthContext.Provider value={{}}> <MockedProvider mocks={mocks} addTypename={false}> <Posts /> </MockedProvider> </AuthContext.Provider> </MemoryRouter>, ); await wait(() => getByText("Opinions")); expect(container).toMatchSnapshot(); }); });Copy the code

Finally, we test the subscription to make sure that when the component receives a new post, it renders it as expected. In this test we update the Subscriber mock so that it returns the real implementation and actually subscribes to changes. We also mock the POSTS_SUBSCRIPTION query to simulate the subscription receiving a new post. As in the previous test, we wait for the query to finish (and the new post’s text to appear) and snapshot the resulting HTML.

And that’s pretty much it. Jest and react-testing-library are both powerful and make it easy to test components. Testing Apollo was a little trickier, but with judicious use of mocks we were able to write fairly complete tests covering all the major component states.

Server-side rendering

There’s one remaining problem with our client: all of the HTML is rendered on the client side. The HTML returned from the server is an empty index.html file with a single root element — nothing is visible until the JavaScript bundle loads and React renders in the browser.

This is where server-side rendering (SSR) comes in. Essentially, instead of serving a static index.html, we route requests to a Node.js server. The server renders the requested components (resolving any GraphQL queries along the way) and returns the resulting HTML, with the client-side JavaScript then taking over once it loads.

Unfortunately, I found there isn’t really one canonical way to configure SSR — the basic idea is always the same (run a Node.js server that can render your components), but there are several competing implementations and none of them is standard. Most of my application’s setup comes from cra-ssr, which provides a very approachable SSR implementation for apps built with create-react-app. Since the cra-ssr tutorial is already a good introduction, I won’t go into much more depth here — suffice it to say that SSR is great and makes the application load very quickly, although it is somewhat involved to set up.
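To make the idea concrete, here is a minimal sketch of an SSR handler — this is not the cra-ssr setup the app actually uses, the file paths and component names are illustrative, and it assumes a server-only Apollo client with a plain HTTP link:

// server/index.js (illustrative sketch only)
import express from "express";
import fetch from "node-fetch";
import React from "react";
import { renderToString } from "react-dom/server";
import ApolloClient from "apollo-client";
import { createHttpLink } from "apollo-link-http";
import { InMemoryCache } from "apollo-cache-inmemory";
import { ApolloProvider, getDataFromTree } from "react-apollo";
import { StaticRouter } from "react-router-dom";
import Routes from "../client/src/Routes"; // hypothetical: the route tree without BrowserRouter

const server = express();

server.get("/*", async (req, res) => {
  // A fresh, server-only client per request (no websocket link needed for SSR).
  const client = new ApolloClient({
    ssrMode: true,
    link: createHttpLink({ uri: "http://localhost:4000", fetch }),
    cache: new InMemoryCache(),
  });

  const app = (
    <ApolloProvider client={client}>
      <StaticRouter location={req.url} context={{}}>
        <Routes />
      </StaticRouter>
    </ApolloProvider>
  );

  // Walk the tree and execute the GraphQL queries it contains.
  await getDataFromTree(app);

  // Render the loaded tree and embed the Apollo cache so the
  // client can pick up where the server left off without refetching.
  const html = renderToString(app);
  const state = JSON.stringify(client.extract());

  res.send(
    "<!doctype html><html><body>" +
      `<div id="root">${html}</div>` +
      `<script>window.__APOLLO_STATE__=${state}</script>` +
      "</body></html>",
  );
});

server.listen(3000);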

Conclusions and Results

Thanks for making it this far! There’s a lot of ground covered here, because I wanted to really dig into a non-trivial application, exercise all of these technologies end to end, and solve some real-world problems along the way. If you’ve read this far, hopefully you now have a good sense of how all the pieces fit together. You can see the full code on GitHub, or try the online demo. The demo is deployed on a free Heroku dyno, so it may take up to 30 seconds for the server to wake up when you first visit. If you have any questions, leave them in the comments and I’ll do my best to answer.

My experience wasn’t without frustration. Some of that was to be expected — there’s a learning curve with any new framework or library — but there were also places where better documentation or tooling could have saved me a lot of time and headaches. With Apollo in particular, I struggled to figure out how to make the websocket re-initialize its connection after the authentication state changed; that really should be covered in the documentation, but I couldn’t find anything. Similarly, I had a lot of trouble testing subscriptions and eventually gave up and used mocks instead. The testing documentation is fine for the basics but felt shallow once I wanted to write more advanced cases. I was also frequently frustrated by missing API documentation, mostly in parts of the Apollo and Absinthe client libraries — for example, when I was working out how to reset the websocket connection, I couldn’t find any documentation of the Absinthe socket or Apollo link instances, and my only option was to read the GitHub source from end to end. My experience with Apollo was still much better than my experience with Relay a few years ago — but next time I’ll budget extra time for digging through source code whenever I want to do something off the beaten path.

All in all, I’d rate this stack very highly, and I really enjoyed the project. Elixir and Phoenix feel like a breath of fresh air; there’s a bit of a learning curve coming from Rails, but I really like some of Elixir’s language features, such as pattern matching and the pipe operator. Elixir has plenty of fresh ideas (plus plenty of battle-tested concepts from functional programming) that make it easier to write meaningful, good-looking code. Absinthe was a pleasure to use: it’s well implemented, well documented, and covers just about every reasonable use case for a GraphQL server, and overall I found that GraphQL delivered well on its core promise. It was easy to query exactly the data each page needs, and easy to keep it updated in real time with subscriptions. I’ve always been a big fan of React and React Router, and this time was no exception — they make it straightforward to build complex, interactive front-end UIs. In the end, I’m very happy with the result — as a user, the app loads and navigates quickly, and everything updates live so you’re always in sync. If the ultimate measure of a tech stack is the user experience it produces, this combination is a clear success.


  • Elixir, Phoenix, Absinthe, GraphQL, React, and Apollo: an absurdly deep dive – Part 1
  • Elixir, Phoenix, Absinthe, GraphQL, React, and Apollo: an absurdly deep dive – Part 2
