IE retrospective • Web Services

Iconic Engine, Web-Services

Web services built with Kubernetes, Docker and Node.js.

  1. Service for "friending" and interacting with other users
  2. Authentication service using OAuth2
  3. Service for deploying other services
  4. GeoIP service
  5. Health-monitoring service

2. Social Profile

Javascript, GraphQL, Koa, SSE

Social-profile was used to send "friend requests" and support interactions similar to those found in messaging apps like Telegram and Discord. It was used by Iconic Engine's MobileVR applications, such as Vasoo. Its showcase feature was event-streaming: event streams notified users when friends came online or requested to join a room.

These screenshots show interactive representations of its GraphQL API,

[screenshots: Voyager views of the social-profile API (queries, subscriptions, mutations)]

example event stream curl request
curl https://social-profile-prod.main.eks.iconicengine.com \
 -H "Authorization: Bearer $BEARER" \
 -H "Accept: text/event-stream" \
 -G --data-urlencode "$(echo '
query=subscription {
  events {
    id,
    user { id, displayName },
    timePublished,
    roomId,
    eventType,
    ...on EventMembershipType {
      membershipId
      membershipType
      groupType
      userSender { id displayName }
    }
    ...on EventAvailabilityType {
      presenceType
    }
  }
}' | tr -d '\n')"
# event: heartbeat
# data: {"heartbeat":1670280881220}
#
# data: {"events":["…"]}

Many "flows" were unit-tested, for example: login, then create a room, then invite a friend to the room, then accept the room invitation, and so on.

Event streams were carefully architected to reduce the number of listeners attached to RethinkDB's queue-based changefeed system (see this document) and, for a time, the service supported both major streaming technologies at once: WebSockets and Server-Sent Events. Read this document for my comparison of the two.

What I did

I created this service. Social profile stood out for reliability at all times, particularly during important events and demonstrations.


3. Auth Service

Javascript, REST, Koa

[screenshot: login]

Auth-service was an OAuth2 authorization and authentication service supporting most OAuth2 "grant types". It was used for authenticating Iconic Engine request traffic from users, developers, apps and other services.

Auth-service had the ability to connect and relate seemingly disparate user data across multiple applications and clients. Conversely, it could also isolate data for premium applications by using separate database instances.
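One way to picture that isolation: map each OAuth2 client_id to a database connection config, with shared clients resolving to a common instance and premium apps resolving to their own. A minimal sketch; all hosts, client ids and function names here are hypothetical.

```javascript
// Sketch: route each OAuth2 client_id to a database connection config.
// Shared clients resolve to one instance; premium apps get their own.
// Every name here is made up for illustration.
const dbConfigs = {
  shared:      { host: 'db-shared.internal',  name: 'users' },
  premium_app: { host: 'db-premium.internal', name: 'users' }
};

const clientToDb = {
  vasoo:       'shared',
  premium_app: 'premium_app'
};

// Unknown clients fall back to the shared instance.
function dbConfigForClient(clientId) {
  return dbConfigs[clientToDb[clientId] || 'shared'];
}
```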



Why write a custom OAuth2 service? Why not use "Ory" or some other free OAuth2 software?

Mainly, auth-service started with custom login flows written by someone else. It was incrementally transformed into an OAuth2 service, and other services using the older flows were gradually migrated to OAuth2 flows. If we had started "from scratch" we would have used a different approach.

Too complicated! You can't be trusted to write your own OAuth2 service!

Not true. The spec is not too difficult to follow.
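As one illustration of how directly RFC 6749 maps onto code, a token endpoint is essentially a dispatch on `grant_type` plus the standard error response for unsupported values. The handlers below are stubs and all names are invented; this is not auth-service's implementation.

```javascript
// Sketch: a token endpoint dispatching on RFC 6749 grant_type values.
// Handlers are stubs; the point is that the spec maps cleanly onto code.
const grantHandlers = {
  password:           body => issueTokens({ user: body.username }),
  refresh_token:      body => issueTokens({ refresh: body.refresh_token }),
  client_credentials: body => issueTokens({ client: body.client_id }),
  authorization_code: body => issueTokens({ code: body.code })
};

// Stubbed token issuance; a real service would sign or store tokens.
function issueTokens(subject) {
  return {
    token_type: 'Bearer',
    expires_in: 3600,
    access_token: 'opaque-' + Object.values(subject)[0]
  };
}

function tokenEndpoint(body) {
  const handler = grantHandlers[body.grant_type];
  if (!handler) {
    // RFC 6749 §5.2 error response for unknown grant types
    return { status: 400, body: { error: 'unsupported_grant_type' } };
  }
  return { status: 200, body: handler(body) };
}
```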


Some cool things about auth-service,

example registration curl request
curl https://sso.auth.iconicengine.com/user/register \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -H "Accept-Language: ja-JP" \
  -d '
{
  "client_id": "your-id",
  "display_name": "loongcat",
  "password": "password1234",
  "email": "loongcat@mailinator.com",
  "tos_is_agree": "true",
  "date_of_birth_ISO8601": "1970-06-06"
}'
#
# example error
# 400 {
#   "code":1204,
#   "error":"invalid_request",
#   "error_description":"表示名はすでに使用されています「loongcat」"
# }
# (the error_description says the display name "loongcat" is already in use)
#
# example success
# 200 {"token_type":"Bearer","scope":"openid",
# "expires_in":172799989,"refresh_token":"...","access_token":"..."}

What I did

I created this service and maintained routes from the legacy service that preceded it.

Auth-service was praised for its reliability, particularly around registration and verification emails, which had previously been unreliable.


4. Jyobu Hashiru

Javascript, REST, Koa, Kube

Jyobu-Hashiru ジョブ走る replaced an older service named "Job Runner". It supported two main features,

  1. It updated deployments in the Kubernetes cluster. For example, this request would launch version "1.1.1" of the auth-service (by updating auth-service's kube "Deployment" spec to use the "1.1.1" docker image)
    curl $JYOBUHOST/deploy/auth-prod.ns-auth-prod:1.1.1 \
      -H "Authorization: Bearer $TOKEN"
    
  2. It started Kube "jobs" for things like sending emails and transcoding videos
    curl $JYOBUHOST/job --insecure \
      -X POST \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -H "Accept: application/json" \
      -d '{
        "repoName": "send-email",
        "args": [
          "--to", "test-123123@mailinator.com",
          "--from", "Iconic Engine <automatic-notifications@iconicengine.com>",
          "--type", "validateEmail",
          "--token", "a-test-token",
          "--appName", "vasoo"
        ]
    }'
    
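For illustration, the deploy target in the example above ("auth-prod.ns-auth-prod:1.1.1") can be split into the fields a kube Deployment update needs. The parsing rule is inferred from that one example URL only, so treat this as a guess at the format:

```javascript
// Sketch: parse a "/deploy/<name>.<namespace>:<tag>" path segment into the
// fields a kube Deployment image update needs. Format inferred from the
// single example request above.
function parseDeployTarget(segment) {
  const m = /^([^.]+)\.([^:]+):(.+)$/.exec(segment);
  if (!m) throw new Error('bad deploy target: ' + segment);
  return { deployment: m[1], namespace: m[2], imageTag: m[3] };
}
```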

What I did

I created this service as an emergency replacement for an older, failing service that had been a constant source of outages and problems. Upon first release, Jyobu-Hashiru was stable and provided instant relief.

I communicated with other people to make Jyobu-Hashiru useful to them. When people wanted to use Jyobu-Hashiru to deploy cloud services, I helped to make everything work. When people wanted to use it for sending emails, I explained the process and provided curl requests for them to experiment with.

I managed the service and wrote the "jobs" scripts it used: send-email, s3-copy, media-metadata-probe, transcode-elemental, transcode-elastic, transcode-clearvr, check-cloudfront. I managed whitelists allowing certain tokens to request certain routes.
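The whitelist idea can be sketched as a token-to-routes table; the token ids and route patterns below are made up for illustration:

```javascript
// Sketch: each token id is allowed a fixed set of routes; a trailing "*"
// marks a prefix match. Token ids and patterns here are hypothetical.
const whitelist = {
  'token-auth-service': ['/deploy/*', '/job'],
  'token-email-cron':   ['/job']
};

function routeAllowed(tokenId, path) {
  return (whitelist[tokenId] || []).some(pattern =>
    pattern.endsWith('*')
      ? path.startsWith(pattern.slice(0, -1))
      : path === pattern);
}
```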

Jyobu-Hashiru improved the deployment and reliability of the services it supported. Client services remained "small" by off-loading complex tertiary behaviour to Jyobu-Hashiru.


5. GeoIP Service

Javascript, REST, Koa

Geoip-service returned simple geographic details for IP addresses to authorized internal requests. It used MaxMind GeoIP data.

curl $GEOIP_HOST/geoInfo?ip=47.138.130.146 -H "$AUTH_HEADER"
# 200 {
#   "global_pos":{
#     "accuracy_radius_km":10,
#     "latitude":33.9776,
#     "longitude":-117.7375,
#     "time_zone":"America/Los_Angeles"
#   },
#   "continent":{"name":"North America"},
#   "country":{"iso_code":"US","name":"United States"},
#   "registered_country":{"iso_code":"US","name":"United States"},
#   "subdivisions":{"iso_code":"CA","name":"California"},
#   "postal_code":"91709","city":{"name":"Chino Hills"}
# }
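The response above is roughly a flattened MaxMind city record. As a sketch (assuming the standard GeoIP2 city schema; the actual service's mapping code is not shown here), the reshaping might look like:

```javascript
// Sketch: shape a raw MaxMind city record (field names taken from the
// GeoIP2 city schema) into the flat response format shown above.
function toGeoInfo(rec) {
  return {
    global_pos: {
      accuracy_radius_km: rec.location.accuracy_radius,
      latitude: rec.location.latitude,
      longitude: rec.location.longitude,
      time_zone: rec.location.time_zone
    },
    continent: { name: rec.continent.names.en },
    country: { iso_code: rec.country.iso_code, name: rec.country.names.en },
    registered_country: {
      iso_code: rec.registered_country.iso_code,
      name: rec.registered_country.names.en
    },
    subdivisions: {
      iso_code: rec.subdivisions[0].iso_code,
      name: rec.subdivisions[0].names.en
    },
    postal_code: rec.postal.code,
    city: { name: rec.city.names.en }
  };
}
```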

What I did

I reverse-engineered, maintained and updated this service, resolving outages that originated from it. I added unit-tests, documentation and deployment pipelines that did not previously exist.

I updated both "geoip-service" and "mercury" in ways that dramatically reduced their size and number of dependencies. Removing the "Babel" transpiler and other dependencies reduced size and complexity everywhere (from ~2GB to ~80MB, with faster unit-tests).

Geoip-service made it easy for other services to use GeoIP data.


6. Mercury

Javascript, REST, Koa

Mercury was a health-monitoring service that tracked other services in the cluster, to identify which services were "UP" and which were "DOWN".

curl https://mercury.iconicengine.com/status/geoip-service
# {"status":"pass","lastStatusUpdate":"2022-01-11T23:53:19.082Z"}
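Mercury's pass/fail tracking can be sketched as a poll loop producing the status shape shown above. The real checks would be HTTP requests; here `check` is injected as a synchronous stub so the sketch stays self-contained, and the function name is invented.

```javascript
// Sketch: poll each service's health check and record pass/fail with a
// timestamp, roughly the shape of mercury's /status response.
// `check` is injected so the loop is testable without a network.
function pollStatuses(services, check) {
  const statuses = {};
  for (const name of services) {
    let ok = false;
    try { ok = check(name); } catch (e) { ok = false; } // a throwing check counts as DOWN
    statuses[name] = {
      status: ok ? 'pass' : 'fail',
      lastStatusUpdate: new Date().toISOString()
    };
  }
  return statuses;
}
```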

What I did

I reverse-engineered and updated this service to resolve problems inside the cluster. I added unit-tests and documentation where previously none had existed.

I updated both "geoip-service" and "mercury" to reduce their sizes.



bumblehead.com