project screenshot


Users can create and curate 360° video content in a social WebVR metaverse built on top of Lens Protocol.




Project Description

This project is the first step toward a larger idea inspired by VR therapy and VR gaming research. The app is a prototype of a VR metaverse that is social, accessible, ownable, and secure. The front end was developed entirely during the LFGrow hackathon.

Once you log in, you land on your feed, which appears as a collection of portals; you interact with a portal to consume its content. Each portal is rendered dynamically from what is queried against Lens Protocol, so profiles stay decentralized. The front end integrates with Lens Protocol in two main ways: 1. uploading your own avatar URI and 2. uploading your own content URI.
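To make the portal idea concrete, here is a minimal sketch of how queried Lens profiles could be mapped to portals placed in a ring around the viewer. The interfaces and the `buildPortals` helper are illustrative assumptions, not the app's actual code; the `picture.original.url` shape is modeled on the Lens API profile query.

```typescript
// Shape of the fields we read from a Lens Protocol profile query result
// (field names follow the Lens API `profiles` query; treat them as assumptions).
interface LensProfile {
  id: string;
  handle: string;
  picture?: { original?: { url: string } }; // avatar URI, if set
}

interface Portal {
  profileId: string;
  label: string;
  avatarUri: string | null;
  position: { x: number; y: number; z: number };
}

// Lay portals out in a ring around the viewer so each Lens profile
// becomes one interactable portal in the 3D scene.
function buildPortals(profiles: LensProfile[], radius = 5): Portal[] {
  return profiles.map((p, i) => {
    const angle = (2 * Math.PI * i) / profiles.length;
    return {
      profileId: p.id,
      label: p.handle,
      avatarUri: p.picture?.original?.url ?? null,
      position: {
        x: Math.cos(angle) * radius,
        y: 1.6, // roughly eye height in meters
        z: Math.sin(angle) * radius,
      },
    };
  });
}
```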

Finally, it is social because you can see and hear other players in real time, powered by an in-house server-side multiplayer API.

How it's Made

It is a web app written in TypeScript and built with npm, based on the Angular framework, with the native WebSocket API proxied on the local client. Wrapped inside Angular, we use A-Frame markup to declare 3D elements, which Angular then renders. The WebSocket (ws) API is built around a DOM reader that tracks the camera in the browser, sends its pose as JSON over ws to an external ws server; the server handles messages from valid clients and produces server-side rendering for a multiplayer WebVR scene.
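The browser-to-server leg of this pipeline can be sketched as a pair of pure encode/decode functions. The message shape and field names below are assumptions for illustration, not the project's actual wire format; in the real app, `encodePose` output would be sent over the WebSocket and `decodePose` would run on the server to reject malformed messages.

```typescript
// Wire format for one camera pose update (hypothetical field names).
interface PoseMessage {
  type: "pose";
  clientId: string;
  position: [number, number, number];
  rotation: [number, number, number]; // Euler angles in degrees
}

// Client side: serialize the camera pose read from the DOM into JSON.
function encodePose(clientId: string, position: [number, number, number],
                    rotation: [number, number, number]): string {
  const msg: PoseMessage = { type: "pose", clientId, position, rotation };
  return JSON.stringify(msg);
}

// Server side: accept only well-formed pose messages from clients.
function decodePose(raw: string): PoseMessage | null {
  try {
    const msg = JSON.parse(raw);
    if (msg?.type === "pose" && typeof msg.clientId === "string" &&
        Array.isArray(msg.position) && msg.position.length === 3 &&
        Array.isArray(msg.rotation) && msg.rotation.length === 3) {
      return msg as PoseMessage;
    }
  } catch { /* malformed JSON falls through to null */ }
  return null;
}
```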

Asset ownership

The 360° video and 3D model assets displayed in this project are used with permission and may be shown in the demo.

WebRTC SDK

We then imported the Agora SDK to provide a cross-platform WebRTC service. We built a canvas-based user interface in 3D to interact with the voice channel in three ways: Join, Leave, and Mute/Unmute Mic.
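Those three actions can be modeled as a small state machine behind the 3D buttons. The reducer below is illustrative glue logic only, not the Agora SDK calls themselves; in the real app each transition would additionally invoke the SDK (joining the channel, muting the local audio track, and so on).

```typescript
type VoiceState = { joined: boolean; muted: boolean };

// Start outside the voice channel with the mic unmuted.
const initialVoiceState: VoiceState = { joined: false, muted: false };

type VoiceAction = "join" | "leave" | "toggleMute";

// Pure reducer for the 3D canvas UI buttons: Join, Leave, Mute/Unmute Mic.
function voiceReducer(state: VoiceState, action: VoiceAction): VoiceState {
  switch (action) {
    case "join":
      return { ...state, joined: true };
    case "leave":
      // Leaving resets mute so the next join starts unmuted.
      return { joined: false, muted: false };
    case "toggleMute":
      // Muting only makes sense while in the channel.
      return state.joined ? { ...state, muted: !state.muted } : state;
  }
}
```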

Angular components:

Built a form component that appears if there are no Lens Protocol profiles in the signer's wallet.

After form submission, the wallet receives the minted profile and the user is routed to the scene component.
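The gate between these two components reduces to a single decision on the profile count. The helper and route names below are illustrative, not the app's actual Angular routes:

```typescript
// Where the app should send the user after connecting a wallet.
type Route = "create-profile-form" | "scene";

// Zero Lens profiles in the signer's wallet -> show the minting form;
// otherwise enter the 3D scene directly.
function routeForProfileCount(profileCount: number): Route {
  return profileCount === 0 ? "create-profile-form" : "scene";
}
```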

Upload component:

We’d like the user to sign purely with MetaMask (due to the dev workflow, we still sign with a private key for Lens Protocol). This is how we sign for avatar and content uploads. The upload to IPFS and the token URI metadata are handled via Moralis.
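As a sketch of the metadata side of this step, the helper below builds the kind of JSON blob that would be pinned to IPFS (e.g. through Moralis) and whose resulting `ipfs://` URI is then set as the content URI on Lens. The field set is an assumption modeled on common NFT token URI metadata, not the exact schema the app uses:

```typescript
// Hypothetical token URI metadata for an uploaded 360 video or avatar.
interface ContentMetadata {
  name: string;
  description: string;
  content: string; // URI of the uploaded asset, e.g. an ipfs:// link
}

// Build the JSON string to upload; the returned value is what gets
// pinned to IPFS, and the pin's URI becomes the on-chain contentURI.
function buildTokenUriMetadata(name: string, description: string,
                               contentUri: string): string {
  const meta: ContentMetadata = { name, description, content: contentUri };
  return JSON.stringify(meta);
}
```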

Scene component

Tested lighting, textures, and 3D model animations.

Tested on different versions of A-Frame. The app works on v1.0.3; the latest version as of this writing is v1.3.0.

