[{"data":1,"prerenderedAt":706},["ShallowReactive",2],{"blog-posts":3},{"data":4,"meta":702},[5,18,29,41,54,67,79,92,105,117,129,141,154,166,178,190,202,214,226,238,250,262,274,286,298,310,322,334,346,358,370,382,393,405,417,428,440,452,464,476,488,500,512,524,536,548,559,571,583,595,607,619,630,642,654,666,678,690],{"id":6,"attributes":7},69,{"title":8,"content":9,"createdAt":10,"updatedAt":11,"publishedAt":12,"date":13,"description":14,"keywords":15,"slug":16,"category":17},"The New Front Door to Nanome","\u003Cp>Two years ago, our web platform started at mara.nanome.ai. Starting this week, we launch \u003Ca href=\"http:\u002F\u002Fapp.nanome.ai\">app.nanome.ai\u003C\u002Fa>. It will contain the full collaborative experience of interacting with Nanome workspaces via web or XR and workflow orchestration via MARA and soon our new agents.\u003C\u002Fp>\n\n\u003Cp>That URL change is a small thing that points at a bigger one. For most of our history, the answer to \"how do I try Nanome?\" started with a headset. That was fine when the people asking were already XR-native. It also introduced friction for others. Today we're opening a new front door.\u003C\u002Fp>\n\n\u003Cp>A rebuilt nanome.ai, a web-based workspace embedded, shareable sessions that don't need a login, and new ways to come into the platform at whatever level fits your team. This is the biggest shift in how people experience Nanome since we shipped the first version.\u003C\u002Fp>\n\n\u003Cp>It's also why you'll see a new line across the site:\u003C\u002Fp>\n\n\u003Cp>\u003Cstrong>Where molecules, scientists, and agents meet.\u003C\u002Fstrong> \u003Cem>The collaborative workspace for drug discovery.\u003C\u002Fem>\u003C\u002Fp>\n\n\u003Cp>Three nouns and a verb. A plain description of what we actually are. 
We've spent 10 years earning the right to say it this way, and the rest of this post is the evidence.\u003C\u002Fp>\n\n\u003Ch2>app.nanome.ai: The full platform\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fwebapp_3b57a02ceb.png\" alt=\"Nanome web app full platform screenshot\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>With v2.5, the web app is no longer a preview of Nanome. It \u003Cem>is\u003C\u002Fem> Nanome. Visualization, scenes, workspaces, MARA, natural language representations, PDB\u002FSDF\u002FCIF uploads, the 2D chemical builder, trajectory playback, spotlight and follow, multi-user sessions alongside XR teammates. All of it in the browser.\u003C\u002Fp>\n\n\u003Cp>It runs cross-device too. We've been testing on iPad, smartphones, Linux, Mac, and Windows. Nanome is still best in XR, but the web experience now stands on its own.\u003C\u002Fp>\n\n\u003Ch2>Shareable workspaces, no login required\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpublic_workspace_8f8f785e3c.png\" alt=\"Nanome shareable public workspace\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>You can now share a workspace link with anyone, and they can open it in their browser with zero friction. No account, no install, no sign-up wall.\u003C\u002Fp>\n\n\u003Cp>This is how the embedded workspace on our landing page works. The molecule you're looking at on the homepage is a real, live Nanome workspace. Click in and you're navigating it.\u003C\u002Fp>\n\n\u003Cp>For lab groups, collaborators at partner institutions, or the scientist you sat next to on a flight and want to show something cool to, this lowers the activation energy to basically zero. 
Send a link, they're in.\u003C\u002Fp>\n\n\u003Ch2>A homepage that actually shows you the thing\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fwebembed_fbf70bf5c6.png\" alt=\"Nanome web workspace embedded on homepage\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>We've embedded the web workspace directly into the new homepage. We've lined up example scenes that show where Nanome fits across drug discovery: small molecules, molecular glues, antibodies, enzymes, MOFs and more. Each one opens in the full workspace with a click, and if there's a headset on your desk, you can jump straight into XR from the same page.\u003C\u002Fp>\n\n\u003Cp>There's also a new homepage animation that walks through how Nanome works and where it sits in your stack. Worth a watch if you've ever tried to explain us to someone at a conference in under 30 seconds.\u003C\u002Fp>\n\n\u003Ch2>Integrations, front and center\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fintegrations_fcce96a93f.png\" alt=\"Nanome integrations page\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>We've given integrations their own home. The page highlights a handful of our third-party partners up front, with the full catalog of hundreds of tools sitting behind it. If you've been wondering whether Nanome plays nice with the software your team already runs, odds are good the answer is yes. Take a look at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fintegrations\">nanome.ai\u002Fintegrations\u003C\u002Fa>.\u003C\u002Fp>\n\n\u003Ch2>Agents have a proper home now\u003C\u002Fh2>\n\n\u003Cp>We touched on this in an earlier post, but our scientific agents now have a dedicated page. 
They build on and plug into MARA, so you can chain tool calls, run workflows, generate reports, and keep a human in the loop the whole time. More at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fagents\">nanome.ai\u002Fagents\u003C\u002Fa>. We're actively updating this page, so stay tuned!\u003C\u002Fp>\n\n\u003Ch2>docs.nanome.ai is now help.nanome.ai\u003C\u002Fh2>\n\n\u003Cp>We've renamed our documentation site from \u003Cstrong>docs.nanome.ai\u003C\u002Fstrong> to \u003Cstrong>help.nanome.ai\u003C\u002Fstrong>.\u003C\u002Fp>\n\n\u003Cp>This is where you'll find installation guides, feature walkthroughs, how to build MARA integrations, and a contact form that goes straight to our team.\u003C\u002Fp>\n\n\u003Ch2>Setup page, now with v2 APKs\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fv2apk_924112c7c5.png\" alt=\"Nanome v2 APK setup page\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>The setup page is where you've always grabbed our XR builds and store listings. It now features the v2 APKs front and center.\u003C\u002Fp>\n\n\u003Cp>We're also giving v1 a proper name: Nanome Classic. v1 was the first real XR-first molecular visualization and design platform, and thousands of scientists, students, and teams built their first immersive chemistry experience on it. It's still there, still runs, and still holds a place on the setup page for anyone who wants to fire it up. Classic earned the name.\u003C\u002Fp>\n\n\u003Ch2>Meet you where you are\u003C\u002Fh2>\n\n\u003Cp>This is the part we're most excited about. With v2.5, we're opening up two new ways to come into Nanome alongside the existing Full license.\u003C\u002Fp>\n\n\u003Cp>\u003Cstrong>The Web license.\u003C\u002Fstrong> A seat built around the web-first experience. No headset required. 
This is the most honest entry point for anyone curious about Nanome who isn't ready to strap on an XR device yet. The web workspace is good on its own, and combined with shareable no-login sessions, it's finally possible to run a productive Nanome session entirely in the browser.\u003C\u002Fp>\n\n\u003Cp>\u003Cstrong>The Collab license.\u003C\u002Fstrong> Previously only available to commercial and enterprise customers, Collab seats are now open to academic accounts too. We were heavily inspired by how Figma handles collaborators: users can jump into a file, follow along, and contribute without driving the session themselves. Same idea here.\u003C\u002Fp>\n\n\u003Cp>Starting with v2.5, Academic pricing lands at \u003Cstrong>$8\u002Fmo\u003C\u002Fstrong> for a Web seat and \u003Cstrong>$21\u002Fmo\u003C\u002Fstrong> for a Full seat. New accounts can try the Full experience with a 14-day trial.\u003C\u002Fp>\n\n\u003Cp>\u003Cstrong>The Full license.\u003C\u002Fstrong> The complete Nanome experience, unchanged. XR chemical builder, MARA voice commands, MCP server calls, Claude Skills, the full kit.\u003C\u002Fp>\n\n\u003Cp>The full feature breakdown lives on the pricing page. Credits are purchasable on any seat type and redeemable for tool calls, agentic use, and MCP calls. More at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fintegrations\">nanome.ai\u002Fintegrations\u003C\u002Fa> and \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fagents\">nanome.ai\u002Fagents\u003C\u002Fa>.\u003C\u002Fp>\n\n\u003Ch2>Customer work, front and center\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align:center;\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fcasestudies_690f2fd87c.png\" alt=\"Nanome case studies page\" style=\"max-width:100%;height:auto;\" \u002F>\u003C\u002Fdiv>\n\n\u003Cp>We also gave the case studies, webinars, and press pages a full pass. 
The press page has our latest mentions, the webinars page has recent talks and recordings, and the case studies page reflects the work we've been doing with partners over the last year. If you've been curious what customers are actually building with Nanome, these pages are the fastest way to get a feel for it.\u003C\u002Fp>\n\n\u003Chr \u002F>\n\n\u003Cp>That's the tour. Poke around at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002F\">nanome.ai\u003C\u002Fa>, open the embedded workspace on the homepage, and send a shareable link to someone who's never heard of us. This facelift is the foundation for everything we've got in store, and there's a lot more to come.\u003C\u002Fp>\n\n\u003Cp>Where molecules, scientists, and agents meet. We'll see you in the workspace.\u003C\u002Fp>","2026-04-10T19:45:45.481Z","2026-04-10T19:51:38.249Z","2026-04-10T19:45:45.173Z","2026-04-10","Nanome launches app.nanome.ai with a rebuilt web platform, shareable workspaces, new licensing tiers, and a homepage that shows you the product in action.","Nanome, app.nanome.ai, molecular visualization, drug discovery, XR platform, web workspace, MARA, scientific agents, collaborative workspace, VR chemistry","the-new-front-door-to-nanome",null,{"id":19,"attributes":20},68,{"title":21,"content":22,"createdAt":23,"updatedAt":24,"publishedAt":25,"date":13,"description":26,"keywords":27,"slug":28,"category":17},"Nanome v2.5: Web, Trajectories, Torsion Tool, and More","\u003Cp>With v2.5, Web users and XR users can finally share the same live session. Trajectories can be scrubbed frame by frame inside the workspace. A new Torsion Angle Tool lets you rotate dihedrals with real-time strain coloring. Projects and permissions landed. Surfaces load instantly on revisit. 
And that's before we get to interaction analysis, new file formats, and 30+ fixes.\u003C\u002Fp>\n\n\u003Cp>Here's the tour.\u003C\u002Fp>\n\n\u003Ch2>Web and XR, finally in the same room\u003C\u002Fh2>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ftrimmed_1080_fb538a2433.mov\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Cp>This is the one we've been waiting on. Until v2.5, web users and Quest users were on parallel tracks. You could use either, but you couldn't be \u003Cem>in\u003C\u002Fem> a workspace together.\u003C\u002Fp>\n\n\u003Cp>In v2.5 they're the same session. Web users join the live room alongside standalone XR devices and PCVR users, see each other's position, and watch molecular changes happen in real time. A med chemist in XR can talk through a binding pocket while a comp chemist on a laptop follows along, pulls up a trajectory, and sets up the next scene. Through Spotlight mode, everyone can look at the same molecule from the same perspective instantly.\u003C\u002Fp>\n\n\u003Cp>If \u003Cem>where molecules, scientists, and agents meet\u003C\u002Fem> was going to mean anything literal, this was the feature that had to work.\u003C\u002Fp>\n\n\u003Ch2>Three new scientific features\u003C\u002Fh2>\n\n\u003Ch3>Torsion Angle Tool (beta feature)\u003C\u002Fh3>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ftorsion_d8181a1515.mp4\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Cp>Interactively rotate dihedral angles and watch the strain update in real time. Atoms shift from green to yellow to red as they are pushed out of plane: dark green marks acceptable angles with little to no strain, yellow means the angles fall outside standard ranges, and red indicates high strain. You can view strain across the whole structure or locally around the bond you're editing. 
Fast-trigger prevention catches invalid rotations before they happen. Please note: There are a few known bugs with the Torsion tool, so we're rolling this feature out in beta for now.\u003C\u002Fp>\n\n\u003Ch3>Trajectory Playback\u003C\u002Fh3>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ftrajectories_18c1a123ef.mp4\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Cp>Load a molecular dynamics trajectory and scrub through it frame by frame. Playback syncs across users, so a whole team can watch the same frame at the same moment. The current cap is 2,000 frames per trajectory, which is where we landed balancing usability against Quest memory. Streaming and indexing for larger trajectories are queued up for the next release.\u003C\u002Fp>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMD_desktop_5ea586928a.mp4\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Ch3>Interaction Analysis &amp; Annotation\u003C\u002Fh3>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Finteractions_95df7869ca.mp4\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Cp>Visualize and pin chemical interactions between molecular components with hover highlighting, drop annotations inline, and generate an interaction report when you're done. 
SDF support included.\u003C\u002Fp>\n\n\u003Cp>\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Finteractions_d315b4e95e.jpg\" alt=\"Interaction analysis screenshot\" style=\"width:100%;\" \u002F>\u003C\u002Fp>\n\n\u003Cp>\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Finteractions_renaming_0eafc0571c.jpg\" alt=\"Interaction renaming screenshot\" style=\"width:100%;\" \u002F>\u003C\u002Fp>\n\n\u003Cp>\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Finteractions2_c61aefc1bf.jpg\" alt=\"Interaction analysis detail screenshot\" style=\"width:100%;\" \u002F>\u003C\u002Fp>\n\n\u003Ch2>Focus mode\u003C\u002Fh2>\n\n\u003Cvideo src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ffocus_07c48c97ee.mp4\" controls muted playsinline style=\"width:100%;\">\u003C\u002Fvideo>\n\n\u003Cp>Click on a molecular component (e.g. chain, ligand, water) and it displays a persistent highlight in the user's color. Click again to release.\u003C\u002Fp>\n\n\u003Cp>The user color is what makes this interesting in a shared session. If Alice focuses on a binding pocket, it glows in her color, and everyone else in the room sees it immediately. Same principle as colored cursors in Google Docs: you always know who's paying attention to what, without anyone having to say it out loud.\u003C\u002Fp>\n\n\u003Ch2>Projects, permissions, and running Nanome with a team\u003C\u002Fh2>\n\n\u003Cp>\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fprojects_8bdcd23e33.png\" alt=\"Projects and permissions interface\" style=\"width:100%;\" \u002F>\u003C\u002Fp>\n\n\u003Cp>Teams kept asking for this. v2.5 ships a full project and permission system. Workspaces live inside projects, projects have owners, and every workspace carries owner\u002Feditor\u002Fviewer roles. Public workspaces let you share a session anonymously (this is the feature behind the embedded workspace on our homepage). 
XR session concurrency is capped at 2 per account.\u003C\u002Fp>\n\n\u003Cp>It's the layer that makes running Nanome across a lab group, a department, or a whole company workable without writing your own access control on top.\u003C\u002Fp>\n\n\u003Cp>\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fworkspaces_ec92a3b3a4.png\" alt=\"Workspaces interface\" style=\"width:100%;\" \u002F>\u003C\u002Fp>\n\n\u003Ch2>Surfaces that load instantly\u003C\u002Fh2>\n\n\u003Cp>Persistent surface caching is a performance win we're proud of in this release. Molecular surfaces used to recompute from scratch every time you opened a workspace. Now they're cached, including in WebGL.\u003C\u002Fp>\n\n\u003Ch2>New file formats, better parsing\u003C\u002Fh2>\n\n\u003Cp>We expanded what Nanome can ingest:\u003C\u002Fp>\n\n\u003Cul>\n  \u003Cli>\u003Cstrong>MAE, PSE, MOE\u003C\u002Fstrong> can now be loaded as new entries into existing workspaces (previously only as brand-new workspaces via web). MOE files now parse per-atom, per-residue, and per-chain representations to recreate the original visualization.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>PQR\u003C\u002Fstrong> is a whitespace-delimited PDB variant used by APBS for electrostatic potential calculations, storing charge and radius per atom.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>PDB\u003C\u002Fstrong> parsing improved for shorter and more flexible line formats.\u003C\u002Fli>\n\u003C\u002Ful>\n\n\u003Cp>We also shipped full-surface electrostatic coloring with custom ESP color ramps that sync across users in real time.\u003C\u002Fp>\n\n\u003Ch2>The rest of the list\u003C\u002Fh2>\n\n\u003Cp>A few more things worth calling out:\u003C\u002Fp>\n\n\u003Cul>\n  \u003Cli>\u003Cstrong>Scene management.\u003C\u002Fstrong> Save and restore point-of-view per scene. 
Apply component transforms and visibility changes consistently across multiple scenes with \"Apply Across Scenes.\" Recenter view with a confirmation overlay.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Web workspace UI overhaul.\u003C\u002Fstrong> New panel layout with Scenes, Entries, MARA, and Import panels.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Workspace chat and pinned chats.\u003C\u002Fstrong> Chat with other users directly inside a workspace. Pin the messages that matter.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Follow a user from the web.\u003C\u002Fstrong> Follow another user's viewpoint in real time from the browser, with viewer avatars visible in the workspace.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Electron desktop app.\u003C\u002Fstrong> Optional native wrapper for the web experience.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>CSV export from MARA tables.\u003C\u002Fstrong> Download data tables from MARA responses as CSV.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Customizable pocket size\u003C\u002Fstrong> for binding site analysis.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Handedness support\u003C\u002Fstrong> for XR controllers, with handedness-aware grab mechanics and rotation.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>Passthrough camera support\u003C\u002Fstrong> on PC VR devices.\u003C\u002Fli>\n  \u003Cli>\u003Cstrong>30+ bug fixes\u003C\u002Fstrong> across rendering, controllers, workspace management, tools, and networking.\u003C\u002Fli>\n\u003C\u002Ful>\n\n\u003Cp>Full changelog at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fsetup\">nanome.ai\u002Fsetup\u003C\u002Fa>\u003C\u002Fp>\n\n\u003Ch2>How to get it\u003C\u002Fh2>\n\n\u003Cp>If you're on a headset, the v2 APKs are on the \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fsetup\">setup page\u003C\u002Fa>. If you're on web, head to \u003Ca href=\"https:\u002F\u002Fapp.nanome.ai\">app.nanome.ai\u003C\u002Fa> and sign in. 
If you don't have an account and just want to poke around, the homepage has an embedded workspace you can open without logging in.\u003C\u002Fp>\n\n\u003Cp>We also rolled out a new front door for the whole Nanome experience alongside this release: a rebuilt website, shareable no-login workspaces, and new license tiers for academic and web-first users. That story lives in a \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fblog\u002F\">companion post\u003C\u002Fa>.\u003C\u002Fp>\n\n\u003Cp>See you in the workspace.\u003C\u002Fp>","2026-04-10T19:27:46.199Z","2026-04-10T23:41:05.470Z","2026-04-10T19:27:46.140Z","Nanome v2.5 brings Web+XR shared sessions, trajectory playback, a Torsion Angle Tool, projects & permissions, surface caching, new file formats, and 30+ fixes.","Nanome v2.5, molecular visualization, XR collaboration, trajectory playback, torsion angle tool, web VR, molecular dynamics, interaction analysis, binding site analysis, drug discovery software","nanome-v2.5:-web-trajectories-torsion-tool-and-more",{"id":30,"attributes":31},66,{"title":32,"content":33,"createdAt":34,"updatedAt":35,"publishedAt":36,"date":37,"description":38,"keywords":39,"slug":40,"category":17},"Announcing: Nanome Claude Code Skill & MCP Server, and What's Next for MARA","\u003C!-- BLOG BODY HTML v4 — placement per Keita's instructions -->\n\n\u003Cp>Chances are you've already used or heard of Claude Code or Cowork. Maybe you've spun up your own \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fopenclaw\u002Fopenclaw\">OpenClaw\u003C\u002Fa> instance. Maybe you saw \u003Ca href=\"https:\u002F\u002Fwww.nextplatform.com\u002Fai\u002F2026\u002F03\u002F17\u002Fnvidia-says-openclaw-is-to-agentic-ai-what-gpt-was-to-chattybots\u002F5209428\">Jensen Huang call it the next Linux\u003C\u002Fa>. 
Maybe you watched \u003Ca href=\"https:\u002F\u002Finsilico.com\u002Fnews\u002Fspjz8fzmb1-insilico-medicine-launches-pandaclaw-emp\">Insilico drop PandaClaw\u003C\u002Fa>, or noticed \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FFreedomIntelligence\u002FOpenClaw-Medical-Skills\">869 scientific agent skills\u003C\u002Fa> show up on GitHub overnight, or \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Fwu-yc\u002FLabClaw\">240 more from Stanford\u002FPrinceton\u003C\u002Fa>.\u003C\u002Fp>\n\n\u003Cp>Drug discovery and scientific research are going agentic. Fast.\u003C\u002Fp>\n\n\u003Cp>At Nanome, we've been building in this space for a while. \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">MARA\u003C\u002Fa> has been running agentic workflows for research teams since 2024: \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fblog\u002Fnanome-ai-february-2025-update!\u002F\">plan mode, user approval gates\u003C\u002Fa>, tool orchestration, even an API so scientists could trigger and monitor workflows directly from Slack or Teams. 
A lot of the patterns that Claude Code and OpenClaw just shipped have been in production in our system for two years.\u003C\u002Fp>\n\n\u003C!-- ===== IMAGES 1 + 5 + 4: Side by side — 2021 browser comparison with year subtitles ===== -->\n\u003Cdiv style=\"display: flex; align-items: center; justify-content: center; gap: 1rem; margin: 2rem 0; flex-wrap: wrap;\">\n  \u003Cdiv style=\"text-align: center; flex: 1; min-width: 200px;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image1_c7fd6d4704.png\" alt=\"Nanome VR browser in 2021\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n    \u003Cp style=\"margin-top: 0.5rem;\">(2021)\u003C\u002Fp>\n  \u003C\u002Fdiv>\n  \u003Cdiv style=\"text-align: center; flex-shrink: 0;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image5_919e3aa336.png\" alt=\"Arrow transition\" style=\"max-width: 100px; border-radius: 8px;\" \u002F>\n  \u003C\u002Fdiv>\n  \u003Cdiv style=\"text-align: center; flex: 1; min-width: 200px;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image4_cb08e845d0.gif\" alt=\"Nanome in 2024\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n    \u003Cp style=\"margin-top: 0.5rem;\">(2024)\u003C\u002Fp>\n  \u003C\u002Fdiv>\n\u003C\u002Fdiv>\n\n\u003Cp>This isn't the first time we had such a clear vision for what the future of science ought to look like. Back in 2021, VR operating systems didn't have web browsers, so we \u003Ca href=\"https:\u002F\u002Fblog.matryx.ai\u002Fnanome-1-22-the-web-browser-update-346b1c231097\">built one inside Nanome\u003C\u002Fa>. Then the platforms matured and we let them handle it. We see the same arc here. We built MARA's orchestration layer because the models weren't ready. 
Now they are (frankly, faster than we expected), and we're \u003Cem>glad\u003C\u002Fem>! Scientists getting better agentic tools faster is a win for science.\u003C\u002Fp>\n\n\u003Cp>So, together with Claude Code, Nanome and MARA are evolving. Ship the integrations. Open the APIs. Let every agentic system in the world pipe into Nanome while we focus on the parts that actually require us.\u003C\u002Fp>\n\n\u003Ch2>Announcing: Nanome Claude Code Skill &amp; MCP Server\u003C\u002Fh2>\n\n\u003Cp>Our team built a Claude Code Skill and an MCP (Model Context Protocol) server that let Claude directly create and manage Nanome workspaces. Load structures, set up scenes, configure representations, propagate content across views. All through Claude, all hitting our workspace API.\u003C\u002Fp>\n\n\u003C!-- ===== IMAGE 3: MCP chat conversation interface ===== -->\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image3_dd09719a60.png\" alt=\"Nanome MCP server workspace creation conversation showing scene setup and structure loading\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\u003Cp>What this means practically: you're in Claude Code working on a docking analysis. Instead of switching to Nanome, opening a workspace, manually loading your structures, and setting up your visualization, you just tell Claude to do it. Claude calls the Nanome MCP server, builds the workspace, and it's ready for you in XR or on the web.\u003C\u002Fp>\n\n\u003Cp>The Skill handles workspace creation through Claude Code's local environment. The MCP server goes deeper: full workspace API access, scene editing, content propagation. 
Together they turn Nanome into a first-class destination for any agentic scientific workflow, regardless of which orchestrator you prefer.\u003C\u002Fp>\n\n\u003C!-- ===== IMAGES 2 + 6: Side by side — terminal + workspace view ===== -->\n\u003Cdiv style=\"display: flex; align-items: flex-start; justify-content: center; gap: 1rem; margin: 2rem 0; flex-wrap: wrap;\">\n  \u003Cdiv style=\"flex: 1; min-width: 280px;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image2_bc43537d55.png\" alt=\"Claude Code Skill terminal showing ABL Kinase workspace plan\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n  \u003C\u002Fdiv>\n  \u003Cdiv style=\"flex: 1; min-width: 280px;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image6_f1e78286af.png\" alt=\"ABL Kinase Imatinib Binding Analysis workspace in Nanome\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n  \u003C\u002Fdiv>\n\u003C\u002Fdiv>\n\n\u003Cp>\u003Cstrong>The MCP server is currently internal-use only.\u003C\u002Fstrong> We'll be rolling it out in the next update, v2.5, coming in April 2026.\u003C\u002Fp>\n\n\u003C!-- ===== IMAGE 7: Agent status dashboard — right below heading ===== -->\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image7_ab15f76bec.png\" alt=\"Agent status dashboard showing Docking, Electrostatics, R-Group SAR, MPNN, and Antibody agents\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\u003Cp>\u003Cstrong>12 Full Scientific Agents with Autonomous Tool Use (and Counting)\u003C\u002Fstrong>\u003C\u002Fp>\n\n\u003C!-- ===== IMAGE 8: Docking pipeline — inline\u002Ffloated with subtitle ===== -->\n\u003Cdiv style=\"float: left; margin: 0 1.5rem 1rem 0; max-width: 45%;\">\n  
\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image8_05e4473fdc.png\" alt=\"Automated docking pipeline UI showing Boltz2, Smina, molecule and protein folder selection\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n  \u003Cp style=\"text-align: center; font-style: italic; margin-top: 0.5rem; font-size: 0.9em;\">(automated docking between a molecule and a protein folder)\u003C\u002Fp>\n\u003C\u002Fdiv>\n\n\u003Cp>Meanwhile, we've built roughly a dozen specialized agents covering docking, antibody\u002Fenzyme design, electrostatics, R-group analysis, de novo binders, MD, and more. These agents run over 50 tools independently, chain together, or operate on cron jobs for continuous monitoring. Think: \"dock every new molecule that lands in this folder against these targets, and provide a detailed analysis of the results when complete, automatically.\" Our agents work seamlessly with MARA and Nanome to create workspaces where real-world data is used to color and highlight atoms.\u003C\u002Fp>\n\n\u003Cdiv style=\"clear: both;\">\u003C\u002Fdiv>\n\n\u003Cp>This is the same pattern you're seeing from Insilico with \u003Ca href=\"https:\u002F\u002Finsilico.com\u002Fnews\u002Fspjz8fzmb1-insilico-medicine-launches-pandaclaw-emp\">PandaClaw\u003C\u002Fa> and from the \u003Ca href=\"https:\u002F\u002Fgithub.com\u002FFreedomIntelligence\u002FOpenClaw-Medical-Skills\">OpenClaw medical-skills community\u003C\u002Fa>. The difference is that ours pipe directly into Nanome's visualization &amp; collaboration layer. The output isn't a text report in a chatbox. 
It's a 3D workspace you can walk into in XR or collaborate on from the web.\u003C\u002Fp>\n\n\u003Ch2>The Stack We're Betting On\u003C\u002Fh2>\n\n\u003Cp>Here's our thesis: \u003Cstrong>skills and agents on the outside, spatial computing and enterprise security on the inside.\u003C\u002Fstrong>\u003C\u002Fp>\n\n\u003Cp>Claude Code, OpenClaw, Cowork, whatever your team prefers for general-purpose orchestration, data retrieval, and computational workflows. Those are good and getting better fast.\u003C\u002Fp>\n\n\u003Cp>Nanome handles the parts those systems can't touch: 3D molecular visualization, XR collaboration, the human-science interface where a researcher actually looks at a binding pocket and makes a judgment call. And critically, the enterprise security controls that regulated research environments require before any AI tool touches their data.\u003C\u002Fp>\n\n\u003Cp>MARA sits in the middle. It coexists with all of it. You can use Claude Code to call Nanome's APIs through our Skill and MCP server. You can use OpenClaw's cron jobs to run our agents against the workspace API. You can use Cowork to manage tasks that end up in Nanome workspaces. And you can use MARA directly for the things it was purpose-built for: voice commands in XR, the representation agent that is native to Nanome, and the secure orchestration harness that wraps the whole thing in enterprise-grade controls.\u003C\u002Fp>\n\n\u003Cp>MARA isn't competing with Claude Code. 
MARA is the layer that makes Claude Code (or OpenClaw, or Cowork, or whatever comes next) actually work for research teams with security requirements.\u003C\u002Fp>\n\n\u003Ch2>A Note on Security\u003C\u002Fh2>\n\n\u003C!-- ===== IMAGE 9: Right under A Note on Security heading ===== -->\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_claude_code_skill_mcp_server_mara_image9_ab60e4dcaa.png\" alt=\"Enterprise security architecture for MARA\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\u003Cp>Some large enterprises are already rolling out Claude Code widely, and that's great. \u003Ca href=\"https:\u002F\u002Fwww.anthropic.com\u002Fnews\u002Fhealthcare-life-sciences\">Anthropic is investing heavily in healthcare and life sciences\u003C\u002Fa>, and the security posture of these tools improves with every release.\u003C\u002Fp>\n\n\u003Cp>But not every organization is there yet. We talk to top pharma IT folks regularly, and plenty of teams are still working through the security calculus. The concerns aren't hypothetical: \u003Ca href=\"https:\u002F\u002Fblogs.cisco.com\u002Fai\u002Fpersonal-ai-agents-like-openclaw-are-a-security-nightmare\">Cisco\u003C\u002Fa>, \u003Ca href=\"https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fsecurity\u002Fblog\u002F2026\u002F02\u002F19\u002Frunning-openclaw-safely-identity-isolation-runtime-risk\u002F\">Microsoft\u003C\u002Fa>, and \u003Ca href=\"https:\u002F\u002Fwww.bitsight.com\u002Fblog\u002Fopenclaw-ai-security-risks-exposed-instances\">Bitsight\u003C\u002Fa> have all flagged real risks with agentic systems like OpenClaw. \u003Ca href=\"https:\u002F\u002Fthehackernews.com\u002F2026\u002F02\u002Fclaude-code-flaws-allow-remote-code.html\">CVEs have been filed against Claude Code itself\u003C\u002Fa>. 
\u003Ca href=\"https:\u002F\u002Fwww.bitsight.com\u002Fblog\u002Fopenclaw-ai-security-risks-exposed-instances\">341 malicious skills landed on ClawHub\u003C\u002Fa>. \u003Ca href=\"https:\u002F\u002Fblog.smu.edu\u002Fitconnect\u002F2026\u002F03\u002F04\u002Fopenclaw-security-risks-institutional-position\u002F\">Universities have banned OpenClaw from institutional devices\u003C\u002Fa>. These tools are getting better fast, but the risks are real today.\u003C\u002Fp>\n\n\u003Cp>For those teams not yet ready to widely deploy Claude Code within a biopharma enterprise setting, MARA was built from the ground up with this in mind: on-prem deployment behind your firewall (no data leaves your environment), BYO-LLM (use whatever model your security team approved), SSO authentication, sandboxed tool execution, 170+ vetted tools with no unregulated marketplace, and human-in-the-loop plan mode baked into the architecture since 2024.\u003C\u002Fp>\n\n\u003Cp>If your organization is comfortable with Claude Code or OpenClaw out of the box, awesome. We want you using our Skill and MCP server to pipe those workflows into Nanome. If your organization needs tighter controls, MARA is there. Either way, your scientists can view, design, and collaborate with their team and AI agents through our interface layer, Nanome, and science gets done.\u003C\u002Fp>\n\n\u003Ch2>What's Next\u003C\u002Fh2>\n\n\u003Cp>We'll be releasing the Nanome Claude Code Skill to the public. The MCP server follows on an as-requested basis, then more broadly.\u003C\u002Fp>\n\n\u003Cp>If you're a research team exploring agentic workflows and need enterprise-grade security around them, \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fdemo\">reach out\u003C\u002Fa>. 
We can get you connected to Nanome's workspace API in an afternoon.\u003C\u002Fp>\n\n\u003Cp>If you're a researcher who wants to try the scientific agent workflows, early access is coming in the next few weeks.\u003C\u002Fp>","2026-03-27T17:52:58.357Z","2026-03-27T18:40:43.811Z","2026-03-27T17:52:58.312Z","2026-03-27","Nanome announces a Claude Code Skill & MCP Server for agentic scientific workflows, plus updates to MARA's enterprise-grade AI orchestration for drug discovery.","Nanome, Claude Code, MCP Server, MARA, agentic AI, drug discovery, molecular visualization, XR collaboration, scientific agents, biopharma, OpenClaw, Model Context Protocol","announcing:-nanome-claude-code-skill-and-mcp-server-and-what's-next-for-mara",{"id":42,"attributes":43},60,{"title":44,"content":45,"createdAt":46,"updatedAt":47,"publishedAt":48,"date":49,"description":50,"keywords":51,"slug":52,"category":53},"Nanome Joins the OpenFold Consortium","\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_joins_the_openfold_consortium_image5_70b3589609.png\" alt=\"KRAS G12C experimental crystal structure comparison in Nanome\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\u003Cp>We're excited to share that Nanome has joined the \u003Ca href=\"https:\u002F\u002Fopenfold.io\u002F\">OpenFold Consortium\u003C\u002Fa>, a non-profit AI research consortium building open-source tools for biology and drug discovery.\u003C\u002Fp>\n\n\u003Cp>OpenFold is home to some of the most impactful open-source work in structural biology right now. Their flagship, OpenFold3, is a fully open biomolecular structure prediction model developed by Mohammed AlQuraishi's lab at Columbia University and a growing community of industry and academic contributors. It's licensed under Apache 2.0, which means researchers and companies can test, retrain, fine-tune, and build on top of it freely. 
Other available co-folding methods, by comparison, restrict commercial use or withhold training code, training data, or even inference code.\u003C\u002Fp>\n\n\u003Cp>The consortium \u003Ca href=\"https:\u002F\u002Fwww.businesswire.com\u002Fnews\u002Fhome\u002F20260313170622\u002Fen\u002FOpenFold-Consortium-Announces-Major-OpenFold3-Update-and-Public-Release-of-Training-Data-for-Reproducible-Biomolecular-AI\">just released a major OpenFold3 update\u003C\u002Fa> alongside publicly available training datasets, making the entire training and inference stack reproducible and auditable by anyone. That kind of radical openness is exactly the ethos we believe in at Nanome.\u003C\u002Fp>\n\n\u003Ch2>Why This Matters to Us\u003C\u002Fh2>\n\n\u003Cp>At Nanome, we build the interface layer for scientific AI. Our platform spans immersive 3D molecular visualization in XR and our AI co-pilot MARA, which sits right at the point where computational predictions meet human scientific judgment. Structure prediction models like OpenFold3 generate the molecular geometries that researchers then need to see, interpret, manipulate, and act on.\u003C\u002Fp>\n\n\u003Cblockquote>\n  \u003Cp>\"The gap between generating a predicted structure and actually using it in a drug discovery workflow is still massive,\" said Steve McCloskey, CEO and Co-Founder of Nanome. \"OpenFold is doing the hard work of making structure prediction open and reproducible. We want to make sure the output of that work doesn't just live in a terminal somewhere. 
It needs to be in scientists' hands, in 3D, where they can make real decisions with it.\"\u003C\u002Fp>\n\u003C\u002Fblockquote>\n\n\u003Cp>Joining OpenFold puts us at the table with organizations pushing the boundaries of what's computationally possible in drug discovery, while we focus on making those outputs \u003Cem>usable\u003C\u002Fem> — in context, in 3D, and alongside your team.\u003C\u002Fp>\n\n\u003Ch2>What It Looks Like in Practice\u003C\u002Fh2>\n\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia1.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMHpqaWY2b3dqbmV2eHR1aTNrd2txZ3IxbzhpYmhubmNibm1zcHFzbiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FiDhmcOQuxkB0c0RAVk\u002Fgiphy.gif\" alt=\"Upcoming web-accessible molecular visualization view in Nanome\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\n\u003Cp>This is KRAS G12C — probably the most famous \"undruggable\" target in oncology — predicted by OpenFold3 and loaded into Nanome right alongside its experimental crystal structure. For context: KRAS burned through two decades of failed drug programs before Sotorasib finally cracked it in 2021. Here, a sequence goes in, a predicted geometry comes out, and within minutes your team is standing inside the structure together, comparing prediction against experiment, poking around the binding pocket, making actual decisions. 
No crystallography lab, no months of sample prep — just structure prediction to spatial collaboration in one sitting.\u003C\u002Fp>\n\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_joins_the_openfold_consortium_image4_32231fe876.png\" alt=\"KRAS G12C structure predicted by OpenFold3 loaded in Nanome\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExanNtcmR5Z3lhemJ2aHI0ZHMxeWFjbG53YTg1Y3Q3dHU1YXNzZDNseCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FCnAjZjXVlKmIhPGNdp\u002Fgiphy.gif\" alt=\"Nanome molecular visualization in action\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\n\u003Cp>With our next release, this same view will be accessible directly from the web:\u003C\u002Fp>\n\n\u003Ch2>Who's in the Consortium\u003C\u002Fh2>\n\u003Cdiv style=\"text-align: center; margin: 2rem 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_joins_the_openfold_consortium_image3_869dc09977.png\" alt=\"Nanome joins the OpenFold Consortium announcement\" style=\"max-width: 100%; border-radius: 8px;\" \u002F>\n\u003C\u002Fdiv>\n\u003Cp>We're \u003Ca href=\"https:\u002F\u002Fopenfold.io\u002F#membersarea\">joining alongside global pharmaceutical leaders\u003C\u002Fa> like Johnson &amp; Johnson, Novo Nordisk, Roche, and Bristol Myers Squibb, as well as cutting-edge biotechs and tech companies like Cyrus Biotechnology, Structure Therapeutics, SandboxAQ, Lambda, Dassault Systèmes, and Outpace Bio. 
The full roster spans 30+ organizations across pharma, biotech, synthetic biology, GPU cloud, and scientific software, all contributing to the shared goal of making foundational AI for biology accessible to everyone.\u003C\u002Fp>\n\n\u003Ch2>What's Next\u003C\u002Fh2>\n\n\u003Cp>We're looking forward to contributing to the consortium's mission and exploring how Nanome's visualization and AI orchestration capabilities can complement the structural prediction tools coming out of OpenFold. As these models get more powerful, the need for intuitive, spatial interfaces to make sense of their outputs only grows.\u003C\u002Fp>\n\n\u003Cp>More to come. If you're interested in how Nanome fits into the evolving AI drug discovery stack, \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fdemo\u002F\">reach out to us\u003C\u002Fa> or follow along on our blog.\u003C\u002Fp>\n\n\u003Chr>\n\n\u003Cp>\u003Cem>Nanome is the interface layer for scientific AI, combining immersive molecular visualization with MARA, our AI-powered scientific co-pilot. We serve enterprise pharmaceutical clients and research teams worldwide. 
Learn more at \u003Ca href=\"https:\u002F\u002Fnanome.ai\u002F\">nanome.ai\u003C\u002Fa>.\u003C\u002Fem>\u003C\u002Fp>","2026-03-24T01:00:38.550Z","2026-03-27T17:43:47.813Z","2026-03-24T17:23:42.823Z","2026-03-24","Nanome has joined the OpenFold Consortium, bringing immersive 3D molecular visualization and AI co-pilot MARA to open-source structure prediction tools like OpenFold3.","Nanome, OpenFold Consortium, OpenFold3, molecular visualization, drug discovery, structure prediction, AI drug discovery, MARA, XR visualization, biomolecular AI, open-source biology","nanome-joins-the-openfold-consortium","partnerships",{"id":55,"attributes":56},55,{"title":57,"content":58,"createdAt":59,"updatedAt":60,"publishedAt":61,"date":62,"description":63,"keywords":64,"slug":65,"category":66},"Nanome v2.4.0: Early Access Release, Mara Voice Commands, Minimization, Chem Interactions, and more! ","---\n\nWe're excited to announce **Nanome v2.4.0**, a milestone release that marks our transition to open access, brings back a beloved feature, and introduces powerful new capabilities for molecular analysis.\n\n---\n\n## Now Available: Early Access Release on Meta Quest Store\n\n**This is a big one.** With v2.4, we're transitioning Nanome V2 from closed beta to **early access  release** [on the Meta Quest Store](https:\u002F\u002Fwww.meta.com\u002Fexperiences\u002Fnanome-v2\u002F25124020873911281\u002F?srsltid=AfmBOopbbe4PZ09zhyoXa2w5COVTT9nnODJGH93DVkx0WT6nUAhClWpV).\n\nUntil now, accessing V2 required an invitation, a special URL, or being added to our developer channel via request. That friction is gone. Anyone can now download Nanome V2 directly from the Quest Store and start exploring.\n\nThis milestone reflects our confidence that V2 is approaching the quality and feature completeness needed for a full public launch. 
We've spent the past year refining the core experience based on feedback from our beta users, and we're ready to open the doors wider.\n\nTo support new users discovering V2 for the first time, we've added the **controller tutorial banner** to the login experience—showing basic controls and linking to our YouTube tutorial library. Whether you're a longtime Nanome user transitioning from V1 or completely new to the platform, you'll have the resources to get up to speed quickly. The tutorial panel stays accessible from Settings whenever you need a refresher.\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Flobby_with_controller_instructions_banner_4d6e7c1eae.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n\n**Access options:**\n- **Paid Nanome license holders:** During the early access period, V2 is included. Just log in with your credentials\n- **New users:** Create an account to start your 14-day trial with full V2 access\n- **Meta Quest Store:** Download directly and explore\n\n---\n\n## Voice Commands Are Back—And Better Than Ever\n\nLong-time Nanome users will remember our original voice commands, which were among our most popular features before Microsoft deprecated Cortana in late 2023. After more than two years, **voice commands have returned**—and they've been completely reimagined.\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmara_listening_2d2587ca05.png\"\n       style=\"width:60%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nPowered by a MARA integration, voice commands in v2.4.0 let you control your workspace using natural language. 
Unlike the original Alexa-style commands, which required specific phrases, voice control now lets you speak naturally to change representations and adjust rendering styles.\n\n**What's new:**\n- Works on standalone headsets like Meta Quest for the first time—something that wasn't possible with Cortana\n- Natural language processing means more flexible, conversational interactions\n- Currently supports representation changes (colors, rendering styles, visibility)\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmara_b_factor_00071f11e4.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n**Pro tip:** When working with multiple entries, be specific about which structure you're referencing. For example, \"Change chain A of 1ABC to blue ribbon\" will get better results than \"make this blue\" since voice commands don't yet incorporate gaze or gesture data.\n\n*Coming soon: Selection state support and expanded command capabilities in future updates.*\n\n---\n\n## Multi-Frame Playback: Animate Your Molecules\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia3.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMXpkNmQxeXFxYWd4cjA0eHhqZ3hwbTRtbDR6Z2l0bzQzNGJnbjdjMCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FSDso2xuzxGEbdlxCNf\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n\nThe new **Automated Playback Controls** bring molecular dynamics and conformational analysis to life in XR. 
Load multi-model files and watch your structures animate with full control over playback speed.\n\n**Use cases include:**\n- **Docking pose comparison:** Play through multiple poses to see how ligands explore binding site space\n- **Chemical interaction analysis:** Watch interactions update in real-time as frames advance\n- **Protein animations:** Import morph trajectories or molecular movies to visualize conformational changes\n- **Complex assembly:** Show multi-model complexes coming together in 3D space\n\nThis feature transforms how teams review docking results and communicate structural dynamics during collaborative sessions.\n\n---\n\n## Ligand Minimization with Force Field Options\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002FkWhs0x8YOrj8Tb3qtv.webp\"\n       style=\"width:50%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nThe Builder tool now includes **ligand minimization** with three force field options:\n- **MMFF94** (Merck Molecular Force Field)\n- **MMFF94s** (static variant)\n- **UFF** (Universal Force Field)\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fminimization_options_f8c2b5cf4d.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nAfter editing a ligand, run minimization to optimize geometry and enjoy a satisfying audio confirmation when the calculation finishes. This brings professional-grade ligand refinement directly into your immersive workflow.\n\n*Note: Pocket minimization (minimizing within the context of a protein binding site) is coming in a future update. For now, you can use MARA tools for protein-referenced minimization.*\n\n---\n\n## Custom Chemical Interactions\n\nChemical interactions have been completely rebuilt from the ground up with **custom, in-house calculations**. 
Unlike V1's plugin-based approach using open-source libraries, V2.4's interactions are native to the platform, delivering faster performance and precise control.\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fchemical_interactions_22eed05d22.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n**Supported interaction types:**\n- Hydrogen Bonds\n- VDW Clashes\n- π-Stacking\n- Salt Bridges\n- Cation-π\n- Halogen Bonds\n- Hydrophobic Interactions\n- Metal Coordination\n\n**Why this matters:** With Position Molecules, you can manually adjust ligand placement and watch chemical interactions update in real-time. Move a ligand half an angstrom and immediately see how your interactions change. It's perfect for intuitive lead optimization and understanding binding site complementarity.\n\nThe clashes visualization is particularly valuable for identifying unwanted steric interactions between your ligand and protein residues, helping teams make better decisions about candidate selection.\n\n---\n\n## Beautiful Molecular Surfaces, Computed Server-Side\n\nThe much-loved high-quality surface rendering from V1 makes its V2 debut. 
Molecular surfaces are now computed server-side, meaning:\n- No lag or freeze-up during generation\n- Faster rendering on standalone devices\n- Beautiful, publication-quality surfaces that help you understand pocket shape and volume\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fsurfaces_e9b417a89b.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nSee your molecules with better clarity than ever before and understand exactly how much space is available in that binding pocket.\n\n---\n\n## Measurement Tool Enhancements\n\nBuilding on V2.2's measurement foundations, this release adds:\n\n- **Dihedral angle measurements** for analyzing torsional relationships\n- **Live measurement preview** that updates in real-time as you position your selection\n- **Drag-and-drop measurement creation** for faster workflow\n- Improved angle measurement deletion and live preview\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmeasurements_list_0c958a51b7.png\"\n       style=\"width:60%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nSee distances change in your hand as you move through space. 
There's something uniquely intuitive about watching measurements update live in 3D.\n\n---\n\n## Redesigned Menus and New Login Experience\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F2_4_menus_6ec9db6b98.png\"\n       style=\"width:80%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n\nThe application menu has been reorganized for better scalability:\n- **Scene menu** now lives on the left\n- **Tools menu** on the right\n- **Tab navigation** across the top\n\nThe login experience now includes an onboarding banner with basic controls and a link to our YouTube tutorial library. It's helpful for getting new team members up to speed quickly. The tutorial panel can be toggled on\u002Foff anytime from Settings.\n\n---\n\n## New Selection Menu & Tool-Specific UIs\n\nEvery tool now has its own dedicated menu showing relevant information:\n- The **Selection tool** displays selected residues, atoms, and selection counts at a glance\n- No more navigating through nested menus to find selection details\n- Everything you need is right there when you need it\n\n---\n\n## Bulk Display & Ligand Tagging\n\nManaging large ligand sets just got much easier:\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fligand_tagging_d1fd54cf8b.png\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n**Bulk Display:** Show multiple ligands from an SDF file simultaneously without splitting. Compare pose 5 and pose 9 side-by-side, examine their chemical interactions, and make decisions faster.\n\n**Ligand Tagging:** Tag ligands during review sessions with preset or custom labels. 
Later, use MARA to pull all \"hit\" or \"candidate\" tagged molecules from a workspace directly into a chat for further analysis.\n\n---\n\n## Quality of Life Improvements\n\n- **Sound effects** for UI interactions and events (minimization complete, stop following user, etc.) are configurable in Settings\n- **Scene reordering** lets you arrange your molecular PowerPoint exactly how you want it\n- **Hidden controller models in passthrough mode** so you can see exactly which finger is on which button during onboarding\n- **Improved annotation labels** with no black backgrounds and labels that don't obscure measurement lines\n- **Selection persistence across scenes** so you don't lose your 30-residue selection when switching views\n- **Improved parsers** for PSE, mmCIF, and PDB files with better data support and accuracy\n- **Ion component** added to the component menu\n- **Avatar color indicators** on hover in the Scene List\n- **Sequence menu optimizations** for smoother navigation\n\n---\n\n## What's Next\n\nV2.4 represents a major step forward in making molecular visualization and analysis more intuitive, powerful, and collaborative. We're continuing to expand MARA integration, add more tools, and refine the immersive experience based on your feedback.\n\n**Look out for the full Nanome V2 launch coming later in 2026.** We're getting close, and we can't wait to share what's next.\n\nHaven't tried V2 yet? Nothing beats experiencing it yourself.\n\n🚀 **Get started today:** [https:\u002F\u002Fnanome.ai\u002Fdemo\u002F](https:\u002F\u002Fnanome.ai\u002Fdemo\u002F)\n\n---\n\n*Written by the Nanome Team*\n","2026-01-28T20:29:49.764Z","2026-03-24T17:53:29.518Z","2026-01-28T20:30:48.665Z","2026-01-28","Nanome v2.4.0 is now available as an open access early release on Meta Quest. 
This milestone update brings voice-controlled molecular visualization, multi-frame playback for docking pose analysis, ligand minimization with MMFF94\u002FUFF force fields, and native chemical interaction calculations—all in immersive XR. Designed for computational chemists and drug discovery teams, v2.4 accelerates structure-based drug design workflows with real-time collaboration and intuitive 3D molecular manipulation.","molecular visualization software, VR drug discovery, structure-based drug design, CADD software, computational chemistry XR, ligand minimization, docking pose visualization, MMFF94 force field, UFF force field, chemical interactions visualization, protein-ligand interactions, virtual reality molecular modeling, Meta Quest scientific software, immersive drug design, medicinal chemistry tools, 3D molecular visualization, collaborative drug discovery, SBDD workflow, binding site analysis, lead optimization software, multi-frame molecular dynamics, voice commands molecular visualization, XR computational chemistry, Nanome VR, spatial computing drug discovery","nanome-v2.4.0:-early-access-release-mara-voice-commands-minimization-chem-interactions-and-more!","releases",{"id":68,"attributes":69},54,{"title":70,"content":71,"createdAt":72,"updatedAt":73,"publishedAt":74,"date":75,"description":76,"keywords":77,"slug":78,"category":53},"Meta Horizon Managed Services Goes Free: What This Means for Enterprise VR Users","On January 15th, [Meta announced](https:\u002F\u002Fwork.meta.com\u002Fhelp\u002F1964851097790493) a significant shift in their enterprise VR strategy: **Horizon Managed Services (HMS) will become completely free starting February 20, 2026**.\n\nFor those unfamiliar, HMS is Meta's required business platform for enrolling Quest devices into device management—essential infrastructure for any organization deploying VR headsets at scale.\n\n## What's Changing\n\n**The Good News:**\n- HMS will be **free to use** starting February 20, 2026\n- 
Meta will continue to fix bugs and provide support through **January 4, 2030**\n- Your existing MDM platforms ([ArborXR](https:\u002F\u002Farborxr.com\u002F), [ManageXR](https:\u002F\u002Fwww.managexr.com\u002F), etc.) will continue working seamlessly with HMS throughout this period\n\n**What to Know:**\n- Meta will **discontinue commercial SKUs** for Quest devices\n- However, you can still purchase and manage consumer Meta devices through meta.com, Amazon, Best Buy, and other retail channels\n- HMS enters \"maintenance mode\" through 2030, ensuring stability while Meta focuses on next-generation platforms\n\n## What This Means for Nanome Users\n\nIf you're running Nanome on Meta Quest devices, nothing changes immediately. Your current setup will continue working as expected through at least 2030.\n\n**For organizations with existing HMS licensing budgets:** This transition represents an opportunity to reallocate previously committed HMS licensing costs toward expanding your VR infrastructure, additional Nanome capabilities, or other strategic research technology investments.\n\n**Need bulk Quest devices for your team?** We can help coordinate purchasing and deployment. Reach out to us at support@nanome.ai or contact your account manager or application scientist directly.\n\n## Looking Ahead: Android XR\n\nAs Meta transitions HMS to maintenance mode, the industry is evolving. [**Nanome is a launch partner for Android XR**](https:\u002F\u002Fwww.linkedin.com\u002Ffeed\u002Fupdate\u002Furn:li:activity:7386593721460592640), which recently became available for enterprise deployments.\n\nIf you're interested in future-proofing your VR infrastructure and exploring migration to Android XR, we're here to support that transition.\n\n**Important compatibility note:** Nanome v2.4 and newer versions support Android XR. 
Earlier versions are not currently compatible.\n\n---\n\n**Questions about device management, bulk purchasing, or Android XR migration?**  \nContact us at support@nanome.ai or reach out through your dedicated account manager or application scientist.","2026-01-16T05:02:39.635Z","2026-03-24T17:53:30.133Z","2026-01-16T05:02:46.741Z","2026-01-16","Meta announces free Horizon Managed Services for Quest VR device management starting February 2026. Learn how this impacts computational chemistry, CADD, and drug discovery workflows using Nanome's molecular visualization platform, plus migration options to Android XR","Meta Quest enterprise, VR drug discovery, computational chemistry VR, CADD software, computer-aided drug design, molecular visualization, Meta Horizon Managed Services, HMS free, Android XR, enterprise VR management, pharmaceutical research technology, structure-based drug design, virtual reality chemistry, Quest device management, Nanome molecular modeling","meta-horizon-managed-services-goes-free:-what-this-means-for-enterprise-vr-users",{"id":80,"attributes":81},53,{"title":82,"content":83,"createdAt":84,"updatedAt":85,"publishedAt":86,"date":87,"description":88,"keywords":89,"slug":90,"category":91},"Setting Up Boltz-2 Configuration Files and Analysis with Nanome AI","# Docking Adventures with Boltz-2 and Nanome AI\n\nPredicting protein-ligand interactions can be a complex task, but tools like **MARA** are making it significantly easier to assist with and automate the preparation of configuration files, and visualize a system analytically.\n\nIn this tutorial, we explore a streamlined workflow for generating **Boltz-2 YAML configuration files** for docking experiments. 
Following a prediction, we will have MARA prepare a workspace with our docking results, each colored by Predicted Aligned Error (PAE) and predicted Local Distance Difference Test (pLDDT) from the resulting `.npz` files provided after a Boltz-2 prediction.\n\n---\n\n## YAML Configuration\n\nA dedicated workflow (Boltz Docking with Pocket Constraint) was designed to \n1. retrieve and prepare protein coordinate files, and \n2. obtain ligands and convert them directly to SMILES strings for docking experiments.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_1_a48d3fdf6c.png\" width=\"50%\">\n\u003C\u002Fp>\n\n### Primary steps of the workflow:\n\n1.  **Protein Retrieval:** Downloading a protein (specifically in **PDB format**) from the RCSB.\n2.  **Cleaning:** Removing water molecules from the crystal structure to create a **protein-only file**.\n3.  **Sequence Extraction:** Converting the amino acid residues from the PDB into a **FASTA file**.\n4.  **Ligand Processing:** Converting an SD file containing **multiple small molecules** into individual **SMILES strings**.\n    > *Multiple molecules in the SD file produce multiple YAML configuration files*.\n5.  **Config Generation:** Combining the protein sequence and SMILES string for each ligand into the final Boltz-2 configuration format.\n\n---\n\n## Preparation Considerations: Residue Numbering\n\nOne of the most important aspects of setting up a Boltz-2 prediction for docking results is the **residue numbering**. Boltz-2 starts its numbering at **one**, regardless of the original numbering in the crystal structure.\n\nFor example, while the protein **5CEN** actually starts at residue 117 in the crystal structure, the workflow identifies residues **57 and 59** (originally 174 and 176, respectively) for contact constraints because the predicted output from Boltz will be renumbered to begin from one. 
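Because deposited structures can start at arbitrary residue numbers and may contain gaps from unmodeled residues, a simple offset is not always enough; the safe approach is to enumerate the residues actually present, in order, and map each original number to its 1-based position. Below is a minimal sketch of that mapping; the chain is hypothetical (starting at 117 with one illustrative gap), not the actual 5CEN numbering:

```python
def boltz_renumbering(original_residue_numbers):
    """Map each original residue number to Boltz's 1-based sequential index."""
    return {orig: i for i, orig in enumerate(original_residue_numbers, start=1)}

# Hypothetical chain: starts at residue 117 and is missing residue 130.
residues = list(range(117, 130)) + list(range(131, 200))
mapping = boltz_renumbering(residues)
print(mapping[117], mapping[174], mapping[176])  # 1 57 59
```

Note how a single missing residue upstream shifts residue 174 to index 57 rather than the 58 a naive offset would give, which is why constraint residues should be computed from the renumbered sequence rather than by subtraction.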
This ensures Boltz applies the proper constraints to **Binder B** from **Chain A** while completing a prediction.\n\n---\n\n## Predicting from Scratch vs. Structural Inputs\n\nWhile Boltz can be forced to employ a structural template and build off of existing coordinates (the `-template` tag can be used with a path to a structural file), this workflow intentionally omits the protein PDB from the final Boltz input.\n\nThis forces a new prediction for each docking run by providing only the sequence and a ligand SMILES string. Researchers can then compare Boltz’s predicted protein fold against the known crystal structure to see how the software handles the docking across various iterations. An alignment within Nanome provides an RMSD value when comparing the folds from Boltz to the crystal structure.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_80f504398d.png\" width=\"50%\">\n\u003C\u002Fp>\n\n---\n\n## Create a Library of YAML Output Files from MARA\n\nOnce the workflow is executed, MARA generates a report confirming the creation of the input files. Each file contains:\n\n* **Protein Definition:** Specifies the chain and the full amino acid sequence.\n* **Ligand Definition:** Includes a unique ID and the specific SMILES string.\n* **Constraints:** Defines the residues and chain from the protein that will interact with the ligand.\n\nBy passing a single protein and an SD file with multiple ligands into MARA, you can instantly generate dozens of YAML files, each ready for computation in Boltz-2.\n\n---\n\n## Visualization of PAE and pLDDT Values per Structure\n\nUnderstanding the results of a protein prediction or an affinity prediction requires understanding the system from many angles. 
MARA has built-in tools to take a PDB output from Boltz and easily apply a color scheme derived from the PAE and pLDDT analysis `.npz` files produced after a Boltz prediction.\n\nIn the prompt below (*color each pdb by pae and plddt*), 15 files were uploaded at once and MARA ran all the tools necessary to produce new, appropriately colored files for each protein scaffold.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_3_75ff84dc9c.png\" width=\"50%\">\n\u003C\u002Fp>\n\n---\n\n## Visual Analysis with MARA Preview or in XR with Nanome V2\n\nAsking MARA to create a new workspace, move all the PDBs, and color them by beta factor is all it takes for the workspace setup.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_4_7a752bb2d9.png\" width=\"50%\">\n\u003C\u002Fp>\n\nTo make things easy, we wrote the PAE and pLDDT scores to the beta factor (B-factor) column in the PDB file.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_5_585451190f.png\" width=\"50%\">\n\u003C\u002Fp>\n\nClicking on a workspace ID brings you directly to that workspace, where the “**Preview**” is waiting so you can ensure the workspace is set up properly before launching into XR. 
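Writing the scores into the beta factor column is straightforward to reproduce outside MARA as well. A minimal Python sketch — the file paths and the `plddt` array key are illustrative assumptions, not MARA internals:

```python
import numpy as np

def color_by_plddt(pdb_in: str, npz_in: str, pdb_out: str) -> None:
    """Write per-residue pLDDT scores into the B-factor column of a PDB file."""
    plddt = np.load(npz_in)["plddt"]  # assumed: one score per residue
    with open(pdb_in) as src, open(pdb_out, "w") as dst:
        for line in src:
            if line.startswith(("ATOM", "HETATM")) and len(line) >= 66:
                res_idx = int(line[22:26]) - 1        # Boltz numbering starts at 1
                score = float(plddt[res_idx])
                # B-factor occupies PDB columns 61-66 (0-based slice 60:66)
                line = line[:60] + f"{score:6.2f}" + line[66:]
            dst.write(line)
```

Any viewer that colors by B-factor — Nanome included — will then pick up the scores, which is exactly the "color them by beta factor" step above.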
The Preview can be used in two modes: in the default mode, users can interact with the entry list and make representation changes using the UI, while fullscreen mode provides a more detailed view.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_6_e49581272e.png\" width=\"50%\">\n\u003C\u002Fp>\n\n> \\*Representation changes can also be made with MARA using natural language (*and voice to text where supported*) directly in a workspace, allowing users to easily switch back and forth between multiple coloring schemes.\n\nMultiple scenes containing targeted analytical visualizations can be prepared on the web ahead of an XR investigation. Users on the web can set a perspective for each scene, so when they arrive in that scene with the XR device they are already in the action rather than having to find their protein of interest and set everything up again from scratch.\n\n\u003Cp align=\"center\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fblog_7_b6792b270f.png\" width=\"50%\">\n\u003C\u002Fp>\n\n---\n\n## Concluding Remarks\n\nThe Nanome AI ecosystem acts as a single repository where all your files are at your fingertips. Comparing and contrasting multiple docking experiments with various ligands can be tedious when collecting the results from many different folders.\n\nEmploying MARA as a place where your protein sequences and molecules live, easily retrievable, makes setting up a docking run easy for anyone on your team: they will always have access to Workflows like the one in this blog, and they can easily convert a PDB to a FASTA file or any SDF to SMILES. 
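That PDB-to-FASTA conversion, for instance, needs nothing beyond the standard three-letter to one-letter residue mapping. A minimal sketch — the chain ID and file path are illustrative, and this is not MARA's implementation:

```python
# Map three-letter residue names to one-letter FASTA codes
THREE_TO_ONE = {
    "ALA": "A", "ARG": "R", "ASN": "N", "ASP": "D", "CYS": "C",
    "GLN": "Q", "GLU": "E", "GLY": "G", "HIS": "H", "ILE": "I",
    "LEU": "L", "LYS": "K", "MET": "M", "PHE": "F", "PRO": "P",
    "SER": "S", "THR": "T", "TRP": "W", "TYR": "Y", "VAL": "V",
}

def pdb_to_fasta(pdb_path: str, chain: str = "A") -> str:
    """Extract the one-letter sequence for one chain from a PDB file's ATOM records."""
    seq, seen = [], set()
    with open(pdb_path) as f:
        for line in f:
            if line.startswith("ATOM") and len(line) > 26 and line[21] == chain:
                key = line[22:27]                      # resSeq + insertion code
                if key not in seen:                    # one letter per residue
                    seen.add(key)
                    seq.append(THREE_TO_ONE.get(line[17:20], "X"))
    return "".join(seq)
```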
Tools that streamline coloring a prediction by analysis files make it that much easier to predict just one more structure.","2026-01-07T22:09:59.035Z","2026-03-24T17:53:29.992Z","2026-01-07T22:10:58.851Z","2026-01-07","Learn how to set up Boltz-2 for protein structure prediction with MSA and YAML files, plus streamline analysis using Nanome AI's MARA assistant.","Boltz-2, protein structure prediction, MSA file, multiple sequence alignment, YAML configuration, Nanome AI, MARA, molecular analysis, structural biology, AlphaFold alternative, biomolecular structure, protein modeling, computational biology, drug discovery, binding site analysis, PDB comparison, VR molecular visualization, bioinformatics tools, enzyme analysis, structure prediction workflow","setting-up-boltz-2-configuration-files-and-analysis-with-nanome-ai","tutorials",{"id":93,"attributes":94},52,{"title":95,"content":96,"createdAt":97,"updatedAt":98,"publishedAt":99,"date":100,"description":101,"keywords":102,"slug":103,"category":104},"Interface to Design the Future","![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_wcaptions_372b19161f.png)\n\n_Written by Keita Funakawa COO & Cofounder of Nanome_\n\nI was at an LA networking event last month when a gentleman with a finance background asked. \"So you work with molecules and stuff, right?\" He leaned in with genuine curiosity. \"What exactly are peptides?\"\n\nHis colleagues nodded eagerly. They'd been hearing the buzzword everywhere: in wellness circles, from their longevity-focused friends, in earnings calls from pharmaceutical giants. They knew [peptides were driving massive valuations](https:\u002F\u002Fwww.gminsights.com\u002Findustry-analysis\u002Fpeptide-therapeutics-market). But they had no idea what they actually were.\n\nSo I started with etymology. Peptide comes from the Greek péptō (πέπτω), meaning to cook, to digest. The \\-ide suffix refers to the amide bonds that link amino acids together. 
A peptide is literally a chain of amino acids connected by these bonds, and the length of that chain determines almost everything about how it behaves in your body.\n\nThen I told them something that seemed to genuinely surprise them: the difference between the Ozempic they'd heard about, the small molecule drugs in their medicine cabinet, and the monoclonal antibodies treating cancer patients comes down to atom count.\n\n**That's it. How many atoms are chained together.**\n\nTheir reaction wasn't confusion. It was recognition. Like something that had been fuzzy suddenly snapped into focus.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia3.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZHY1NTJvMWNsZ2lob3lsbWNycHd3a2pkMnA5MjhwaG9oOTQwYWM5aiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FnMlKbwRMPnOTLysXAa\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n\n# The Problem With Hand-Wavy Medicine\n\nHere's what struck me about that conversation: these were intelligent, successful people who make complex decisions for a living. They understood market dynamics, risk profiles, competitive moats. But when it came to the actual thing driving a trillion-dollar valuation, their mental model was essentially: \"*hand-wavy breakthrough does hand-wavy thing, therefore I lose weight*.\"\n\nAnd honestly? That's not their fault. It's how we talk about health and medicine.\n\nWe say \"GLP-1 agonist\" without explaining that it's a specific molecular structure, a chain of amino acids folded into a precise 3D shape, that fits into a receptor on your cells like a key into a lock. We say \"targets the glucagon pathway\" often without conveying that this means a physical object (a molecule) is literally binding to another physical object (a protein) and triggering a cascade of events inside a cell that eventually translates into you feeling less hungry.\n\nIt's all real. It's all physical. 
It's all happening at a scale we can't see, but **absolutely can visualize**.\n\nThink about how most people understand cars. They don't know gear ratios or combustion thermodynamics. But they have a rough mental model: rubber tires connect to wheels, wheels connect to axles, axles connect to an engine, engine burns fuel, car moves. That's enough to understand why a flat tire matters, why you need gas, why the engine overheating is bad.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_dc60fa6425.png\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\nWe don't have that for medicine. When someone without a scientific background takes Ozempic, they don't picture a [31-amino-acid chain binding to a receptor](https:\u002F\u002Fpubchem.ncbi.nlm.nih.gov\u002Fcompound\u002FSemaglutide) on their intestinal cells. They don't visualize the conformational change that triggers intracellular signaling. They just know \"it works for weight loss.\"\n\nThe gap between those two levels of understanding (between hand-wavy abstraction and tangible mechanism) is what we think about constantly. Because closing that gap isn't just about education. It's about how we design better drugs, communicate their value, and make informed decisions about our own biology.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_ef646d4115.png\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n# Everything Is Atoms, and Atoms Are 3D\n\nTo make this concrete, when we talk about drug categories, I like to think we're really talking about size: how many atoms, how they're arranged, how much space they occupy.\n\nSmall molecules are tiny. [Aspirin is about 21 atoms](https:\u002F\u002Fgo.drugbank.com\u002Fdrugs\u002FDB00945). 
These can slip through cell membranes easily, which is why you can swallow a pill and have it work throughout your body. But their small size limits how precisely they can interact with large, complex protein targets.\n\nPeptides are chains of amino acids linked by amide bonds. Semaglutide (Ozempic) is 31 amino acids, maybe 500 atoms total. [Tirzepatide (Mounjaro) is 39 amino acids](https:\u002F\u002Fpubchem.ncbi.nlm.nih.gov\u002Fcompound\u002F156588324). These are large enough to bind protein targets with high specificity, but small enough to be synthesized and modified with precision.\n\nAntibodies are massive: around 20,000 atoms, roughly 150,000 Daltons. They're incredibly specific, which is why they're used for targeted cancer therapies, but their size means they can't enter cells and are expensive to manufacture.\n\n**Here's the thing: all of these exist in three-dimensional space.** They fold, they flex, they rotate. A peptide isn't a string of letters (HAEGTFTSDV...). It's a physical object with a shape. And that shape determines everything: whether it binds to its target, how tightly, for how long, what side effects it causes.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FUntitled_1_974751cf2e.gif\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\nWhen I explain this to people and they finally see it (when the abstraction becomes tangible), something clicks. \"Oh, it's like... a thing. A physical thing. That does a physical thing to another physical thing.\"\n\nYes. Exactly.\n\nAnd yet that's exactly where drug design happens now.\n\n[Tirzepatide isn't effective because someone discovered it in a plant](https:\u002F\u002Fwww.researchgate.net\u002Ffigure\u002FMolecular-recognition-of-tirzepatide-by-GIPR-and-GLP-1R-a-Structural-comparison-of_fig3_358853201). 
It's effective because researchers designed a specific sequence of amino acids, optimized the chain length, modified certain residues to improve stability, and engineered the 3D fold to bind two different receptors simultaneously. Every decision was made at the nano scale level.\n\nThis is the transition happening in medicine: from discovering molecules to engineering them. From finding things that work to designing things that work. And that transition requires a different way of seeing.\n\n# The Atomic Precision Revolution\n\nThree shifts are driving this:\n\n### 1\\. We've moved from discovery to design.\n\nFor most of pharmaceutical history, drug discovery was trial and error. Screen thousands of compounds, see what sticks, and optimize.\n\nThat era is ending. [Today we have AlphaFold predicting protein structures with near-perfect accuracy](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fs41586-021-03819-2), computational chemistry [modeling molecular interactions before synthesis](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fs41586-023-06415-8), cryo-EM [revealing exactly how drugs bind to targets](https:\u002F\u002Fshuimubio.com\u002Fblogs\u002Fcryo-em-in-drug-discovery-target-to-lead-optimization), and [solid-phase peptide synthesis allowing us to build custom amino acid chains on demand](https:\u002F\u002Fwww.bioduro.com\u002Fnews-resources\u002Finsights\u002Fsolid-phase-or-liquid-phase-how-has-peptide-synthesis-revolutionized-drug-discovery.html).\n\nWe're not just finding molecules anymore. We're designing them, atom by atom, with clear intent.\n\nThis is something we've been tracking closely at Nanome. Nearly 6 years ago, we published a video breaking down semaglutide and the GLP-1 mechanism, showing the actual molecular structure, how it binds, why it works. This was years before Ozempic became a household name. At the time, few people outside pharma cared about peptide visualization. What's changed isn't the science. 
It's the recognition that this level of understanding matters.\n\n\u003Ciframe width=\"560\" height=\"315\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002FkkrnEJ4dJ9Y?si=2vHOFafNKPWiBys1\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n\n## 2\\. The results are already transforming medicine.\n\nGLP-1 agonists like semaglutide and tirzepatide are [peptides that mimic natural gut hormones](https:\u002F\u002Fwww.diabetes.org.uk\u002Fabout-diabetes\u002Flooking-after-diabetes\u002Ftreatments\u002Ftablets-and-medication\u002Fglp-1\u002Fsemaglutide\u002Fwegovy). They slow gastric emptying, regulate insulin and glucagon, and act on the brain to reduce appetite. Eli Lilly's tirzepatide alone [generated over $10 billion in sales in 2024](https:\u002F\u002Ffinviz.com\u002Fnews\u002F237194\u002Feli-lilly-hits-1-trillion-milestone-etfs-to-invest-in) and [just pushed Lilly past a $1 trillion market cap](https:\u002F\u002Fwww.reuters.com\u002Fbusiness\u002Fhealthcare-pharmaceuticals\u002Flilly-becomes-first-drugmaker-join-trillion-dollar-club-weight-loss-demand-boom-2025-11-21\u002F).\n\n**But GLP-1s are just the beginning.** Antibody-drug conjugates combine antibody targeting precision with small molecule killing power. PROTACs hijack the cell's own protein disposal system to [eliminate previously \"undruggable\" targets](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fs41573-021-00371-6). mRNA vaccines proved they could be developed at pandemic speed.\n\nEach represents a different point on the molecular weight spectrum. All share one thing: they were designed with atomic precision to do something previously impossible.\n\n### 3\\. 
The old categories are breaking down.\n\nWe now have cyclic peptides with small molecule-like oral bioavailability, stapled peptides with enhanced stability that bridge the peptide-protein gap, and macrocycles that span categories entirely.\n\nInstead of asking \"should we make a small molecule or a biologic?\", drug designers now ask: \"What's the optimal molecular architecture to hit this target with the right pharmacokinetics, stability, and safety profile?\"\n\nSometimes that's 20 atoms. Sometimes it's 20,000. Increasingly, it's something in between that didn't exist in nature and wouldn't have been possible to design a decade ago.\n\n# Closing the Tangibility Gap: Why The Interface Matters\n\nHere's why I think all of this matters beyond the science.\n\nWhen people can't visualize what's happening at the molecular level, they can't really evaluate claims about medicine. They can't distinguish between genuine breakthroughs and marketing hype. They can't understand why one drug costs $1,000\u002Fmonth and another costs $10. They can't participate meaningfully in decisions about their own health.\n\nThe tangibility gap isn't just an education problem. It's a communication problem. And it's one that gets harder as molecular design gets more sophisticated.\n\nThis is fundamentally why we built Nanome. Our XR molecular visualization platform exists because we believe the most important insights in drug discovery happen when you can actually see molecular structures in 3D, at atomic resolution, and interact with them spatially. When a medicinal chemist can reach out and rotate a peptide, zoom into a binding pocket, show a colleague exactly where a modification changes the interaction surface, that's when abstraction becomes tangible. That's where the actual insight occurs.\n
\nWe've watched drug discovery teams using Nanome cut weeks from lead optimization cycles because design discussions happen directly on the structures instead of over static slides. The limiting factor in molecular design isn't data anymore (we have petabytes of structural data). The limiting factor is interface: whether you can actually manipulate what you're trying to build. When you can see the thing you're building at the scale you're building it, you make different decisions.\n\nWhen the abstraction becomes tangible, everything changes: how fast you iterate, how clearly you communicate, how confidently you decide.\n\n# The Future Is Atomically Precise\n\nBack at that networking event, after we'd talked through peptides and atom counts and why molecular shape matters, I decided to push a little further.\n\n\"You know the H100?\" I asked. The NVIDIA chip powering the AI revolution. Of course they knew it.\n\n\"It's built on a [4-nanometer process](https:\u002F\u002Fwww.pny.com\u002Fnvidia-h100). A single transistor gate is about the width of four aspirin molecules lined up side by side.\"\n\nI watched their expressions shift. The same look of recognition from earlier, but deeper now.\n\nThe chip driving every AI breakthrough they'd been reading about, every LLM, every billion-dollar compute cluster, it exists at the same scale as the molecules in their medicine cabinet. The transistors switching on and off billions of times per second are operating at the same level of atomic precision as the peptides regulating their metabolism.\n\nAnd suddenly they saw what I see every day:\n\nEverything is made of molecules. Everything that matters in the future (computing, medicine, materials, energy) is converging on molecular-scale design. 
The companies that win will be the ones that can engineer at this level with precision and intent.\n\nThe atomic revolution isn't coming. It's here. It's in the chips powering AI. It's in the peptides treating obesity. It's in the antibody-drug conjugates targeting cancer cells. It's in the mRNA vaccines that responded to a pandemic in months instead of years.\n\nThe question isn't whether the future will be built at atomic scale. It's whether we'll have the tools to see it, design it, and communicate it clearly.\n\nThat's what Nanome is for. We're building the interface to design the future.\n\n\n","2025-11-25T20:27:42.582Z","2026-03-24T17:53:28.055Z","2025-11-25T20:32:20.662Z","2025-11-25","Bridging the gap between molecular abstraction and atomic reality: insights for CADD scientists and computational chemists on the precision design revolution. Discover how visualizing peptides and protein structures in 3D is transforming the engineering of GLP-1s, ADCs, and small molecules.","CADD, computer aided drug design, computational chemistry, chem informatics, cheminformatics, medicinal chemistry, structure based drug design, structure enabled drug design, molecular modeling, molecular visualisation, molecular visualization, 3D molecular visualization, XR molecular visualization, VR drug discovery, VR for drug discovery, AR drug discovery, peptides, peptide therapeutics, GLP 1 agonists, semaglutide, tirzepatide, PROTACs, antibody drug conjugates, ADCs, small molecule drug discovery, biologics design, macrocycles, AlphaFold, protein structure prediction, cryo EM drug discovery, solid phase peptide synthesis, atomic precision, atomic scale design, binding pocket analysis, binding site analysis, structure activity relationships, SAR, lead optimization, virtual reality for scientists, VR for chemists, Nanome, Nanome XR platform, drug discovery collaboration tools, pharma R&D visualization, 3D interface for drug design, h100, peptide, 
ozempic","interface-to-design-the-future","philosophy",{"id":106,"attributes":107},51,{"title":108,"content":109,"createdAt":110,"updatedAt":111,"publishedAt":112,"date":113,"description":114,"keywords":115,"slug":116,"category":66},"Important Security Update for Nanome – Unity Vulnerability Patch","## Overview\n\nUnity recently disclosed a vulnerability in its engine that affects VR applications built after **Unity v2017.1**.  \nWhile Unity reports no evidence of exploitation and no impact to users, we want to ensure all Nanome users are aware of the issue.  \n\nWe recommend updating:\n- **Horizon OS** to v79  \n- **Windows Defender**  \n- The **latest versions of Nanome**, which include a patch for the vulnerability.\n\n---\n\n## 🔧 What’s Been Done\n\n### For Meta Quest Users\nMeta has already addressed this vulnerability for Meta Horizon OS users through automatic over-the-air updates in **v79**.\n\n### For PC VR Users\nMicrosoft released an update to [Windows Defender](https:\u002F\u002Fwww.microsoft.com\u002Fen-us\u002Fwdsi\u002Fdefenderupdates) as of **October 3** to address this vulnerability.\n\n### Additional Layer of Security\nWith the platform-level updates above in place, the vulnerability is effectively mitigated.  \nAs an additional layer of security, we have also released patched versions of Nanome with updated builds now available:\n\n- **Nanome v1.24.6** — now patched and available on the Meta Store and [nanome.ai\u002Fsetup](https:\u002F\u002Fnanome.ai\u002Fsetup)  \n- **Nanome v2.3.1** — accessible through closed beta channels on the Meta Store (currently the only official distribution method).  \n  If you received a v2 APK through other means, please contact **support@nanome.ai** or your Nanome representative.\n\n---\n\n## ✅ Recap: Recommended Actions\n\n1. **Update Horizon OS (Meta Quest users):** Ensure your Meta Quest is running Horizon OS v79 or later  \n2. 
**Update Windows Defender (PC VR users):** Ensure Windows devices are current with the October 3 security update  \n3. **Update Nanome:** Download the latest patched version (v1.24.6 or v2.3.1) from the Meta Store, [nanome.ai\u002Fsetup](https:\u002F\u002Fnanome.ai\u002Fsetup), or through your Nanome representative  \n4. **Review Unity’s guidance:**  \n   - [Security update advisory](https:\u002F\u002Funity.com\u002Fsecurity\u002Fsept-2025-01)  \n   - [Remediation guide](https:\u002F\u002Funity.com\u002Fsecurity\u002Fsept-2025-01\u002Fremediation)\n\n---\n\n## 🏢 For Enterprise Deployments\n\nIf your devices operate within a closed network environment and don’t have external network exposure, your risk is already significantly mitigated.  \nHowever, we still recommend updating to the latest patched version as a best practice.\n\nWe have prioritized this update to ensure your team can continue using Nanome with confidence.  \n\nIf you have questions about updating your deployment, please reach out to  \n📧 [support@nanome.ai](mailto:support@nanome.ai) or your Nanome representative.\n\n---\n\n\nWe want to thank all our users for your continued trust and support. Your feedback helps us keep improving Nanome and delivering secure, reliable, and innovative tools for molecular design and collaboration.\n\nIf you experience any issues or have questions about this update, please don’t hesitate to reach out — our team is here to help.\n\nThank you again for being part of the Nanome community.\n\n– The Nanome Team","2025-10-09T17:38:04.602Z","2026-03-24T17:53:29.887Z","2025-10-09T17:39:45.123Z","2025-10-09","Security update for Nanome VR addressing a Unity engine vulnerability. 
Learn what’s been fixed, recommended actions for Meta Quest and PC VR users, and how to update Nanome to the latest patched versions for maximum protection.\n","Nanome, Nanome VR, Unity vulnerability, Unity security patch, VR security update, Meta Quest, Horizon OS v79, Windows Defender update, Nanome v1.24.6, Nanome v2.3.1, VR application security, Unity engine, Nanome enterprise, VR software update, Unity vulnerability fix, Nanome support, molecular design, Nanome AI, virtual reality collaboration","important-security-update-for-nanome-unity-vulnerability-patch",{"id":118,"attributes":119},50,{"title":120,"content":121,"createdAt":122,"updatedAt":123,"publishedAt":124,"date":125,"description":126,"keywords":127,"slug":128,"category":66},"Nanome V2.3.0: New Tools (Builder and Selection), Sequence Menu, and Web Preview","\u003C!-- Title: Nanome V2.3.0: New Tools (Builder and Selection), Sequence Menu, and Web Preview -->\n\nThis is a monumental update, literally our biggest v2 update yet, and it represents a transformative leap in molecular visualization and creation. 
After introducing our first 3D tool with measurements in v2.2, we're thrilled to unveil **3D molecular building in v2**, alongside revolutionary web-based workspace previews that make Nanome more accessible than ever before.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia1.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExenN1OGxhZm1xamVqMmg1eWxuN2FrMmg0bHM2azVmeTdnN2JycjViMyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FDbn1NzayTpW5AN3ZjB\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n---\n\n## The Ligand Builder Tool: Molecular Construction Reimagined\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia4.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExdDdjem1ndzVtcXgzMWZ1endpdG5ydHc4Z2hvaXVsMTl5eGVqdGd3dCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002Fd3VGWzWqO55FF4O3PS\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\nBuilding molecules is better than ever. While v1 allowed for more free-form building with reliance on minimization, v2 introduces **precision-based construction** with fixed bond lengths and angles. The new Ligand Builder features three fundamental geometry modes: linear, trigonal planar, and tetrahedral, ensuring chemically accurate structures from the start. \n\nSimply click on an atom to reveal guided placement options based on your selected geometry, with automatic rotation for proper bond angles. Build your molecule atom by atom, add fragments from our library, and leverage **full undo\u002Fredo support from day one** so you can experiment freely and correct mistakes instantly without losing your work. Minimization is available today through MARA tools and workflows, which can grab ligands from workspaces, run minimization, and push results back to v2. 
Native in-app minimization is on the roadmap, and the builder is designed for seamless integration between XR building and computational refinement.\n\n---\n\n## Web-Based 3D Workspace Preview: Nanome Everywhere\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia4.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZGpmZmlobTZxaGR4bWgzZHp1cGxjdDFjNmIxdjJrNnFxMXA0ZnFyZiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002Fg7cKPURA8cLU76rnJu\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nRemember the 2D mode on Windows PCs from v1? We've completely reimagined accessibility. **You can now preview your full VR workspace directly from the web**—complete with interactive 3D navigation, scene switching, structure zooming, and POV control. This means complete hardware support across all devices, from high-end workstations to mobile phones and tablets. Nanome has never been more accessible, bringing immersive molecular visualization and collaboration to any device with a browser.\n\n---\n\n## The Selection Tool: Precision Meets Intuition\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia3.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMDVjbDU1a3l4MWdsZWlpMW90M2Nib3Bqc2U2cmRlaTdybzZlYnlmcSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FhN7EjqS0AndWcgUfeT\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nOur new Selection Tool transforms how you interact with molecular structures in 3D space. Select atoms, residues, or entire regions with natural hand movements, then instantly convert selections into custom components. 
Hovering reveals contextual information, making structural analysis more intuitive than ever.\n\n---\n\n## The Sequence Menu: A Game-Changer for Structure Navigation\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fsequence_menu_two_sequences_selected_c642d0111c.png\"\n       style=\"width:100%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nThe new **Sequence Menu** is a breakthrough for navigating and understanding protein structures. This interactive sequence viewer displays the primary structure of your molecules with unprecedented clarity. Simply click and drag to select sequence spans—whether you need a specific alpha helix, a binding pocket, or a functional domain—and instantly convert those selections into custom components. \n\nFor structural biologists and medicinal chemists working with protein-ligand complexes, this means you can now rapidly identify and isolate regions of interest directly from the sequence. No more hunting through 3D space or writing complex selection scripts. See an interesting motif in the sequence? Select it, create a component, and immediately focus your analysis. This seamless connection between sequence and structure fundamentally changes how you explore biomolecules in VR.\n\n---\n\n## Property Annotations: Data-Driven Design in 3D Space\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExbndsNGI0bTFpYnh3emtwc3l0a3g5dnV6cWxjOWJmZWpmc3FpbWR3YSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FphIl9NDDjqU77MOTmS\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nTransform how you visualize structure-activity relationships with our new **3D property annotations**. 
Any property from the ligand menu can now be displayed as floating labels directly in your workspace. Want to see LogP values hovering above each compound? Need to track the number of hydrogen bonds in the binding pocket? Comparing molecular weights across a series? This feature has you covered.\n\nFor medicinal chemists engaged in lead optimization, this means instant visual access to critical molecular properties without menu diving. Imagine walking through a virtual room where each molecule displays its binding affinity, ADMET properties, or synthetic accessibility score—all customizable with full color support for quick visual categorization. During collaborative sessions, teams can now literally see the data that drives decisions, making SAR discussions more intuitive and productive. Properties update in real-time as you modify structures, creating a truly dynamic environment for molecular design.\n\n---\n\n## Enhanced Grab Mechanics: Natural Interaction at Any Distance\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia2.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExNTBoNDRqbGQ3Znc3OHh4cXE0cWZlM2V6dHJqYzQ1bTNxN2ZxMWsyaiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FnHHowYY1BqiIY87N0N\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nWe've completely reimagined how you manipulate structures from a distance. V1 users may remember the cumbersome process of grabbing a distant structure then using the joystick to push and pull it closer. That's history. Now, simply **point at any structure and grab, and it comes directly to your hand**. The system is smart enough to know when you're inside a structure and prevents unwanted movement, maintaining your orientation when it matters most.\n\nThis seemingly simple change has profound implications for workflow efficiency. 
Navigate large protein complexes, compare multiple conformations, or arrange structures for presentation—all with natural, intuitive gestures. And for those who prefer the classic controls, this feature can be disabled in the settings menu. It's all about giving you the control you need, the way you want it.\n\n---\n\n## Notable Enhancements\n\n### Visual and Interaction Improvements\n- **Wireframe highlighting** when hovering over structures in UI or 3D space\n- **Haptic feedback** for all UI interactions, enhancing the tactile experience\n- **Hydrogen visibility modes** for customizable structural views\n- **Multi-model text annotations** with full color support\n- **Site components** support with dedicated cross-entry component organization\n- **Improved UI contrast** with clearer item separation and better collapsed layouts\n\n### Building and Editing Features\n- **Room-wide autoplay** for cycling through models collaboratively\n- **Build\u002FReplace\u002FDelete modes** with contextual visual feedback\n- **Elements and Fragments library** for rapid construction\n- **Smart valence checking** prevents chemically impossible structures\n- **Persistent undo\u002Fredo** throughout the building process\n\n### Collaboration and Accessibility\n- **Instant reconnection** when resuming the app after connection loss\n- **PDB export** directly from entries\n- **Embedded 3D preview** on workspace web pages\n- **Dramatic performance improvements** with intelligent rendering optimization\n\n---\n\n## Bug Fixes and Stability\n\nWe've squashed numerous bugs including fixes for hydrogen bonding in nucleic acids, surface rendering inconsistencies, UI raycast behavior, workspace invitations, and permission management. The complete list includes over 50 targeted fixes ensuring a stable, reliable experience.\n\n---\n\n## Looking Forward\n\nVersion 2.3.0 represents not just an update, but a platform evolution. 
We're building the foundation for even deeper integration between immersive visualization and computational chemistry, including native minimization and direct MARA tool triggering within XR. As we continue to blur the lines between web and XR experiences, we're committed to making powerful molecular tools accessible to scientists everywhere, on any device.\n\n**Experience the future of molecular design today.** Request a demo or early access to v2 at \u003Chttps:\u002F\u002Fnanome.ai\u002Fdemo\u002F>.\n\nHappy building!  \nThe Nanome Team","2025-09-23T22:39:41.935Z","2026-03-24T17:53:29.771Z","2025-09-23T22:51:43.023Z","2025-09-23","Nanome v2.3.0 delivers 3D ligand building, web-based VR workspace preview, and precision molecular design tools for drug discovery and computational chemistry","nanome, VR drug discovery, molecular visualization, ligand builder, 3D molecular modeling, computational chemistry, CADD, computer aided drug design, structure based drug design, SBDD, molecular design software, VR chemistry, augmented reality molecules, medicinal chemistry tools, cheminformatics, computational drug discovery, molecular builder, protein ligand visualization, virtual reality pharma, XR molecular tools, web based molecular viewer, 3D structure viewer, drug design platform, molecular docking visualization, fragment based drug design, FBDD, structural biology software, biomolecular visualization, collaborative drug discovery, molecular workspace, chemical structure builder","nanome-v2.3.0:-new-tools-(builder-and-selection)-sequence-menu-and-web-preview",{"id":130,"attributes":131},49,{"title":132,"content":133,"createdAt":134,"updatedAt":135,"publishedAt":136,"date":137,"description":138,"keywords":139,"slug":140,"category":53},"From PDB to Pose: Integration of Nanome’s MARA and Cresset’s Flare","# **From PDB to Pose: Integration of Nanome's MARA and Cresset's Flare** \n\nEvery small‑molecule project starts with a question: can we place the right chemistry in the 
right pocket, quickly and without giving up scientific control? This demonstration follows that exact journey. Fujitsu, our reseller and systems integrator in Japan, set out to connect MARA with Flare™ from Cresset® to enable scientists to move from an accession code to validated poses in a single conversation. The images below are frames from a Fujitsu recording of that workflow in action.\n\n## **Why Flare**\n\nFlare is Cresset's complete CADD solution for ligand-based and structure-based design, high-resolution 3D visualization, and in-depth analysis of ligand series and biological targets. Combining robust computational methods with AI\u002FML, it is widely used across pharma, agrochem, and biotech companies to generate novel, active molecules with optimum efficiency. Scientists choose Flare because it balances prediction quality, interactive control, and traceability with ease of use and cost efficiency. Two pieces were especially valuable to Fujitsu in this integration: the pyflare module and the Flare Python API. They make third-party integration straightforward, and Flare's advanced methods, including docking and scoring, Electrostatic Complementarity™, and Free Energy Perturbation, enable scientists to make fast, informed decisions. Parameters are explicit, results are inspectable, and workflows are proven in production.\n\n## **Why this milestone matters**\n\nMARA already orchestrates open source tools such as RDKit, P2Rank, and AutoDock Vina, which is ideal for rapid prototyping. This integration shows the same agentic approach working cleanly with a closed source, enterprise grade platform. That unlocks governed data paths, vendor support, and the compliance posture that global pharma and biotech expect. 
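As a purely illustrative sketch of that pattern, an agentic orchestrator can wrap every tool call, commercial or open source, in the same audited step. The function and file names below are placeholders, not MARA's actual interface:

```python
# Illustrative only: record every tool invocation with its inputs and
# outputs so a run is fully traceable and repeatable. The step names and
# file names are placeholders, not MARA's real API.
audit_log = []

def run_step(tool, fn, inputs):
    """Run one pipeline step and append a traceable record of it."""
    outputs = fn(inputs)
    audit_log.append({"tool": tool, "inputs": inputs, "outputs": outputs})
    return outputs

# e.g. a protein-preparation step that turns 1fkg.pdb into 1fkg_P.pdb
prepared = run_step("flare.prepare", lambda i: i.replace(".pdb", "_P.pdb"), "1fkg.pdb")
# prepared == "1fkg_P.pdb"; audit_log now holds one fully traceable entry
```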
In short, MARA can run commercial engines side by side with open source utilities inside one private, audited pipeline.\n\n## **Protein context: FKBP12 and 1FKG**\n\n1FKG is human FKBP12, the FK506 binding protein that acts as a peptidyl prolyl cis\u002Ftrans isomerase and is present in many tissues. FKBP12 forms complexes with tacrolimus and sirolimus, which modulate calcineurin and mTOR pathways. This makes it relevant to immunology, transplant medicine, and oncology research. The 1FKG entry contains a high affinity synthetic ligand, SB3, and was solved at 2.0 Å resolution. It is a practical demo target because the pocket is well defined, the protein is small and stable, and there is strong benchmarking data for docking and SAR. Using 1FKG lets us show how MARA moves a real structure through preparation, cavity detection, and docking, then produces poses that scientists can inspect or compare with the crystallographic reference.\n\n## **How the integration works at a glance**\n\nMARA orchestrates each action, passes correctly formatted files into Flare, and records every step with inputs and outputs. This hand-off is performed programmatically via pyflare and the Flare Python API, allowing automated execution of protein preparation, docking, and scoring without brittle manual steps. Scientists keep control of parameters while skipping brittle file handoffs. The flow runs on premises or in a private cloud with SSO and audit logs.\n\n## **Walkthrough of the 1FKG demo**\n\n### **1\\) Retrieve the structure**\n\nA user asks MARA to download PDB 1FKG. MARA pulls the entry and stores it as 1fkg.pdb, then presents an interactive 3D preview so the user can confirm the target before proceeding.\n\n![Figure 1. MARA interface showing successful download of PDB 1FKG with 3D molecular visualization preview of the FKBP12 protein structure.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F01_dc9db3d34e.png)\n\n![Figure 2. 
Task completion panel displaying the saved 1fkg.pdb file with confirmation message for PDB structure retrieval.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F02_03abfb684b.png)\n\n### **2\\) Prepare the protein with Flare**\n\nMARA runs a Flare protein preparation step on 1fkg.pdb, producing a docking‑ready file named 1fkg\\_P.pdb. Preparation typically covers hydrogen addition, protonation assignment, residue and ligand filtering, and other conditioning that makes a structure suitable for docking.\n\n![Figure 3. MARA plan editor interface for Flare protein preparation workflow with input parameter specification for 1fkg.pdb.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F03_46dd8495cc.png)\n\n![Figure 4. Protein preparation completion panel showing input file (1fkg.pdb) and prepared output file (1fkg_P.pdb) with live 3D molecular preview.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F04_65081b4887.png)\n\n### **3\\) Detect likely binding pockets**\n\nTo guide search space selection, MARA calls a pocket predictor on the prepared protein. In this demo Fujitsu used P2Rank. The result includes a probability and a pocket center.\n\n![Figure 5. P2Rank cavity detection setup interface showing input parameters for binding pocket prediction on prepared 1fkg_P.pdb structure.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F05_2e75f05ffe.png)\n\n![Figure 6. P2Rank results table displaying identified binding pocket with probability score (0.422) and 3D center coordinates (-27.76, 26.1046, 3.7486).](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F06_bd6579c596.png)\n\n### **4\\) Identify any co‑crystallized ligands**\n\nMARA parses the macromolecule to list three‑letter ligand codes. For 1FKG, the code SB3 is detected.\n\n![Figure 7. 
Ligand extraction interface and results panel confirming detection of co-crystallized ligand SB3 from the 1fkg.pdb structure.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F07_ec171dac77.png)\n\n### **5\\) Retrieve the reference ligand as SDF**\n\nWith the code in hand, MARA downloads the SDF for SB3 for use as a docking reference and displays a 3D preview.\n\n![Figure 8. Ligand download confirmation showing saved ligand_SB3.sdf file with 3D structural preview of the SB3 reference compound.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F08_8179cb9f07.png)\n\n### **6\\) Dock SB3 into 1FKG using Flare**\n\nMARA passes the prepared protein, the ligand SDF, and the cavity center into Flare with explicit box dimensions and pose count. The demo shows these parameters: center x = -27.76, y = 26.1046, z = 3.7486, box length 16 Å, box width 16 Å, box height 16 Å, maximum conformations 10.\n\n![Figure 9. Flare docking parameter setup form showing input files (1fkg_P.pdb protein, ligand_SB3.sdf), binding site coordinates, and docking box dimensions (16 Å³).](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F09_2bf49a106e.png)\n\n![Figure 10. Docking completion results showing successful SB3 docking into 1fkg with output file (1fkg_P_ligand_SB3_D.sdf) and 3D viewer displaying Model 1 of 7 generated poses.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F10_81af0c906f.png)\n\n### **7\\) Create a combined complex for review**\n\nFor easier sharing and visual inspection, MARA merges the prepared protein with all docked poses into a single multi‑frame PDB.\n\n![Figure 11. Complex merging workflow interface showing combination of prepared protein file (1fkg_P.pdb) with docked ligand poses (1fkg_P_ligand_SB3_D.sdf).](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F11_5c2c53145c.png)\n\n![Figure 12. 
Merged complex output panel displaying the combined PDB file (complex_1fkg_P_ligand_SB3_D.2025_08_21_03_00_47.333241.pdb) with orange cartoon protein structure and SB3 poses.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F12_e09b0125ad.png)\n\n### **8\\) Browse through poses**\n\nThe complex opens with an interactive model selector so the scientist can step through poses and zoom into the pocket.\n\n![Figure 13. Interactive pose browser showing protein-ligand complex with model selector set to \"Model 2 of 7\" for systematic evaluation of different docking conformations.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F13_db4016b2d2.png)\n\n![Figure 14. Detailed binding site view showing SB3 ligand positioned within the FKBP12 active site cavity (Model 3 of 7) for structure-activity relationship analysis.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F14_0ac034c936.png)\n\n## **Why this matters for teams**\n\nThis pattern saves time without hiding science. Parameters remain visible, intermediate files are preserved, and the full run is repeatable. Teams can template the sequence and apply it to other targets, then extend it with enumeration, rescoring, MMGBSA or FEP, PLIF analysis, and immersive inspection in Nanome. The integration runs in a controlled environment so data never leaves your boundary.\n\n## **Acknowledgments and next steps**\n\nThank you to Fujitsu for the implementation and demo, and to Cresset for building Flare, a powerful ligand and structure‑based design platform. Flare is a trademark of Cresset. All other names are the property of their respective owners.\n\n## **Get in touch**\n\nIf you are a Japanese biopharma company and would like to explore this workflow, please reach out directly to Fujitsu. 
For organizations in the US or Europe interested in seeing MARA and Flare in action, contact us to arrange a demonstration.","2025-09-18T21:25:49.163Z","2026-03-24T17:53:28.375Z","2025-09-18T21:27:57.313Z","2025-09-18","Streamline drug discovery with MARA-Flare integration. Automate protein preparation, pocket detection, and molecular docking from PDB to pose in one workflow.","molecular docking, drug discovery, CADD software, protein preparation, structure-based drug design, computational chemistry, cheminformatics, MARA AI, Cresset Flare, PDB structure, binding pocket detection, pose generation, pharmaceutical research, biotech tools, medicinal chemistry, molecular modeling, virtual screening, ligand docking, protein-ligand interaction, automated workflow, enterprise drug discovery, FKBP12, chemical informatics, structural biology, computer-aided drug design, molecular visualization, docking automation, pharma software integration, API integration, bioinformatics tools, chemical biology, drug development pipeline, molecular dynamics, binding affinity, structure activity relationship, SAR analysis, lead optimization, hit identification, target preparation, compound screening, molecular recognition","from-pdb-to-pose:-integration-of-nanome's-mara-and-cresset's-flare",{"id":142,"attributes":143},48,{"title":144,"content":145,"createdAt":146,"updatedAt":147,"publishedAt":148,"date":149,"description":150,"keywords":151,"slug":152,"category":153},"V2.2 Usecase Blog: How a biochemist leveraged Measurement Tools in XR to Streamline Protein Design","# How a biochemist leveraged Measurement Tools in XR to Streamline Protein Design\n\nProtein design is a meticulous and intricate process, often hinging on precise measurements and visualizations. 
For Joe Laureanti, PhD, a biochemist deeply experienced in protein engineering, the journey from traditional computational tools to immersive XR (AR\u002FVR\u002FMR) solutions highlights the transformative impact of intuitive technology.\n\n---\n\n## The Challenge of 2D Protein Visualization\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpymol_measurements_af1d540698.png\"\n       style=\"width:60%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nEarly in his research career, Joe faced significant hurdles visualizing proteins with his team. He recounted a particularly challenging scenario at a lab meeting:\n\n“I was trying to show an image of a protein binding site that included measured distances critical for our project. But my 2D representation led to confusion—one line looked longer but was actually shorter. It sparked a 30-minute debate among PhDs about whether the measurements could be trusted.”\n\nThis incident was more than just a frustrating meeting; it encapsulated a core problem scientists frequently encounter—accurately conveying 3D structural data in a flat 2D interface.\n\n---\n\n## The Shift to Virtual Reality\n\nDiscovering Spatial Computing (AR\u002FVR\u002FMR) completely changed Joe’s research experience. “I realized if we could just all stand inside the molecule, nobody would argue about distances,” he explained. 
Joe began exploring XR tools like UnityMol and later, Nanome.\n\n---\n\n## The Breakthrough: Nanome V2’s Measurement Tool\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia1.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExdDgxZWMxd2NzZG1oYXZkYXoycDFtOHpod3lyZGNvY3dtbzhhbG9peSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FoVMDxsmhfZiYtI6X2t\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nWith Nanome v2, Joe re-experienced a fundamental shift. The new 3D Measurement Tool provided an intuitive and precise method to select atoms and measure distances instantly:\n\n“In version 2 of Nanome, it’s literally just touch and click—you touch an atom, click, touch another, click again, you instantly have your distance. No more struggling with angles on a flat screen.”\n\n---\n\n## Immediate and Tangible Results\n\nNanome V2’s Measurement Tool dramatically accelerates research workflows. For Joe, the impact was clear:\n\n- **Time Efficiency:** Eliminating the (sometimes) week-long waits for computational chemists to return measurements. Anyone can now perform instant, real-time measurements in collaboration meetings.\n- **Enhanced Accuracy:** By interacting directly with proteins in immersive 3D space, teams can ensure more precise and confident decisions, avoiding costly misunderstandings.\n- **Improved Collaboration:** Teams easily share insights directly within the XR environment, streamlining discussions and enhancing mutual understanding.\n\n---\n\n## Joe’s Journey Comes Full Circle\n\nReflecting on his journey, Joe emphasizes the value XR tools bring to molecular research:\n\n“My protein design projects depended heavily on accurate spatial understanding. XR made that possible. 
The new Measurement Tool in Nanome V2 enables XR design efforts—an intuitive solution for a long-standing challenge.”\n\nFor Joe Laureanti and researchers like him, Nanome V2 isn’t merely an incremental upgrade; it’s an innovative leap forward, redefining how scientists visualize, measure, and collaborate in the molecular world.\n\n---\n\n\u003Cdiv style=\"text-align: center; padding: 2% 0;\">\n\u003Ciframe width=\"560\" height=\"315\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002F2lWhJmxGX7k?si=YTyaPBxjp_z0UHfy\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n\u003C\u002Fdiv>\n\n---\n\nEnjoyed this blog? For a deeper dive into Joe's fascinating journey, the groundbreaking v2.2 updates, and a wealth of additional insights, be sure to check out The Science Metaverse Podcast. Specifically, tune into Episode 53, where Joe himself shares his experiences and the exciting advancements in detail.\n\nRequest a demo today and discover how Nanome v2 can streamline your research, accelerate your discoveries, and unlock new possibilities in biochemistry and beyond.\n\nhttps:\u002F\u002Fnanome.ai\u002Fdemo\u002F","2025-07-22T20:51:40.444Z","2026-03-24T17:53:28.742Z","2025-07-22T20:51:41.487Z","2025-07-22","Overcome the limits of 2D visualization in drug discovery. 
See how an intuitive XR measurement tool provides precise 3D data for structural biology and CADD","drug discovery, industrial research, computational chemistry, medicinal chemistry, structural biology, computer-aided drug design (CADD), cheminformatics, bioinformatics, lead optimization, hit-to-lead, structure-activity relationship (SAR), ADMET, preclinical development, molecular modeling, molecular docking, virtual screening, structure-based drug design (SBDD), ligand-based drug design (LBDD), pharmacophore modeling, QSAR, molecular dynamics, x-ray crystallography, cryo-EM, NMR spectroscopy, protein-ligand interactions, small molecules, biologics, target identification, target validation, high-throughput screening (HTS), compound management, chemical databases, systems biology","v2.2-usecase-blog:-how-a-biochemist-leveraged-measurement-tools-in-xr-to-streamline-protein-design","case-studies",{"id":155,"attributes":156},47,{"title":157,"content":158,"createdAt":159,"updatedAt":160,"publishedAt":161,"date":162,"description":163,"keywords":164,"slug":165,"category":66},"Nanome v2.2.0: Measure, Interact, and Analyze from Atom to Angstroms","Nanome v2.2.0 delivers powerful new features for scientists analyzing molecular structures in immersive, collaborative settings. 
Whether you're measuring atomic distances, customizing structural representations, or exploring chemical interactions with teammates, this release deepens your ability to understand complex data intuitively and efficiently.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia2.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExcmx3ejA2d3huNXk2M20xd3p0cTQ3bzJrZmtlZm1xZmZtZGY5OW90ZCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FlkpGaCdNEPqi75gkMn\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n---\n\n## The 3D Measurement Tool\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExeXZhcTdyZGM1OGR5ZGVqcDV3bnBtazR0dWRqaGF5anhvbHVlM2hociZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FoVMDxsmhfZiYtI6X2t\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nAt the heart of this release is our highly requested 3D Measurement Tool. This feature enables precise measurement of distances between atoms directly in 3D space. The intuitive interface lets you easily toggle measurement and deletion modes from the tools menu, streamlining your molecular analysis. 
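The computation behind each measurement is simple; the displayed value is the Euclidean distance between the two selected atom centers. A quick sketch with illustrative coordinates:

```python
import math

def atom_distance(a, b):
    """Straight-line distance between two atom positions, in angstroms."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Two atoms in a 3-4-5 arrangement are exactly 5.0 Å apart.
d = atom_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))  # → 5.0
```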
Expect future enhancements, including additional measurement types like dihedral angles.\n\n---\n\n## Advanced Selection Improvements\n\nCreate custom components to represent residue ranges, specific ligands, or specific chains.\n\n---\n\n## Parsing PyMOL Metadata\n\nLoad PyMOL (.pse) session files effortlessly, enhancing the import of complex scientific data.\n\n---\n\n## Polymer-Polymer Interactions\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExdTNzc2kwY3I2ZGZmMmppMTBkaXZvODNrc3FwY29kbnppdW51dWxiZiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002F2h4B1C9LsFvTkUqXS0\u002Fgiphy.gif\"\n       style=\"width:60%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nExplore chemical interactions in greater depth by examining inter- and intra-entry polymer interactions, including protein interfaces between structures.\n\n---\n\n## Notable Quality-of-Life Improvements\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia1.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExM2RtbzRwcWNrcGE5bGYxN3NzbGVuemtkOTV4M3FhczB1bzhmdHJydSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FnBgFU4uwyHmMOm6ATV\u002Fgiphy.gif\"\n       style=\"width:45%; height:auto; padding:2%;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia2.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZXAzb3pmMjRrOXpjZDFqY2lxazhhbzI0cHo5anUwZnpiNjhncjI0OCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FxUx3tsHiR582edWN0Q\u002Fgiphy.gif\"\n       style=\"width:45%; height:auto; padding:2%;\">\n\u003C\u002Fdiv>\n\n- Dynamic Menu Resolution: The menu now dynamically adjusts its resolution based on your computational demands, ensuring optimal performance even with complex molecular structures.  \n- Improved Keyboard: the keyboard now previews input text directly, drastically enhancing typing efficiency and experience in XR.  
\n- Customizable Menus: Expand or collapse the left sidebar to reveal full menu labels, especially helpful for new users. Additionally, adjust the application menu width to accommodate extensive ligand data columns, saved per user preference.  \n- Lock Button: Prevent accidental repositioning of the main menu by easily locking it in place, enhancing stability.  \n- Meta Quest Controller Integration: Instantly toggle the main menu using the left-hand controller’s hamburger button, streamlining menu access.  \n- Cursor Dot Implementation: A cursor dot is now visible at the end of your pointer ray, improving menu navigation precision, especially beneficial when using tools like the measurement feature.  \n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage_115_86d452b45c.png\"\n       style=\"width:70%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\n---\n\n## Performance Optimizations\n\nThe application now launches and runs significantly faster. Thanks to dedicated efforts from our development team, both application and workspace startups have been dramatically optimized for speed.\n\n---\n\n## Rendering and Representation\n\n- Cartoon Rendering: Nucleic acids now feature enhanced cartoon representations, providing clearer, more visually appealing structures.  \n- Atomistic Representation Optimization: Manage visual clutter and conserve resources by toggling single atomistic representations per component, preventing overlapping visuals.  \n\n---\n\n## Foundation for Hooks\n\nWe’re laying the groundwork for greater computational flexibility. Once complete, Hooks will let users substitute their own custom calculations for native computations, enabling more of the work to run in the cloud. 
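We haven't published a Hooks API yet, so as a purely conceptual sketch (every name here is hypothetical), a hook is a user-supplied function that stands in for a native computation:

```python
# Purely conceptual: a registry where a user-supplied function replaces a
# built-in computation. All names here are hypothetical, not a real API.
_hooks = {}

def register_hook(name, fn):
    """Register a custom function to override the named computation."""
    _hooks[name] = fn

def compute(name, *args, native=lambda *a: None):
    """Run the user's hook for `name` if registered, else the native code."""
    return _hooks.get(name, native)(*args)

# A team could route a calculation to their own cloud service:
register_hook("surface", lambda mol: f"cloud-surface({mol})")
result = compute("surface", "1FKG", native=lambda m: f"native-surface({m})")
# → "cloud-surface(1FKG)"
```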
This is just the beginning of a broader effort to support more advanced and customizable scientific workflows.\n\n---\n\n## Additional Improvements\n\nSeveral smaller, yet impactful updates further refine the experience:\n\n- Enhanced Spotlight\u002FFollow: Quickly synchronize orientation with the spotlight leader, ensuring everyone is aligned during collaborative sessions.  \n- Ligand Metadata (Computed & Imported) Clearly distinguish between imported and computed ligand metadata directly within the Ligands menu, providing full visibility into your metadata sources.  \n- Simplified popup management and clearer ligand detail icons.  \n\n---\n\n## Looking Ahead\n\nNanome v2.2.0 represents a significant milestone in our ongoing mission to deliver intuitive, powerful tools for scientists. A huge thanks to our development team who worked hard across design, implementation, and internal testing to bring this to life. We're excited for you to experience these enhancements and look forward to delivering even more powerful updates soon.\n\n**Ready to explore?** Request a demo at \u003Chttps:\u002F\u002Fnanome.ai\u002Fdemo\u002F>.\n\n\nHappy exploring!  \nThe Nanome Team","2025-07-09T17:37:44.465Z","2026-03-24T17:53:29.258Z","2025-07-09T17:37:44.921Z","2025-07-09","Nanome v2.2.0 delivers powerful new features for scientists analyzing molecular structures in immersive, collaborative settings. 
Whether you're measuring atomic distances, customizing structural representations, or exploring chemical interactions with teammates, this release deepens your ability to understand complex data intuitively and efficiently.\n","structural biology, computational chemistry, medicinal chemistry, drug discovery, molecular modeling, 3D molecular visualization, protein-ligand interactions, VR drug design, binding pocket analysis, structure-based drug design, ligand metadata, RMSD alignment, polymer interactions, molecular measurements, chemical informatics, atomistic modeling, PyMOL integration, hydrogen bond analysis, immersive molecular collaboration, cheminformatics, nucleic acid modeling, protein interface mapping, custom molecular components, interactive molecular analysis, scientific XR tools, pharma R&D software, virtual screening, molecular scene management, cloud-based computation, structural alignment tools","nanome-v2.2.0:-measure-interact-and-analyze-from-atom-to-angstroms",{"id":167,"attributes":168},46,{"title":169,"content":170,"createdAt":171,"updatedAt":172,"publishedAt":173,"date":174,"description":175,"keywords":176,"slug":177,"category":66},"Nanome v2.1.1 Patch Release Update: New Scene POV, SDF Metadata Support, and Rendering Improvements","# Nanome v2.1.1: Precision and Clarity Update\n\nWe just released v2.1.0 last month, and already we’re back with another update we’re excited about. While v2.1.1 is technically a patch, it includes several major improvements that we hope you'll love. This update enhances visual clarity, user experience, collaboration, and molecular visualization precision. 
Here’s a look at what’s new:\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia1.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMnczcnFoZzQ1cnY0ZG00dGd3bnZyZ3N1d2JodXJzeHByZ3hiamhpdCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FGv1LATHqcAIBoUo8KY\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n---\n\n## New: Scene Point of View\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fset_scene_pov_98f4f33ca6.jpg\" style=\"width:50%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\nOne of the most impactful changes in v2.1.1 is the ability to define a specific point of view for each scene. Users can now set their current workspace perspective as the default view when switching to that scene. This streamlines collaboration, ensures consistent context across users, and makes presentations and walkthroughs much more effective. Whether you're guiding a team through a complex binding pocket or organizing your own layout, this feature brings a new level of control. Adjust the scene and scale to your liking, then click the new Scene Point of View button in the Scenes menu to save your view for that scene.\n\n---\n\n## New: User Indicators in Scene Menu\n\nUsers can now easily track which scenes other users are in with new color‑coded indicators in the Scene menu. Hovering over a color reveals a full list of users in that scene. 
To reduce confusion, we’ve also removed 3D cursors for users who are in different scenes than your own.\n\n\u003Cdiv style=\"display: flex; justify-content: center; align-items: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscene_menu_colors_0929d4a4d5.png\" style=\"width:50%; height:auto; padding:2.5%;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscene_menu_user_list_d26dc7930c.png\" style=\"width:50%; height:auto; padding:2.5%;\">\n\u003C\u002Fdiv>\n\n---\n\n## Enhanced Molecular Visualization and Menus\n\nNanome v2 introduced a complete overhaul to rendering and UI, and we continue to build on that foundation to make molecule visualization even clearer:\n\n- **Pocket Rendering:** Improved visuals for cross‑entry components  \n- **Smooth Surfaces:** Reduced jagged edges on molecular surfaces  \n- **Surface Caching:** Faster surface toggling through smart caching  \n- **Protein Alignment:** RMSD (Root Mean Square Deviation) and alignment length added for more accurate comparisons\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002F0ZglaW5lWQBhl1pVyW\u002Fgiphy.gif\"\n       style=\"width:75%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n- **High‑Resolution Menus:** Menus are now significantly sharper  \n- **Tooltip Enhancements:** Full text now appears for truncated fields, with additional tooltips in the collaboration menu  \n- **Distinctive Avatars and Cursors:** More visually distinct, high‑contrast avatars and cursors  \n\n---\n\n## All‑New Settings Menu\n\n\u003Cdiv style=\"text-align: left;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fsettings_menu_af34c565ad.png\" style=\"width:50%; height:auto; padding:5%;\">\n\u003C\u002Fdiv>\n\nA redesigned settings menu offers new ways to customize your experience:\n\n- **Night Mode:** Dim the environment  \n\n\u003Cdiv 
style=\"text-align: left;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002FW1oH8YZy7MTX9yWufn\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n- **Rotation Control:** Optionally disable rotation when scaling  \n\n\u003Cdiv style=\"text-align: left;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002FNrOV8CrEnWQcr15fQs\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n- **Avatar Visibility:** Option to hide background avatars  \n\n\u003Cdiv style=\"text-align: left;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002Ftw3ZhrEbfGys4uoBhV\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n- **Client Preferences:** User settings now persist across sessions on your headset  \n\n---\n\n## Advanced Chemical Properties & Metadata\n\nWe’ve improved how Nanome handles and visualizes chemical data:\n\n- **SDF Metadata Import:** Directly import metadata into the Ligands Menu  \n- **Hydrogen Bonds:** Distinguish between probable and less probable bonds while focusing on protein‑ligand interactions and ignoring intramolecular bonds  \n- **Chemical Precision:** Property values are now rounded to 3‑4 decimals for clarity  \n\n---\n\n## Connectivity & Stability\n\nThis release also improves reliability and robustness:\n\n- **Workspace API Errors:** Better error feedback when issues arise  \n- **Hydrogen Bond Rendering:** Improved accuracy by ignoring intra‑component bonds in cross‑entry scenarios  \n\n---\n\n## Small Fixes & Improvements\n\n- Environmental Visuals: Removed purple ambient lighting for a cleaner look  \n- Controller Color Update: Controllers are now sleek black  \n- Clicking on ligands no longer auto‑zooms your view  \n- Removed unused 3D surface highlight interactions  \n- Backend support for rainbow coloring (available via the web companion app)  \n- Increased Scene Scale: More 
flexibility in scene sizing  \n- Workspace Handling: Log back into your last‑used workspace instead of the default  \n- Workspace Controls: Easily delete the current workspace or scene  \n- Audible Notifications: Get alerts for new join requests  \n- Workspace Sharing: Fixed duplication issues when sharing  \n- Connection Safety: Older clients are now blocked from joining to avoid workspace corruption  \n\n---\n\nWhile this may be labeled a patch, the number of improvements makes it feel like a much larger release. Even bigger features are right around the corner. In the meantime, let us know what you think. If you’re interested in trying v2, reach out at \u003Chttps:\u002F\u002Fnanome.ai\u002Fdemo\u002F> and we’ll be happy to get you started.","2025-05-23T20:14:08.940Z","2026-03-24T17:53:28.334Z","2025-05-23T20:31:08.521Z","2025-05-23","Nanome v2.1.1 delivers major upgrades to molecular visualization, scene management, and collaborative workflows. Set scene-specific perspectives, enjoy smoother rendering, and experience an improved user interface across the board.\n","Nanome, molecular visualization, VR chemistry, scene POV, protein alignment, hydrogen bonds, ligand metadata, scientific collaboration, workspace management, structural biology, cheminformatics, virtual reality drug design, computational chemistry, scientific software, VR molecule viewer","nanome-v2.1.1-patch-release-update:-new-scene-pov-sdf-metadata-support-and-rendering-improvements",{"id":179,"attributes":180},45,{"title":181,"content":182,"createdAt":183,"updatedAt":184,"publishedAt":185,"date":186,"description":187,"keywords":188,"slug":189,"category":53},"View Your OpenEye\u002FCadence Data with Fresh Eyes using Nanome XR","**Powerful New Integration: Nanome XR + OpenEye Tools for Drug Discovery**  \nNanome and Cadence have an exciting new integration that allows direct access to results calculated by the OpenEye platform directly into Nanome XR. 
Our integration with the OpenEye toolset is just beginning, and we are proud to announce tools for molecular search (FastROCS), molecular dynamics simulations and other trajectories, and cryptic pocket evaluation. \n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia4.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExa3cyOGZ3Y3g4YXI0bDhzbzQwd3phcHh6Zmp1Nmkxc2I0b3MyYnF0MSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002F7w2Z3qu2um73LvlPnX\u002Fgiphy.gif\"\n       style=\"width:75%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n\n**Tools Currently Available to Work within Nanome XR and MARA**\n\n* FastROCS  \n  * Conduct shape-based searches at exceptional speed  \n  * This speed enables you to perform highly accurate 3D shape similarity calculations on millions of molecules within seconds  \n    * Offering a significant advantage over traditional 2D approaches  \n  * Iterate on interesting molecular search results directly in Nanome XR, then launch new queries from molecules drawn directly in a protein pocket  \n* Molecular Dynamics Simulations and other Trajectories  \n  * MARA can now parse .h5 files created by Orion  \n  * This allows users to download their data from Orion, then convert it to .PDB and .XTC (using natural language or MARA workflows) to be parsed by Nanome XR.  \n  * MARA further allows users to:  \n    * Remove waters  \n    * Center the trajectory  \n    * Export specific frames as PDB  \n    * Truncate a trajectory to a specific range of frames  \n    * Easily return information about the trajectory  \n* Cryptic pocket evaluation  \n  * OpenEye's Cryptic Pocket Detection tools empower users to thoroughly explore a protein's conformational space, potentially revealing one or more cryptic pockets.  
\n  * Readily locate druggable sites that are rarely or never observed in experimental structures  \n  * Find novel opportunities for modulating the activity of a target protein  \n  * Aid in crafting isoform-selective ligands  \n  * Cryptic pockets are located using state-of-the-art enhanced sampling molecular dynamics simulations  \n  * Plugins are available to load .OEDU files directly into Nanome to highlight the design units that encompass cryptic pockets.\n\n**How can this help our team innovate?**  \nScientific innovations require a thorough assessment of target protein scaffolds and the small molecules that bind therein. By coupling the computational toolsets provided by OpenEye with collaborative three-dimensional visualization in Nanome, teams gain a new perspective on target molecules and their high-quality hits, enabling expedited forward progress. Bringing computational, medicinal, and synthetic chemists together with structural biologists, biochemists, and protein engineers in a shared three-dimensional environment creates the opportunity to see the unseen, predict better targets, and bring products to market while spending less time doing so.  \n\n**How can I get my hands on these new integrations?**  \nDo you already have access to the OpenEye tools?\n\n* Contact [sales@nanome.ai](mailto:sales@nanome.ai) or [use this calendar link](https:\u002F\u002Fmeetings.hubspot.com\u002Fjlaureanti) to directly schedule an XR or MARA demo with the Nanome team at a time that works best for you and your team\\!\n\n**Written by:**  \nJoseph A. 
Laureanti, PhD and Edgardo Leija from Nanome ","2025-05-16T22:13:13.178Z","2026-03-24T17:53:29.077Z","2025-05-16T22:13:14.177Z","2025-05-16","Explore Nanome’s new integration with Cadence’s OpenEye platform, featuring FastROCS molecular search, molecular dynamics trajectory analysis, and cryptic pocket detection—all directly accessible in Nanome XR and MARA to accelerate drug discovery and collaborative molecular design.\n","Nanome,OpenEye,Cadence,XR,MARA,drug discovery,molecular dynamics,FastROCS,cryptic pocket detection,protein modeling,medicinal chemistry,computational chemistry,3D visualization,scientific collaboration,Orion platform,pharmaceutical research","view-your-openeyecadence-data-with-fresh-eyes-using-nanome-xr",{"id":191,"attributes":192},44,{"title":193,"content":194,"createdAt":195,"updatedAt":196,"publishedAt":197,"date":198,"description":199,"keywords":200,"slug":201,"category":66},"Nanome v2.1.0 Release Update: New Workspace API, Interactions, Alignment, Ligands Table, and more!","We are excited to announce the release of Nanome AI v2.1.0. This release brings a host of improvements and new features designed to streamline workflows between NanomeAI’s companion web portal, display interactions between molecular structures, and an enhanced experience for ligand evaluation. \n\n### Consolidated Workspace API:\nOur redesigned API consolidates access to the database for both XR and MARA. 
This integration ensures consistent data access across the web and XR interfaces, so that further workspace preparation or real-time changes can happen on the web before or during an XR session.\n\u003Cdiv style=\"display: flex; justify-content: flex-start; text-align: left;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002F1NJzUhibZBqSoEyVKb\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002FytW5Gvm3T7ihpHWL0t\u002Fgiphy.gif\"\n       style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n### Multi-Structure Interactions:\n\u003Cdiv style=\"float: left; width:50%; margin: 0 1em 1em 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002FDhYQR48SrVW8eKXPFC\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:0;\">\n\u003C\u002Fdiv>\nNew cross-entry components enable interactions between separate entries, alongside features such as Protein Alignment and a dedicated Ligands Menu. These additions are engineered to make protein exploration more intuitive and effective. We’ve smoothed out protein grabbing, rotating, and scaling while also implementing several smaller UX improvements.  \n\u003Cbr style=\"clear: both;\">\n\n### Protein Alignment:\n\u003Cdiv style=\"float: left; width:50%; margin: 0 1em 1em 0;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia.giphy.com\u002Fmedia\u002FCPHgZ7mpO0jPRgdevq\u002Fgiphy.gif\"\n       style=\"width:100%; height:auto; padding:0;\">\n\u003C\u002Fdiv>\nExperience protein alignment like never before. In our immersive VR environment, aligning proteins is as intuitive as manipulating physical objects—simply grasp, rotate, and position with natural 3D gestures, or leverage our integration with the CEAlign algorithm.  
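Behind alignment features like these, the core computation is a least-squares superposition that reports an RMSD over matched atom pairs. Below is a minimal sketch of that step using the Kabsch algorithm with synthetic NumPy coordinates; it is illustrative only, not Nanome's or CEAlign's actual implementation (which also solves the harder problem of choosing which atoms to pair):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Optimally superimpose P onto Q (both (N, 3) arrays of matched
    atom coordinates) via the Kabsch algorithm and return the RMSD."""
    # Center both coordinate sets on their centroids
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    # Optimal rotation from the SVD of the 3x3 covariance matrix
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    P_rot = P @ R                           # rotated, centered copy of P
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

# A rigidly rotated copy of a point cloud aligns back to ~0 RMSD
rng = np.random.default_rng(0)
P = rng.normal(size=(20, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T
print(round(kabsch_rmsd(P, Q), 6))
```

The SVD does the heavy lifting; the reported RMSD and the number of aligned pairs are the same quantities surfaced in the alignment readout.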
\n\u003Cbr style=\"clear: both;\">\n\n### Ligands Table Menu:\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fembeddable_aa697efc_2597_4ce6_acba_e9ef2e6bb361_c17e188b16.png\"\n       style=\"width:75%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\nExplore our new Ligands Table menu in XR, which lets you effortlessly view and rate ligands while accessing detailed, RDKit-calculated properties—all within an immersive 3D environment. It's designed to make the evaluation and comparison of potential compounds both intuitive and engaging.\n\n\u003Cdiv style=\"clear: both; height: 2em;\">\u003C\u002Fdiv>\n\n\n### Other Improvements:\nAdditional updates include faded avatars and cursors for users in different scenes, an automatic workspace load when clicking “Join” on an organization member list item (for users with existing access), and an increased minimum scene scale to avoid unreasonably small protein displays. A floating prompt now shows which user you are currently following, adding clarity during collaborative sessions.\n\nWe've also improved the XR platform to work more smoothly, so there's less lag or delay when switching between views, scenes, or representations. Changes made in either XR or the web portal are now instantly shared, keeping everyone perfectly in sync. We’ve polished the app to ensure smoother logins and better workspace permission handling. Working with other members of your organization is easier than ever: you can join your team's workspace in a single step, so collaborative sessions kick off more seamlessly. \n\nWe invite you to experience these updates firsthand. 
Provide your [details here](https:\u002F\u002Fnanome.ai\u002Fdemo\u002F) to gain early access and discover the future of collaborative protein visualization and interaction.","2025-04-29T20:33:08.886Z","2026-03-24T17:53:28.823Z","2025-04-29T20:33:10.253Z","2025-04-29","Discover Nanome AI v2.1.0, featuring a unified Workspace API for seamless XR-web data sync, intuitive multi-structure interactions, advanced protein alignment, and immersive ligand evaluation.\n","Nanome AI v2.1.0, Workspace API, XR-web integration, MARA platform, molecular visualization, multi-structure interactions, protein alignment, ligand evaluation, immersive VR, RDKit properties, collaborative protein visualization, VR protein tools","nanome-v2.1.0-release-update:-new-workspace-api-interactions-alignment-ligands-table-and-more!",{"id":203,"attributes":204},43,{"title":205,"content":206,"createdAt":207,"updatedAt":208,"publishedAt":209,"date":210,"description":211,"keywords":212,"slug":213,"category":53},"LiveDesign Live Reports — Literally at Your Fingertips","## Take Your Collaboration to the Next Level\n\nIf your team is already using LiveDesign, you’re operating at a high level of scientific collaboration and design iteration. But what if you could take that collaboration and insight even further?\n\n**Nanome XR** helps your team reach maximum efficiency and innovation by bringing molecular insights into immersive 3D environments—transforming how you interpret and act on LiveDesign data.\n\n---\n\n## Why Bring LiveDesign Live Reports into Virtual or Mixed Reality?\n\nLiveReports in LiveDesign are powerful: real-time updates, deep metadata tracking, and rich visualizations of your drug design process.\n\nBut viewing these insights on a 2D screen can limit the spatial understanding required for truly innovative decisions. 
Visualizing Live Reports in **Nanome XR** lets your team:\n\n- **Experience molecular data spatially**  \n  Complex structures and relationships become instantly clearer when viewed in 3D—walk around molecules, scale them, or explore binding pockets from the inside out.\n\n- **Collaborate across geographies**  \n  Remote teams can share a virtual workspace, manipulate models together, and annotate in real time—as if in the same lab.\n\n- **Accelerate decision-making**  \n  Interact with SAR data, propose edits, and make changes in real time, moving from analysis to action faster than ever.\n\n- **Engage stakeholders more effectively**  \n  VR makes insights accessible to leadership, investors, and partners—bridging communication gaps and creating lasting impact.\n\n---\n\n## How to View LiveDesign Results in Nanome XR\n\nNanome XR offers two seamless ways to bring your LiveDesign data into immersive environments:\n\n### 1. Web Browser within Nanome XR\n\nJust log in to your LiveDesign account from Nanome XR’s in-app browser. Download your results and bring them into your virtual workspace instantly—no extra setup needed.\n\n\u003Ciframe width=\"560\" height=\"315\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002F9jopZy-n-yQ?si=Oekmp9d9F0R_xDAJ\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n\n### 2. 
Integrated Plugin via Stacks\n\n- Connect your LiveDesign account  \n- Browse and import relevant design data  \n- Push edits or proposals from XR directly back into LiveDesign\n\n\u003Ciframe width=\"560\" height=\"315\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002F6VNiyzkXwJw?si=unZ7DaJxi24y3mki\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n---\n\nWhether you're brainstorming early discoveries or refining final candidates, **LiveDesign + Nanome XR** turns your data into a shared, spatial experience—driving faster innovation and deeper understanding.\n\n---\n\n## Want to Hold Your Team's LiveReports in Your Hands?\n\nReach out to us:  \n📧 [sales@nanome.ai](mailto:sales@nanome.ai)  \n📅 [Book a demo](https:\u002F\u002Fmeetings.hubspot.com\u002Fjlaureanti)\n\n---\n\n**Written by:**  \nJoseph A. Laureanti, PhD  \nEdgardo Leija\n","2025-04-08T21:14:21.408Z","2026-03-24T17:53:28.552Z","2025-04-08T21:14:22.192Z","2025-04-08","Discover how Nanome XR enhances LiveDesign Live Reports by transforming 2D molecular insights into immersive 3D collaboration. 
Boost scientific innovation, spatial understanding, and remote teamwork in drug discovery.\n","LiveDesign,Nanome XR,virtual reality,drug discovery,scientific collaboration,3D molecular visualization,pharma R&D,spatial computing,VR collaboration,mixed reality,LiveReports,biotech innovation,remote scientific teams","livedesign-live-reports-literally-at-your-fingertips",{"id":215,"attributes":216},42,{"title":217,"content":218,"createdAt":219,"updatedAt":220,"publishedAt":221,"date":222,"description":223,"keywords":224,"slug":225,"category":66},"MARA’s Updated Workflow Processes Provide Huge Time Savings!","**What's New?**  \nPreviously, MARA took user inputs, automatically generated plans, and executed them immediately. While effective, this method sometimes struggled when multiple tools existed for similar tasks. This led to scenarios where the generated plan technically worked but didn't align precisely with the user's specific objectives.\n\nNow, MARA's enhanced workflow planning offers users greater flexibility, visibility, and control:\n\n**Step-by-Step Plan Generation**: MARA takes user prompts and clearly outlines a proposed workflow, step-by-step, without immediate execution. Users can now review, adjust, and confirm each step.\n\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExY3ZmNWNjeWJubHQxbXNzOWUza2gyMWJxYnIwOHJ5ZWgxdmZiNTdqaCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002F8vcUjv1K6IaI8em5Yt\u002Fgiphy.gif\" style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n**Interactive Workflow Revision**: Users can easily revise workflows using natural language, rearrange steps, and clearly visualize tool dependencies and outputs at every stage. 
MARA retains all necessary context, ensuring smooth transitions between tools and steps.\n\u003Cdiv style=\"white-space: nowrap;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia2.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExemV3bndkNjlmbml0aHkzcm8ycjVsMm85OTAxcjRtaHZkcDZvcDV1dSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FXHGXTvqonjlW0dYJ2d\u002Fgiphy.gif\" style=\"width:50%; height:auto; padding:1%; display:inline-block;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia2.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMzAzamlhbmV4NTNiMTQxMGM5NnYwNXM5ZGg2cTVraXFsYXZpNGp3byZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FFCrgwk2bkYLadSy8b1\u002Fgiphy.gif\" style=\"width:50%; height:auto; padding:1%; display:inline-block;\">\n\u003C\u002Fdiv>\n\n**Save, Run, and Share Workflows**: Workflows can now be saved at any stage, either before initial execution or after successful completion. Once saved, these workflows become accessible organization-wide, facilitating easy duplication and customization. This allows anyone in the organization to then reuse and amend a workflow to their specific needs for a similar project.   
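The plan-then-approve flow described above can be modeled in a few lines. This is an illustrative sketch of the human-in-the-loop pattern only, not MARA's actual API; the `Workflow` and `Step` names and the example steps are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    run: callable            # takes the shared context dict, returns updates

@dataclass
class Workflow:
    steps: list = field(default_factory=list)
    approved: bool = False   # explicit user sign-off gates execution

    def plan(self):
        # Show the proposed steps without executing anything
        return [f"{i + 1}. {s.name}" for i, s in enumerate(self.steps)]

    def reorder(self, order):
        # e.g. reorder([1, 0, 2]) swaps the first two steps
        self.steps = [self.steps[i] for i in order]

    def execute(self, context=None):
        if not self.approved:
            raise RuntimeError("plan must be reviewed and approved first")
        context = dict(context or {})
        for step in self.steps:
            context.update(step.run(context))  # each step sees prior outputs
        return context

wf = Workflow(steps=[
    Step("fetch structure", lambda ctx: {"pdb": "1ABC"}),
    Step("prepare ligand",  lambda ctx: {"ligand": "ready"}),
    Step("dock",            lambda ctx: {"score": -8.2, "target": ctx["pdb"]}),
])
print(wf.plan())          # review the proposed plan...
wf.approved = True        # ...then explicitly confirm
result = wf.execute()
print(result["target"])   # the docking step saw the fetched structure
```

The key design point is that planning, revising, and executing are separate operations, so a saved plan can be reviewed and reordered any number of times before anything runs.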
\n\u003Cdiv style=\"text-align: center;\">\n  \u003Cimg src=\"https:\u002F\u002Fmedia0.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExNXAxZXQ2MWp2Mzhwenpic241Znd3cDBud3ZmaG0xeWJmcDNuY2N6OSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FSTthjeBvHXoWuEJe53\u002Fgiphy.gif\" style=\"width:50%; height:auto; padding:1%;\">\n\u003C\u002Fdiv>\n\n\n## Example Use Case \nBelow is a short video demonstrating a typical docking experiment workflow:\n\n* Planning a complete docking experiment  \n* Reviewing and revising the initial plan  \n* Adjusting steps by rearranging tools  \n* Executing and saving the finalized workflow  \n* Adding a final docking evaluation step to ensure desired outcomes \n\u003Cdiv style=\"text-align: center;\">\n  \u003Ciframe width=\"560\" height=\"315\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002F5_dtowWhdwU?si=ECF4JYl1rwJ_crFe\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n\u003C\u002Fdiv>\n\n\n## How the New Process Saves You Time:  \nThe updated MARA workflow significantly reduces inefficiencies associated with:\n\n* Overcoming limitations of LLMs in understanding high-level scientific workflows  \n* Manually constructing workflows step-by-step\n\nMARA’s new approach ensures consistent, accurate, and efficient workflows—eliminating potential \"hallucinations\" or unintended changes over time. 
Your team can trust workflows to remain reliable day after day.\n\nAdditionally, MARA streamlines handling batch inputs and outputs between tools, further accelerating daily operations.\n\n**Want to optimize your team's productivity?**\n\nContact [sales@nanome.ai](mailto:sales@nanome.ai) to schedule a demo or [use this calendar link](https:\u002F\u002Fmeetings.hubspot.com\u002Fjlaureanti) to directly schedule a meeting at a time that works best for you and your team\\!\n\nWritten by:  \nJoseph A. Laureanti, PhD and Edgardo Leija","2025-04-01T23:43:43.565Z","2026-03-24T17:53:28.293Z","2025-04-01T23:43:50.493Z","2025-04-01","Discover how MARA’s updated workflow system dramatically improves scientific productivity. With step-by-step planning, natural language editing, and reusable workflows, teams save time and gain precise control over complex computational tasks.\n","MARA, workflow automation, scientific workflows, computational biology, AI in drug discovery, molecular design software, workflow revision, workflow planning, LLM tools, natural language workflows, save workflows, share workflows, scientific productivity, docking experiments, batch processing tools, step-by-step execution, tool integration, lab automation, biotech software, pharma informatics","mara's-updated-workflow-processes-provide-huge-time-savings!",{"id":227,"attributes":228},41,{"title":229,"content":230,"createdAt":231,"updatedAt":232,"publishedAt":233,"date":234,"description":235,"keywords":236,"slug":237,"category":66},"Nanome AI February 2025 Update!","# Device Code Login, Chat File Sharing & Human-in-the-Loop Planning\n\nIn case you didn't catch it, be sure to check out the full [Nanome v2 walkthrough video ](https:\u002F\u002Fyoutu.be\u002F8dcJ8K0-MxY?si=diZsBjFsFJF2-wpM)where we dive into what it's really like to use Nanome v2—from seeing how colleagues appear around a structure, to exploring our brand new menu system and much more.\n\nWe're excited to announce that last week, we released 
**XR v2.0.11** along with **MARA v0.16.1**. Together, these make up the Nanome AI February 2025 update, bringing a host of new features and enhancements designed to streamline collaboration and sharpen your structure-based drug discovery workflow.\n\n---\n\n## **XR v2.0.11 Highlights**\n\n### **Enhanced Visualization for Ligand Evaluation**\n\nOur new **Residue Labels** representation gives you immediate clarity on amino acid residues. This detail is critical when evaluating ligand binding, allowing your team to quickly pinpoint key interaction sites and streamline candidate assessment.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_2b0612416d.jpg\" style=\"width:100%; height:auto; padding:5%;\">\n\n\n### **Seamless Collaboration**\n\n**Request to Join Your Colleague’s Workspace**  \nConnecting with teammates has never been easier. You can now instantly request to join an online colleague’s workspace—eliminating the hassle of searching for the right room or fumbling with codes. This ensures that everyone is on the same page when evaluating complex structures.\n\n### **Other XR Enhancements**\n\n* **Color Only Carbons for Ligands:** Simplify ligand visualization by highlighting only the carbons.  \n* **Rapidly Set Up a Collaborative Workspace from Your Default Workspace:** Get your team together in seconds.  \n  * Users can now add colleagues with one click while in the default workspace, and Nanome will create a shareable duplicate, making it easy to start collaborating right away. \n\n\n### **Streamlined Access with Device Code Login**\n\n**Device Code Login Support**  \nFor SSO-configured accounts, we’re introducing a device code login that replaces traditional password entry. 
Now, simply type the 4-digit code you see in XR on the MARA\u002Fweb portal to gain access—making your sign-in process both faster and more secure.\n\n\n### **Small Fixes & Tweaks**\n\n* Accurate App Info UI  \n* Fixed issues with joining rooms, custom atomistic representations, voice transmission, and SDF loading  \n* Minor adjustments including reduced keyboard click volume, better ring search, and resolution tweaks\n\n## **MARA Web v0.16.1: Key Features & Improvements**\n\n### **Human-in-the-loop Planner**\n\nThe flagship feature of this release is Human-in-the-loop Planning. When a user prompts MARA, it checks whether an existing Workflow applies or creates one dynamically. Instead of running the plan automatically, MARA presents an interactive form where you can review the plan, override values, ask for revisions to specific parts, or evaluate the plan more deeply. The user then explicitly runs the workflow.  \nThis changes MARA's overall UX considerably, since user input is now required before tools execute. For very simple prompts, like downloading a PDB file (1-2 steps max), MARA proceeds without showing a plan or awaiting approval.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_307db466ac.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n\n### **Saving Plans as Workflows**\n\nUsers can now save a Plan as a Workflow, either before it runs or after it has completed. That way, in future MARA chats, the workflow can be found and invoked during the pre-planning stage.\n\n### **Secrets & API Key Manager**\n\nSecurely manage your integration keys and sensitive data without leaving MARA. This helps teams follow security best practices when extending Nanome to other systems or services.\n\nSecrets are used to store sensitive information like API keys and passwords, and can be used by multiple tools. 
Secrets can be created for your organization or for yourself.\n\nThere is now a Secrets tab in the Tools Page. Here, Secrets can be created and managed by Administrators\u002FEditors.\n\nTools that require a Secret without a set value are disabled and appear under the expanded key in the Secrets tab.   \nOut of the box, MARA includes tools that are third-party integrations requiring API keys, such as NVIDIA’s BioNeMo drug discovery informatics tools. \n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_3dfc971719.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_f3f3e91cf2.png\" style=\"width:100%; height:auto; padding:5%;\">\n\nWhen creating tools, there is now a section where Secrets can be selected or newly created.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage5_2ccadbbd7b.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n### **Add & Search Chat Files to Workspaces**\n\nYou can now attach files directly from a chat to any workspace—and vice versa—so the structures you discuss in MARA are instantly available in XR. By **searching recent chat files** and adding them to a workspace, teams can rapidly iterate on the latest lead series without juggling multiple tabs or tools.\n\nThis can be done in a few ways:\n\n1. Import Chat Files from a Workspace page  \n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage6_c7eef7a48c.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n2. Import Session Files to create a new workspace from the Workspace List  \n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage7_666dd6f593.png\" style=\"width:100%; height:auto; padding:5%;\">\n3. 
Add Files to a Chat from another Chat  \n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage8_af5cc4b34d.png\" style=\"width:100%; height:auto; padding:5%;\">\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage9_ca2228c073.png\" style=\"width:100%; height:auto; padding:5%;\">\n4. Add a File from Chat Content directly to a workspace\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage10_532f904511.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n### **Share a Join Workspace URL**\n\nSimply click “Share Workspace” to generate a link that colleagues can use to queue up a VR join. It’s a frictionless way to keep everyone aligned when fine-tuning ligand design or analyzing binding poses.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage11_5bcf0e3719.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage12_fa5aac38d8.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n\n\n### **Adjust Workspace Permissions**\n\nFrom the web, set a workspace’s access level (Private, Viewer, Editor) and preview the join code. This extra control helps you ensure the right contributors have the right access, whether you’re dealing with sensitive structural data or open internal projects.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage13_40e8278b36.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n\n**Python Tool Creation Input Autofill**  \nIn the Python Tool Creation process, when you add new secrets or input arguments, the corresponding input variables are automatically populated in the Python code.\n\nThe **Nanome AI February 2025 Update** makes it easier than ever to **collaborate, visualize, and secure** your structure-based drug discovery workflows. 
Whether you’re streamlining how teams join workspaces, improving molecular representations, or making MARA work smarter with chat-driven file sharing, these enhancements are all about helping you **accelerate discovery and showcase candidates with more clarity.**\n\nWe’d love to hear what you think—try out the new features and let us know how they impact your workflow\\!\n\n🚀 **Get started with Nanome v2.0.11 and MARA v0.16.1 today\\!**\nhttps:\u002F\u002Fnanome.ai\u002Fdemo\u002F\n","2025-02-14T23:39:44.822Z","2026-03-24T17:53:28.854Z","2025-02-15T00:02:41.116Z","2025-02-14","Discover the latest Nanome AI February 2025 Update with XR v2.0.11 and MARA v0.16.1—enhancing collaboration, visualization, and security for structure-based drug discovery. Explore new features like Residue Labels, seamless workspace joining, device code login, Human-in-the-loop Planning, and chat-driven file sharing. Streamline molecular analysis and accelerate discovery with improved tools for ligand evaluation, workflow automation, and secure API management.\n","Nanome AI, XR v2.0.11, MARA v0.16.1, drug discovery, structure-based drug design, molecular visualization, ligand evaluation, VR collaboration, device code login, workspace sharing, AI-driven workflows, MARA planner, human-in-the-loop AI, Python tool creation, API key management, biotech software, VR for pharma, scientific informatics, molecular modeling, virtual reality drug discovery","nanome-ai-february-2025-update!",{"id":239,"attributes":240},40,{"title":241,"content":242,"createdAt":243,"updatedAt":244,"publishedAt":245,"date":246,"description":247,"keywords":248,"slug":249,"category":91},"How Nanome’s agentic platform can enhance and integrate with your docking and structural visualization workflows","# Elevating Molecular Visualization with MARA and Nanome 2.0\n\nIn the competitive landscape of molecular visualization and structure-based design, efficiency and clarity are non-negotiable. 
Tools like PyMOL and Maestro have long been cornerstones in the computational chemistry, medicinal chemistry, and protein engineering communities. However, integrating MARA and Nanome 2.0 into your workflows can unlock a new level of productivity, interactivity, and communication while preserving the visual fidelity and data integrity from these widely used platforms.\n\nSwitching tools often brings concerns about compatibility, data fidelity, and learning curves. MARA and Nanome 2.0 address these concerns directly, ensuring that your docking results, multi-chain PDB files, and custom color schemes remain intact during the transition while maintaining an intuitive interface. Whether you’re analyzing multiple ligand poses from docking simulations or working with multi-chain complexes, these platforms carry over essential visual elements like atom and chain colors, annotations, and rendered surfaces. This seamless integration allows researchers to focus on their scientific insights rather than amending and troubleshooting representations.\n\nWe’re diving straight into how MARA and Nanome can effortlessly integrate with existing workflows. Imagine this: conducting docking experiments powered by MARA, performing quick visual analyses and setting your color schemes in PyMOL, and then transitioning into virtual reality with Nanome for an immersive understanding of the results. You can see the process from beginning to end [here](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=xCiS5wPh8Wk). 
While this entire workflow could be completed within MARA and Nanome 2.0 alone, the focus here is on how these tools enhance and complement the established processes already familiar to your laboratory.\n\n\u003Cdiv style=\"text-align: center;\">\n    \u003Ciframe width=\"800\" height=\"400\" src=\"https:\u002F\u002Fwww.youtube.com\u002Fembed\u002FxCiS5wPh8Wk?si=sMPZYnlhSQWYh0YI\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen>\u003C\u002Fiframe>\n\u003C\u002Fdiv>\n\n---\n\n## Docking a Ligand in MARA\n\nTo generate docking data we will use MARA to dock a small molecule at a protein of interest. Simply asking MARA to prepare a protein file for docking and to find pockets is all that is needed to set up the protein for docking. A small molecule can then be easily docked at the protein by requesting to dock at a specific pocket.\n\n\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExdWJwMzF6a3VsYTZvOW1hZzlyam9odjQ3NWU0MnBkN3I3ZWd6cGhlOSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FcaI7GWARWxRAI1Cn5h\u002Fgiphy.gif\" style=\"width:100%; height:auto; padding:5%;\">\n\n---\n\n## Exporting from PyMOL\n\nAfter setting up the visualization state of the docking results in PyMOL, simply save the system as a `.pse` file in a directory of your choice.\n\n\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZ3J2ajQwc3R1bXFpdWRhOXB3cTRmeWYwZGJnenRib3FzM2d6b3RrMCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FfBYLzTryH8Zqn8LkJs\u002Fgiphy.gif\" style=\"width:100%; height:auto; padding:5%;\">\n\n---\n\n## Loading a `.pse` in MARA\n\nMARA makes it easy to import data from PyMOL and begin using tools in MARA on a system that has been set up in PyMOL. Users only need to drag and drop their `.pse` file directly into a MARA chat to get started. 
Once imported, the full potential of MARA tools is available to continue scientific investigations.\n\n---\n\n## Sending a Structure from MARA to Nanome 2.0\n\nOne simple click is all that is needed to move a molecular representation from the MARA web interface to an investigation in virtual reality. *Pro tip:* MARA can be accessed through the web browser native to the Meta Quest, thus allowing work saved in MARA to be sent to Nanome without ever leaving virtual reality!\n\n\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExeHJvdHJ4ZzlrNGl0d3l0bjEyNnJsOW9kMXp3cXU2cmtnNWY1dmZlZCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FrcVfCPdSoQRaepkihe\u002Fgiphy.gif\" style=\"width:100%; height:auto; padding:5%;\">\n\n---\n\n## Elevating Your Workflows with Nanome AI (MARA + Nanome 2.0)\n\nThe transition to MARA and Nanome 2.0 doesn’t mean abandoning established workflows—it means augmenting and empowering them. By keeping your PyMOL and Maestro representations intact and extending capabilities with cutting-edge immersive tools, you gain the freedom to focus on innovation rather than process. The ability to explore, design, and communicate molecular insights in a truly intuitive and collaborative environment is a game-changer. Check out the full video [here](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=xCiS5wPh8Wk). We look forward to seeing you in Nanome!","2024-12-20T22:43:22.948Z","2026-03-24T17:53:29.021Z","2024-12-20T22:44:51.999Z","2024-12-20","Discover Nanome, the ultimate interface for molecular design, computational chemistry, and drug discovery. 
Explore how spatial computing, AI, and VR revolutionize the way scientists interact with molecules to accelerate breakthroughs in biopharma and materials science.\n","Nanome, drug discovery, computational chemistry, molecular design, drug hunter, spatial computing, virtual reality, VR for scientists, chem molecules, pharma innovation, drug development, AI in drug discovery, chem informatics, biopharma, structure-based drug design, materials science, molecular visualization, AI agents, drug discovery tools, cheminformatics, molecule design software","how-nanome's-agentic-platform-can-enhance-and-integrate-with-your-docking-and-structural-visualization-workflows",{"id":251,"attributes":252},39,{"title":253,"content":254,"createdAt":255,"updatedAt":256,"publishedAt":257,"date":258,"description":259,"keywords":260,"slug":261,"category":104},"Looking Back at 2024 and What’s Next for Nanome in 2025","# Reflecting on Another Year at Nanome\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_fcc9afca7c.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_fcc9afca7c.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nAs we close out another year at Nanome, we’ve been reflecting on just how far we’ve come and what an incredible journey it’s been. From new plugins and patches, the emergence of Apple’s Vision Pro, to launching Nanome AI, Nanome 2.0 & MARA, this year brought both milestones and glimpses into the future of scientific discovery.\n\n## Starting Strong with New Plugins\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_95a67a778a.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_95a67a778a.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nWe kicked off the year by expanding our platform’s versatility through new plugins like the **Scene Viewer** and the **Jupyter Cookbook**. 
The Scene Viewer simplified how our users curate and present molecular narratives, while the Jupyter Cookbook Plugin blended Python’s adaptability with the immersive power of XR. Scientists could now open multiple Python notebooks right inside Nanome, share algorithms with colleagues, and integrate the computational tools they care about—all within virtual and mixed reality environments.\n\n## A Sneak Peek into the Future: Apple Vision Pro\n\n\u003Cdiv style=\"overflow: hidden;\">\n  \u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExbng3eTJjcHdta2d3aWltdDFzMDJkbHFxMGVjNnpnMzk3b2h1ZmhlYSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FTUHmoWyW1s3rxlAC4g\u002Fgiphy.gif\">\n    \u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExbng3eTJjcHdta2d3aWltdDFzMDJkbHFxMGVjNnpnMzk3b2h1ZmhlYSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FTUHmoWyW1s3rxlAC4g\u002Fgiphy.gif\" \n         style=\"max-width: 150%; height: auto; float: left; margin: 0 15px 10px 0;\">\n  \u003C\u002Fa>\n  \u003Cp>\nWe were thrilled about Apple unveiling the Vision Pro. From day one, we saw how its eye, hand, and voice-based interfaces could transform immersion and usability. We built prototypes like NanoSpin—a delightful prototype that showcased molecules spinning in bounded volumes—and even ported our classic Calcflow to Vision Pro, reimagining 3D math visualization for this new era. The Vision Pro demonstrated that true spatial computing is finally here, and we’re proud to have taken those first steps on the platform.\n  \u003C\u002Fp>\n\u003C\u002Fdiv>\n\n\n\n## Refining Collaboration and User Experiences\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_d142cb90bd.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\nMid-year, we rolled out patch **1.24.6**, focusing on one of the most essential aspects of scientific work: collaboration. 
This update offered simpler room setups, better participant controls, and more seamless meeting flows, making it easier for teams around the world to walk into a virtual space and get right to work. Throughout these improvements, we learned that small changes to workflows can have a huge impact on productivity and creativity.\n\n## MARA: Our Scientific Copilot Emerges\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F2_be83909738.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nThe public beta launch of **MARA (Molecular Analysis and Reasoning Assistant)** was another exciting milestone. Serving as an AI-driven scientific coworker, MARA made it easier to analyze molecules, run specialized tools, and create custom workflows—all with just a few prompts. This wasn’t about adding just another feature; it was about rethinking what’s possible when AI and XR unite to streamline scientific innovation.\n\n## A Leap Forward: Nanome AI and Nanome 2.0\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMzg4OGwzY3RuazJ2ZmhqbmdtNjVzemFlZWgyNHp0MWIzdDZkaXBhaiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FVLdKwmbuf2HO0thI9P\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMzg4OGwzY3RuazJ2ZmhqbmdtNjVzemFlZWgyNHp0MWIzdDZkaXBhaiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FVLdKwmbuf2HO0thI9P\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nToward the end of the year, we introduced **Nanome AI Early Access** and **Nanome 2.0**—a true culmination of nearly a decade of vision. Nanome 2.0 gave us a reimagined platform built from the ground up for intuitive collaboration, robust data handling, and richer visuals. 
Paired with the AI-driven magic of MARA, we’re now closer than ever to delivering an interface that feels like a real-life “JARVIS” for scientists, helping them tackle complex challenges and spark fresh breakthroughs. It’s the strongest foundation we’ve ever built and the perfect launchpad for whatever comes next.\n\n# What to Expect from Nanome in 2025\n\n## Feature-Fueled Expansion in Nanome 2.0 & MARA\n\nIn the coming year, we’ll be rolling out a wide range of new features designed to unlock fresh scientific workflows. Nanome 2.0 is as intuitive as it is powerful, and we can’t wait to support even more ways for you to explore, innovate, and accelerate your discoveries.\n\n## Deeper Integration with MARA\n\nBy blending MARA’s extensive toolkit directly into Nanome 2.0, we’ll be amplifying this cohesive platform where raw data seamlessly transforms into actionable insights. This integrated ecosystem will empower users to tackle complex challenges without missing a beat.\n\n# Industry Trends to Watch in 2025\n\n## Rising Reasoning Models and Tool Use\n\n\n\u003Cdiv style=\"overflow: hidden;\">\n  \u003Ca href=\"https:\u002F\u002Fimages.ctfassets.net\u002Fkftzwdyauwt9\u002FbgJUZGtbvelVjCeoIjfUl\u002F5e78175b7f7324d72f20927df2c3424d\u002Fo1-research-blogcard.png\">\n    \u003Cimg src=\"https:\u002F\u002Fimages.ctfassets.net\u002Fkftzwdyauwt9\u002FbgJUZGtbvelVjCeoIjfUl\u002F5e78175b7f7324d72f20927df2c3424d\u002Fo1-research-blogcard.png\" \n         style=\"max-width:25%; height:auto; float: left; margin: 0 15px 10px 0;\">\n  \u003C\u002Fa>\n  \u003Cp>\nWe’re closely following how models like **OpenAI’s O1** evolve to leverage tools. Once these reasoning assistants seamlessly integrate real-world utilities, the scope of AI-powered analysis will broaden, making research more adaptive and context-aware.\n\n## Competitive Reasoning Platforms from Tech Giants\n\nThe next wave of reasoning models from **Meta**, **Google**, and **Anthropic** promises a surge in innovation. 
As they vie for prominence, expect cutting-edge features and integrations that push AI-driven scientific workflows to exciting new heights.\n\n\u003C\u002Fp>\n\u003C\u002Fdiv>\n\n\n\n## More Accessible Apple Vision Headsets\n\n\u003Cdiv style=\"overflow: hidden;\">\n  \u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FScreenshot_2024_12_17_at_11_47_59_AM_96e1326d7a.png\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FScreenshot_2024_12_17_at_11_47_59_AM_96e1326d7a.png\" \n         style=\"max-width: 50%; height: auto; float: left; margin: 0 15px 10px 0;\">\n  \u003C\u002Fa>\n  \u003Cp>\nRumors of a more affordable Apple Vision device are sparking excitement, as it could be the catalyst that brings spatial computing to the mainstream. Greater accessibility means more researchers can leverage immersive XR to explore and unravel molecular complexities like never before.\n\nWhat’s particularly intriguing, for our team at least, is how Apple will cut costs. Will they ditch EyeSight, the controversial front-facing display that shows the user’s eyes? Or perhaps reduce the onboard processing power and shift computation to the iPhone?\n\nWhatever approach they take, one thing is clear: with another trillion-dollar tech giant pushing the boundaries in XR, the market is only heating up—and that’s great news for Nanome!  
\u003C\u002Fp>\n\u003C\u002Fdiv>\n\n## Meta’s Orion AR Glasses in Production & Speculations on Meta Quest Pro 2\n\n\u003Cdiv style=\"overflow: hidden;\">\n  \u003Ca href=\"https:\u002F\u002Fen.teqnoverse.com\u002Fmedia\u002Fimages\u002Forion-meta.max-1200x300.png\">\n    \u003Cimg src=\"https:\u002F\u002Fen.teqnoverse.com\u002Fmedia\u002Fimages\u002Forion-meta.max-1200x300.png\" \n         style=\"max-width:100%; height:auto; float: left; margin: 0 15px 10px 0;\">\n  \u003C\u002Fa>\n  \u003Cp>\nAs Meta moves its Orion AR glasses toward commercial release, lightweight and highly portable AR will find its way into everyday lab environments. Imagine having critical data and simulations at your fingertips, wherever you’re working. \nIf Meta rolls out a Quest Pro 2 focused on productivity and mixed reality, it will raise the bar for immersive research platforms. Such an advancement could further validate and refine XR’s role as an essential component of scientific innovation.\n\u003C\u002Fp>\n\u003C\u002Fdiv>\n\n\n\n# Looking Back, Looking Ahead\n\nThis year, we moved from concept to reality on multiple fronts. We turned notebooks into living tools, tested the limits of spatial computing, refined how teams interact in XR, and brought AI onto the molecular stage. There’s a certain nostalgia in revisiting how each step—small or bold—brought us closer to our mission: to empower scientists with the most intuitive, immersive, and powerful platform imaginable.\n\nAs we stand at this year’s end, we do so with gratitude for our community of users, partners, and dreamers who helped shape these achievements. The journey continues, and if this year taught us anything, it’s that the future of science will be more accessible, more interactive, and more collaborative than ever before. 
We can’t wait to take those next steps forward with all of you.","2024-12-17T19:34:26.344Z","2026-03-24T17:53:28.781Z","2024-12-17T19:35:37.778Z","2024-12-17","Dive into Nanome’s transformative year—expanding its platform with new plugins, integrating with Apple Vision Pro, launching MARA and Nanome AI, and unveiling Nanome 2.0. Discover how these milestones are reshaping immersive, collaborative, and AI-powered scientific workflows, and get a glimpse of the future trends in spatial computing and cutting-edge reasoning models coming in 2025.\n","Nanome, Nanome 2.0, Nanome AI, MARA, Spatial Computing, Apple Vision Pro, Immersive Research, Molecular Visualization, Jupyter Cookbook Plugin, XR Collaboration, Scientific Workflows, AI-driven Analysis, AR\u002FVR Innovation, Mixed Reality, drug hunter, compchem, drug discovery, medchem, chemistry, molecules","looking-back-at-2024-and-what's-next-for-nanome-in-2025",{"id":263,"attributes":264},38,{"title":265,"content":266,"createdAt":267,"updatedAt":268,"publishedAt":269,"date":270,"description":271,"keywords":272,"slug":273,"category":91},"From Vault to Workspaces: The Evolution of File and Workflow Management in Nanome 2.0","# **From Vault to Workspaces: The Evolution of File and Workflow Management in Nanome 2.0**\n\nAt Nanome, we are dedicated to revolutionizing the way scientists interact with complex molecular data. With the launch of Nanome 2.0, we're taking a massive leap forward, refining workflows, and delivering a seamless experience for our users. Here's a breakdown of how Nanome has evolved, particularly in terms of file and workspace management, from the 1.x era to this exciting new iteration.\n\n---\n\n## **The Roots: Nanome 1.x**\n\nNanome 1.x first started its development in the PCVR era, where the focus was on desktop-centric workflows. Users loaded files directly from their local systems or cloud services via traditional file explorers. 
This approach worked well for the times but came with limitations as technology progressed, particularly with the rise of all-in-one VR devices like the Oculus (now Meta) Quest.\n\n\u003Cdiv style=\"float: left; width: 50%; padding: 5%;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage6_9a7a268c98.png\" style=\"width:100%; height:auto;\">\n\u003C\u002Fdiv>\n\n\u003Cbr>\n\u003Cbr>\n\n\nWhile we supported local file management on these devices, it was cumbersome. Transferring files required USB connections and Android File Explorer tools, making the process far from simple or user-friendly.\n\nRecognizing these pain points and the growing importance of cloud computing, we developed Nanome Vault—a plugin that allowed users to manage files through a cloud directory.\n\n\u003Cbr>\n\u003Cbr>\n\u003Cbr>\n\u003Cbr>\n\u003Cbr>\n\u003Cbr>\n\nThis solution, accessible both in VR and via a web portal, enabled uploading, downloading, and organizing files. However, the workflow still involved multiple steps:\n\n- Setting up folders on the web portal and dragging\u002Fdropping the relevant files\n- Activating the Vault plugin in VR\n- Navigating to the desired directory within VR and loading the files\n\n\u003Cdiv style=\"display: flex; justify-content: space-between; align-items: center; gap: 10px;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage5_475daf7e31.png\" style=\"width:50%; height:auto; padding:5%;\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_98b09b48fc.png\" style=\"width:50%; height:auto; padding:5%;\">\n\u003C\u002Fdiv>\n\n\nWhile Nanome Vault demonstrated our vision for integrated workflows, we knew we could continue to iterate on the user experience and simplicity. 
Enter Nanome 2.0.\n\n---\n\n## **Nanome 2.0: A Cloud-native Approach to Everything Nanome**\n\nNanome 2.0 represents a significant paradigm shift from the previous generation. Designed with a cloud-native approach and optimized for the future of XR on all-in-one devices, this version eliminates many of the friction points from 1.x. Here's how:\n\n### **1. Hierarchical Organization: Projects and Workspaces**\n\nNanome 2.0 introduces **Projects** and **Workspaces**, a new, intuitive hierarchy to streamline collaborative structural analysis.\n\n- **Projects**: These act as the overarching containers for a specific discovery effort, bringing together team members, project-specific settings, and, most importantly, **Workspaces**.\n\n- **Workspaces**: These are Nanome 2.0 Session Files, housing structural data, representations, and relative positioning—all ready for immersive viewing in XR, solo or collaboratively.\n\nUsers can start fresh with a new workspace or import session files from popular cheminformatics tools like **PyMOL (.PSE)**, **Schrödinger’s Maestro (.MAE)**, or **CCG’s MOE (.MOE)**. Workspaces also support traditional molecular structure files, including **MMCIF**, **PDB**, **SDF**, and **SMILES**, for maximum flexibility.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_8c87fa27fc.png\" style=\"width:100%; height:auto; padding:5%;\">\n\nThis structure mirrors modern CADD and productivity tools, enabling seamless data management directly from the web. Currently, users can manage Workspaces for the Default Project via the MARA WebUI. **Soon**, you’ll also be able to create and manage Projects directly within MARA, unlocking even greater capabilities.\n\n---\n\n### **2. Significantly Reduced Number of Clicks**\n\nThe days of activating plugins and navigating many complex menus are gone.\n\nIn Nanome 2.0:\n\n- Users log in to see a list of their Workspaces immediately.  \n- Files are just one or two clicks away after login. 
 \n- Permissions allow organizations to define shared or private access, tailoring the experience to team needs.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_3c105400e3.png\" style=\"width:100%; height:auto; padding:5%;\">\n\n---\n\n### **3. “Auto-save”: A Key Benefit of the Web-native Approach**\n\nBecause Nanome 2.0 is web-native, requiring users to explicitly save a workspace is a thing of the past. Any change a user makes in a Nanome workspace (representations, ligand edits, or anything else) is applied directly to the Workspace state in the cloud, so it is saved in real time. This eliminates the need to save a file locally and then explicitly upload it to the cloud.\n\n---\n\n## **The Future: AI-Powered Molecular Suite**\n\nNanome 2.0 sets the stage for AI-assisted workflows:\n\n- Through MARA, users will soon be able to manage files and workspaces using natural language commands.  \n- Voice input functionality is already (indirectly) available, with more advanced capabilities on the horizon.\n\n---\n\n## **Example Workflow with Apple Vision Pro**\n\nHere’s the really exciting part: thanks to these updates and the game-changing advancements in spatial computing, like **visionOS on the Apple Vision Pro**, the MARA UI can now run directly in the **base OS Safari browser** (not just an in-app web browser). This means you can set everything up with the ease of a **physical keyboard** and, with a single eye-tracked click, launch your structure into a fully immersive **Nanome 3D volumetric window**. 
This kind of seamless, native interaction between the base OS and web browsers is the spatial computing revolution we’ve all been waiting for—and it’s finally here!\n\n\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMHc3MHgwZXNid3Z3NHdkOGk2MWppMHo3MGFsdHduYXdtMzFkdWdzZyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FNEkJsHZVju31LurVRK\u002Fgiphy.gif\" style=\"width:100%; height:auto; padding:5%;\">\n\n---\n\n## **Looking Ahead**\n\nWe’re thrilled to share this leap forward with our users. Beta testing is underway, and the feedback has been overwhelmingly positive. Stay tuned for more updates on Nanome 2.0 and our broader vision for transforming scientific informatics.\n\nLet us know your thoughts, and thank you for being part of the journey. Together, we’re shaping the future of molecular design.\n\n---\n\n**Haven't tried MARA\u002FNanome2.0? Nothing beats trying it yourself; sign up for a demo today at https:\u002F\u002Fnanome.ai\u002Fdemo\u002F\\!**","2024-12-06T22:33:57.510Z","2026-03-24T17:53:28.256Z","2024-12-06T22:35:14.611Z","2024-12-06","Discover the evolution of file and workflow management with Nanome 2.0. Learn how this cloud-native platform streamlines molecular design with intuitive Workspaces, real-time cloud autosave, and seamless XR integration. 
 Unlock the future of scientific collaboration with AI-powered tools and Apple Vision Pro compatibility.\n","Nanome 2.0, molecular design, scientific collaboration, workflow management, drug discovery, computational chemistry, medicinal chemistry, cheminformatics, drug hunter, structural analysis, cloud-native platform, XR integration, AI-powered tools, molecular visualization, MARA platform, spatial computing, Apple Vision Pro, immersive VR, molecular workflow, med chem, comp chem, scientific informatics, molecular modeling, real-time collaboration, drug discovery tools.","from-vault-to-workspaces:-the-evolution-of-file-and-workflow-management-in-nanome-2.0",{"id":275,"attributes":276},37,{"title":277,"content":278,"createdAt":279,"updatedAt":280,"publishedAt":281,"date":282,"description":283,"keywords":284,"slug":285,"category":91},"Assay Data Enabled Analysis Workflows in MARA","## MARA is here to aid in analyzing assay data outputs\n\n## Overall Workflow\n\n* Bring in assay data and perform four-parameter logistic (4PL) analysis  \n* Clean up data outputs  \n  * Calculate pIC50  \n  * Add SMILES to table  \n  * Merge datasets  \n* Calculate Murcko Scaffolds to better categorize molecules  \n* Graph molecules by pIC50 values to determine trends  \n* Perform Principal Component Analysis (PCA) on the dataset  \n* Graph molecules by Principal Components and highlight scaffold clusters  \n* Perform an unbiased Cluster Analysis  \n* Graph molecules by Principal Components highlighting new clusters  \n* Generate and interrogate dose-response curve graphs of outlier molecules\n\nFor this example, we will use mock data that has been prepared by normalizing dosages in the nM range.\n\n### Upload and analyze your assay data for IC50\n\nFirst, normalize your assay data and upload it to MARA by attaching the files. Then, prompt MARA to fit a dose-response regression to produce an IC50 and report the statistical significance. 
“Calculate the IC50s from these datasets, my key column is Molecule” is a simple prompt that works because it indicates the key column needed to group the data properly.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FLoad_IC_50_5a94c40932.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n### Clean up, merge data sets, and add structure-based properties\n\nNext, we want to analyze all the data together, but merging all of the data from the report isn’t directly helpful. We’ll also convert the IC50 to pIC50 to improve the scaling. Running this over all three CSVs is easy with a prompt like “Calculate the \\-Log10 of the IC50 (nM) column for each of the results of the last run.”\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fp_IC_50_e89b32c094.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nNext, we need to merge the CSV files into a single file to make it easier to analyze the dataset together. Asking to “Merge these three CSVs, the key column is Molecule” is a good approach.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FBig_Merge_4f427f95c6.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nAdding in the SMILES from a previous dataset, matched to each Molecule, allows us to include structure-based analysis with our assay data…\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FAdd_SMILES_7dee514ea5.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n…such as Murcko scaffolds…\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMurcko_deb880f32a.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n…and chemical property data.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FChem_Props_clip_d2bcb64c5d.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n### Graphing pIC50 pairs and examining for clusters\n\nI want to see if trends in assay activity create clusters along 
the lines of my scaffolds or not. So, I’ll simply graph the three pIC50 values (pIC50\\_x, pIC50\\_y, and pIC50) against each other with a single prompt: “Graph pIC50\\_x, pIC50\\_y, and pIC50 against each other colored by Murcko\\_No in 2D plots”.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_p_IC_50_63aa3289d9.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n### Analyzing, graphing principal components, and performing cluster analysis\n\nWe see some clustering, but let’s use principal component and cluster analysis to find more robust clusters. We need to perform principal component analysis first so we can perform cluster analysis on the condensed data. We will ask MARA to “Please perform PCA on the most recently created CSV”, and MARA will limit the analysis to the numerical data. I also want to see how the Murcko scaffolds map to the principal component graphs.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FPCA_e32963d0bf.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nNext, we can graph the principal components against each other with scaffold coloring.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_PC_d6517e0608.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nWe then follow with cluster analysis focused on the principal components: “Perform cluster analysis on the most recently created CSV using only PC1, PC2, and PC3”. 
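Under the hood, the PCA step above amounts to standardizing the numeric columns and projecting them onto the top principal axes. Here is a minimal numpy sketch of that idea; the data shape and column count are illustrative, and this is not MARA's actual implementation:

```python
import numpy as np

def pca_scores(X, n_components=3):
    """Standardize columns, then project onto the top principal axes via SVD.

    Returns an (n_samples, n_components) array whose columns play the role
    of PC1, PC2, and PC3 in the graphs above.
    """
    X = np.asarray(X, dtype=float)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score each numeric column
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T             # scores, ordered by explained variance

# Illustrative use on random "assay" data: 50 molecules x 6 numeric columns
rng = np.random.default_rng(0)
scores = pca_scores(rng.normal(size=(50, 6)))
```

The cluster analysis then runs on these three score columns only, which is exactly what the “using only PC1, PC2, and PC3” prompt requests.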
\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FCluster_d7a5bce3bb.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nThen we can graph to examine this information: “Graph PC1, PC2, and PC3 against each other colored by Label in 2D plots”.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_Cluster_PC_6fd8ff5b43.gif\" style=\"width:100%; height:auto; padding:5%\">\n\n### Examining dose-response curves of a relevant subset\n\nThe last thing we’ll do is study the outliers from this analysis with sub-micromolar IC50s, using this query in the table data engine: “Give me a list of Molecule values that have a Label of \\-1 and a pIC50, pIC50\\_x, and pIC50\\_y greater than 6”.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMol_List_ca8871901e.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nThen, we’ll ask for the graphs of the molecules that are given to us.\n\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FDose_Response_Curves_4f92ebbef0.gif\" style=\"width:100%; height:auto; padding:5%\">\n\nMARA streamlines the process of analyzing assay data by integrating advanced computational tools with an intuitive workflow. From data normalization to cluster analysis and dose-response curve interrogation, MARA empowers scientists to extract meaningful insights with ease. By combining structural information with assay outputs, MARA enables a deeper understanding of molecular behavior, fostering more informed decision-making in drug discovery and research. 
Embrace MARA to accelerate your scientific breakthroughs and unlock the full potential of your data.\n\nWritten by Jonathon Gast, PhD, Joe Laureanti, PhD, and Edgardo Leija.\n\nIf you'd like to talk with us about any of your data challenges and learn more about how MARA can help, feel free to [tell us more about your project here](https:\u002F\u002Fshare.hsforms.com\u002F1ZjjmGFQWSWC_qjPYHWPSCQ59dwf).","2024-11-25T19:57:18.987Z","2026-03-24T17:53:28.010Z","2024-11-25T19:58:21.687Z","2024-11-25","Discover how MARA supports drug hunters and computational chemists with seamless assay data analysis, IC50 calculations, pIC50 scaling, cluster analysis, and molecular visualization for advanced research and drug discovery","assay data analysis, IC50 calculation, pIC50 scaling, cluster analysis, principal component analysis, PCA, Murcko scaffolds, data visualization, dose-response curves, drug discovery, molecular research, computational chemistry, molecular clustering, drug hunters, scientific workflows, structure-based analysis","assay-data-enabled-analysis-workflows-in-mara",{"id":287,"attributes":288},36,{"title":289,"content":290,"createdAt":291,"updatedAt":292,"publishedAt":293,"date":294,"description":295,"keywords":296,"slug":297,"category":66},"Major Updates to the Nanome Documentation Site","# Assay Data Enabled Analysis Workflows in MARA\n\n## MARA is here to aid in analyzing assay data outputs\n\n## Overall Workflow\n\n* Bring in assay data and perform 4-parameter logistic analysis  \n* Clean up data outputs  \n  * Calculate pIC50  \n  * Add SMILES to table  \n  * Merge datasets  \n* Calculate Murcko Scaffolds to better categorize molecules  \n* Graph molecules by pIC50 values to determine trends  \n* Perform Principal Component Analysis (PCA) on the dataset  \n* Graph molecules by Principal Components and highlight scaffold clusters  \n* Perform an unbiased Cluster Analysis  \n* Graph molecules by Principal Components highlighting new clusters  \n* Generate 
and interrogate dose-response curve graphs of outlier molecules\n\nFor this example, we will use mock data that has been prepared by normalizing dosages in the nM range.\n\n### Upload and analyze your assay data for IC50\n\nFirst, normalize your assay data and upload it to MARA by attaching the files. Then, prompt MARA to fit a logistic regression to produce an IC50 and report the statistical significance. “Calculate the IC50s from these datasets, my key column is Molecule” is a simple prompt because it indicates the key column needed to group the data properly.\n\n[Loading and Calculating IC50](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FLoad_IC_50_5a94c40932.gif)\n\n### Clean up, merge data sets, and add structure-based properties\n\nNext, we want to analyze all the data together, but merging all of the data from the report isn’t directly helpful. We’ll also take the IC50 and improve the scaling by taking the pIC50. Running this over all three CSVs is easy with a prompt like “Calculate the \\-Log10 of the IC50 (nM) column for each of the results of the last run.”\n\n[Calculating pIC50](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fp_IC_50_e89b32c094.gif)\n\nNext, we need to merge the CSV files into a single file to make it easier to analyze the dataset together. 
Asking to “Merge these three CSVs; the key column is Molecule” is a good approach.\n\n[Merging CSV Files](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FBig_Merge_4f427f95c6.gif)\n\nAdding in the SMILES from a previous dataset matched to each Molecule allows us to include structure-based analysis with our assay data.\n\n[Adding SMILES to Dataset](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FAdd_SMILES_7dee514ea5.gif)\n\nSuch as Murcko scaffolds…\n\n[Generating Murcko Scaffolds](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMurcko_deb880f32a.gif)\n\n…and chemical property data.\n\n[Generating Chemical Properties](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FChem_Props_clip_d2bcb64c5d.gif)\n\n### Graphing pIC50 pairs and examining for clusters\n\nI want to see if trends in assay activity create clusters along the lines of my scaffolds or not. So, I’ll simply graph the three pIC50 values (pIC50\\_x, pIC50\\_y, and pIC50) against each other with a single prompt: “Graph pIC50\\_x, pIC50\\_y, and pIC50 against each other colored by Murcko\\_No in 2D plots”.\n\n[Graphing pIC50 Values](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_p_IC_50_63aa3289d9.gif)\n\n### Analyzing, graphing principal components, and performing cluster analysis\n\nWe see some clustering, but let’s use principal component and cluster analysis to find more robust clusters. We need to perform principal component analysis first so we can perform cluster analysis on the condensed data. We will ask MARA to “Please perform PCA on the most recently created CSV” and MARA will limit the analysis to the numerical data. 
I also want to see how the Murcko scaffolds map onto the principal component graphs.\n\n[Performing Principal Component Analysis](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FPCA_e32963d0bf.gif)\n\nNext, we can graph the principal components against each other with scaffold coloring.\n\n[Graphing Principal Components](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_PC_d6517e0608.gif)\n\nWe then follow with cluster analysis focused on the principal components: “Perform cluster analysis on the most recently created CSV using only PC1, PC2, and PC3”. \n\n[Performing Cluster Analysis](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FCluster_d7a5bce3bb.gif)\n\nThen we can graph to examine this information: “Graph PC1, PC2, and PC3 against each other colored by Label in 2D plots”.\n\n[Graphing Clusters from Principal Components](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGraph_Cluster_PC_6fd8ff5b43.gif)\n\n### Examining dose-response curves of a relevant subset\n\nThe last thing we’ll do is study the outliers from this analysis with sub-micromolar IC50s, using this query in the table data engine: “Give me a list of Molecule values that have a Label of \\-1 and a pIC50, pIC50\\_x, and pIC50\\_y greater than 6”.\n\n[Generating Molecule List](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMol_List_ca8871901e.gif)\n\nThen, we’ll ask for the graphs of the molecules that are given to us.\n\n[Graphing Dose Response Curves](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FDose_Response_Curves_4f92ebbef0.gif)\n\nWe invite you to explore our updated documentation site and take advantage of all these resources. Whether you’re navigating MARA, leveraging Nanome 2.0’s advanced features, or revisiting 1.x content, we’re committed to providing clear, comprehensive guides to elevate your experience. Dive in and make the most of these updates!\n\nHaven't tried MARA\u002F2.0? 
Nothing beats trying it yourself; sign up for a demo today at [https:\u002F\u002Fnanome.ai\u002Fdemo\u002F](https:\u002F\u002Fnanome.ai\u002Fdemo\u002F)!\n\n\nWritten by Jonathon Gast, PhD, Joe Laureanti, PhD, and Edgardo Leija.  ","2024-11-18T18:05:27.616Z","2026-03-24T17:53:27.762Z","2024-11-21T03:40:11.362Z","2024-11-20","We're excited to announce our revamped documentation site, adding major new sections for MARA, a system for scientific informatics, and Nanome 2.0, the latest version of our XR platform. These updates offer detailed guides, technical documentation, and sample workflows to help users navigate and maximize the capabilities of the new tools. The site has also been restructured for better organization and easier access, with older content still available under a dedicated section for Nanome 1.x. The refreshed layout aims to enhance user experience by providing clearer and more comprehensive resources.","include drug hutner comp chem med chem cadd etc and make it comma separated","major-updates-to-the-nanome-documentation-site",{"id":299,"attributes":300},35,{"title":301,"content":302,"createdAt":303,"updatedAt":304,"publishedAt":305,"date":306,"description":307,"keywords":308,"slug":309,"category":91},"Nanome 2.0: A Deep Dive into the Technical Foundations","# **Nanome 2.0: A Deep Dive into the Technical Foundations**\n\nSince launching **Nanome AI** (along with Nanome 2.0) in early access, we’ve explored Nanome 2.0’s new scene menu and the enhanced permissioning system in previous blog posts. Today, we’re excited to take you on a deep dive into the technical foundations that make Nanome 2.0 a significant evolution from our 1.x versions.\n\nBuilt from the ground up, Nanome 2.0 embodies the many lessons we’ve learned over the years. This isn’t just an update; it’s a complete reimagining of our platform’s core architecture to provide a more powerful, efficient, and collaborative experience. 
Let’s delve into the technical innovations and foundational changes that set Nanome 2.0 apart.\n\n## **Reimagined Data Persistence and Seamless Collaboration**  \nOne of the most impactful enhancements in Nanome 2.0 is the overhaul of our data persistence and sharing mechanisms. In the 1.x versions, users often had to manually save workspaces, track who saved what, and manage room sessions—all of which could interrupt the collaborative workflow.  \nWith Nanome 2.0, we’ve introduced **cloud-based data persistence**. Your molecular data and workspace settings are now automatically saved and synchronized in the cloud, eliminating the need for manual saves or complex workspace management. This advancement enables truly **seamless collaboration**, allowing teams to focus on innovation rather than logistical hurdles.\n\n## **Advanced Rendering Engine Optimized for Mobile XR (All-In-One Devices)**   \n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_ca22d88568.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_ca22d88568.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\nRendering complex molecular structures on spatial computing platforms demands both performance and visual fidelity. Back in the Nanome 1.x era, our rendering engine was crafted for PC VR, specifically with devices like the HTC Vive and Oculus Rift CV1 in mind. We later stretched its capabilities to embrace mobile platforms. But with Nanome 2.0, we’ve taken a bold leap forward. We’ve rebuilt our rendering engine from the ground up using the Universal Render Pipeline (URP). This game-changing move supercharges Nanome for mobile XR platforms and even unlocks rendering on Apple Silicon devices like the Apple Vision Pro. The result? 
Smoother, more responsive visuals across a wider array of devices than ever before.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_88bcce9f31.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_88bcce9f31.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n## **Enhanced Molecular Data Structures for Large-Scale Handling**  \nModern molecular research often involves handling vast and complex datasets. Recognizing this, we’ve made significant strides in optimizing our molecular data structures. Nanome 2.0 introduces a new framework for **accessing and mutating molecular data**, enabling the platform to efficiently handle **much larger atomic files**. This improvement is particularly evident in our enhanced support for **PDBx\u002FmmCIF standardized files** from the **RCSB Protein Data Bank**. Users can now load and interact with large-scale molecular models more effortlessly, facilitating deeper analysis and exploration without performance compromises.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_f266c1142f.jpg\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_f266c1142f.jpg\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nSource: [https:\u002F\u002Fwww.sciencedirect.com\u002Fscience\u002Farticle\u002Fpii\u002FS0022283622001796](https:\u002F\u002Fwww.sciencedirect.com\u002Fscience\u002Farticle\u002Fpii\u002FS0022283622001796)\n\n## **Building for the Future While Mindful of Performance**  \nWhile rebuilding Nanome from the ground up has provided us with a leaner and more efficient platform, we are conscious that adding new features could impact performance over time. Our commitment is to **maintain optimal performance** as we continue to develop Nanome 2.0, paying close attention at every step to ensure the platform remains both powerful and user-friendly.  
\nWe understand that performance is paramount for our users’ workflow and the integrity of their research. Therefore, we are dedicated to ongoing optimization as we expand Nanome’s capabilities.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_fcc9afca7c.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_fcc9afca7c.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n## **Optimized for Tomorrow’s Science**  \nNanome 2.0 represents a bold step forward in our mission to empower scientists, educators, and students with cutting-edge tools for molecular visualization and collaboration. By reengineering our platform with a focus on data persistence, rendering performance, and molecular data handling, we’ve laid a strong foundation for future innovations. We invite you to experience the technical advancements of Nanome 2.0 and join us in shaping the future of molecular science with spatial computing.\n\n","2024-11-06T02:14:09.810Z","2026-03-24T17:53:28.703Z","2024-11-06T02:21:57.394Z","2024-11-06","Dive into Nanome 2.0, a reimagined molecular visualization platform that’s setting new standards in computational chemistry, chemical informatics, and bioinformatics. With advanced molecular data handling, seamless cloud-based data persistence, and an optimized rendering engine built for mobile XR and Apple Silicon devices, Nanome 2.0 empowers scientists and researchers to explore complex molecular structures collaboratively and intuitively. Unlock deeper insights in drug discovery and scientific informatics with Nanome’s enhanced capabilities, designed for scalable performance and future scientific advancements. 
Experience Nanome 2.0—where innovation meets the next generation of molecular science.","Nanome 2.0, molecular graphics, computational chemistry, drug discovery, chemical informatics, bioinformatics, molecular visualization, spatial computing, cloud-based data persistence, mobile XR, Apple Silicon, Universal Render Pipeline, PDBx\u002FmmCIF, RCSB Protein Data Bank, collaborative research, molecular data handling, scientific informatics, Nanome AI, molecular structures, advanced rendering engine, seamless collaboration, large-scale molecular models, data synchronization, molecular science","nanome-2.0:-a-deep-dive-into-the-technical-foundations",{"id":311,"attributes":312},34,{"title":313,"content":314,"createdAt":315,"updatedAt":316,"publishedAt":317,"date":318,"description":319,"keywords":320,"slug":321,"category":91},"Understanding Nanome 2.0’s New Permissioning System: A Guide to Secure and Collaborative Workspaces","**Understanding Nanome 2.0’s New Permissioning System: A Guide to Secure and Collaborative Workspaces**\n\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ftop_bar_c438d07552.png\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Ftop_bar_c438d07552.png\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n\nIn the realm of structure-based drug discovery, collaboration and data security are paramount. With the release of Nanome 2.0, we’ve introduced an entirely new **Permissioning System** designed to give you granular control over your workspaces and how you share them with others. 
In this blog post, we’ll explore the new permissioning system in detail, explaining the roles of Viewer, Editor, and Owner, and how you can effectively manage permissions to create a secure and collaborative environment.\n\n**The Importance of Permissions in Collaborative Workflows**\n\nWhen working with sensitive data or proprietary research, it’s crucial to ensure that only authorized individuals have access to your workspaces. At the same time, collaboration is essential for advancing scientific discoveries. Nanome’s new permissioning system covers both security and collaboration, giving you control over who can view, edit, or manage your workspaces.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpermissions_1ede09c584.png\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpermissions_1ede09c584.png\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n**Overview of the Permission Levels**\n\n**1. Viewer**\n\n\n\n* **Capabilities**:\n    * Can **see structures** within the workspace.\n    * Can **spotlight their view** to share their perspective with others.\n    * Can **follow others** to see what they are viewing.\n    * Can **navigate between scenes**.\n* **Restrictions**:\n    * **Cannot edit** representations or structures.\n    * **Cannot arrange** structures within the workspace.\n    * **Cannot invite** others to the workspace.\n\n**2. Editor**\n\n\n\n* **Capabilities**:\n    * **All Viewer permissions**, plus:\n    * Can **edit representations**, changing how molecules are visualized.\n    * Can **arrange structures**, repositioning molecules within the workspace.\n    * Can **invite others** to the workspace as Viewers or Editors.\n* **Restrictions**:\n    * **Cannot delete** the workspace.\n    * **Cannot change ownership** of the workspace.\n\n**3. 
Owner**\n\n\n\n* **Capabilities**:\n    * **All Editor permissions**, plus:\n    * Can **delete the workspace**.\n    * Has ultimate control over the workspace and its contents.\n* **Restrictions**:\n    * There can be **only one Owner** per workspace.\n    * Responsible for managing overall permissions and workspace settings.\n\n**Managing Permissions in Nanome 2.0**\n\n**Setting Workspace Permissions**\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpermissions_e4b50109c8.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpermissions_e4b50109c8.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nWhen you create a new workspace or load an existing one, you can set the permissions for that workspace as a whole:\n\n\n\n* **Private (None)**: The workspace is only accessible to you. Others cannot join unless invited.\n* **Viewer**: Allows others to view the workspace with Viewer permissions.\n* **Editor**: Allows others to join with Editor permissions.\n\n**Inviting Collaborators**\n\n\n\nAs an Owner, you can also set the permissions for collaborators you invite to your workspace:\n\n\n\n1. **Open the Collaboration Menu**: Navigate to the **Collaboration** option in the Application Menu.\n2. **Access Permissions**: Click on **Permissions** to manage who has access to your workspace.\n3. **Invite by Code**:\n    1. **Set the Workspace Code**: Change the code from **Private** to **Viewer** or **Editor** based on your needs.\n    2. **Share the Code**: Provide the code to your collaborators so they can join the workspace.\n4. **Invite by Email**:\n    1. **Enter Email Address**: Type the exact email associated with your collaborator’s Nanome account.\n    2. **Select Permission Level**: Choose whether they should join as a Viewer or Editor.\n    3. 
**Send Invitation**: The user will receive an invitation to join your workspace.\n\n**Managing Collaborator Permissions**\n\n\n\n* **View Current Collaborators**: In the Permissions section, you’ll see a list of users who have access to the workspace.\n* **Adjust Individual Permissions**:\n    * **Change Roles**: You can upgrade a Viewer to an Editor or downgrade an Editor to a Viewer as needed.\n    * **Revoke Access**: Set a user’s permission to **None** to remove their access to the workspace.\n\n**Understanding Load Options**\n\nWhen loading a workspace, you have two options:\n\n\n\n* **Load for All**:\n    * **Brings Everyone with You**: All participants in your current session will transition to the new workspace.\n    * **Set Permissions**: You may be prompted to assign permissions to users who don’t already have them.\n    * **Participant Notifications**: Users will see a progress notification with the option to opt-out or join immediately.\n* **Load**:\n    * **Solo Transition**: Only you will move to the new workspace.\n    * **Others Remain**: Participants will stay in the original workspace unless they choose to join you later.\n\n**Best Practices for Secure Collaboration**\n\n\n\n* **Assign Appropriate Permissions**: Only grant Editor permissions to users who need to modify structures or representations.\n* **Regularly Review Access**: Periodically check the list of collaborators and adjust permissions as projects evolve.\n* **Use Private Workspaces for Sensitive Data**: Keep workspaces set to Private when working on proprietary or confidential research.\n* **Communicate with Your Team**: Ensure all collaborators understand their permissions and responsibilities within the workspace.\n* **Delete Workspaces that are No Longer in Use:**  Remember, only Owners can delete workspaces.  
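The role hierarchy described above is essentially a nested capability set: each level contains everything below it. As a rough illustration (the names are ours, chosen to mirror the lists above, and are not Nanome's actual permission API), it can be modeled like this:

```python
# Illustrative model of the Viewer/Editor/Owner hierarchy described above.
# Capability names are ours, mirroring the blog's lists; this is not
# Nanome's actual permission API.
VIEWER = {"view_structures", "spotlight_view", "follow_others", "navigate_scenes"}
EDITOR = VIEWER | {"edit_representations", "arrange_structures", "invite_users"}
OWNER = EDITOR | {"delete_workspace"}

ROLES = {"Viewer": VIEWER, "Editor": EDITOR, "Owner": OWNER}

def can(role: str, action: str) -> bool:
    """Return True if the given role includes the capability."""
    return action in ROLES.get(role, set())
```

Because each level is a strict superset of the one below it, upgrading or downgrading a collaborator is a single role swap rather than a pile of individual toggles.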
\n\n\n\n**Scenarios and Applications**\n\n**Collaborative Research Projects**\n\nFor team projects where multiple researchers need to analyze and manipulate molecular structures:\n\n\n\n* **Assign Editors**: Key team members can be given Editor permissions to contribute actively.\n* **Assign Viewers**: Stakeholders or advisors can join as Viewers to observe progress without altering data.\n\n**Educational Settings**\n\nIn a classroom or workshop environment:\n\n\n\n* **Instructor as Owner**: The instructor maintains control over the workspace.\n* **Students as Viewers**: Participants can explore the structures and follow along with the lesson.\n* **Interactive Sessions**: Temporarily elevate students to Editors for hands-on activities, then revert them to Viewers afterward.\n\n**Presentations and Demonstrations**\n\nWhen showcasing findings to an audience:\n\n\n\n* **Use Load for All**: Ensure everyone is viewing the same workspace.\n* **Spotlight Feature**: Guide viewers through your presentation by spotlighting key structures or scenes.\n* **Maintain Control**: Keep participants as Viewers to prevent unintended changes.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fspotlight_follow_9f5f89378f.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fspotlight_follow_9f5f89378f.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n**Conclusion**\n\nNanome 2.0’s new permissioning system offers a robust and flexible framework for managing collaborative workspaces. By understanding and utilizing Viewer, Editor, and Owner roles effectively, you can create a secure environment that fosters collaboration while protecting your valuable research.\n\nWhether you’re leading a research team, teaching a class, or presenting to stakeholders, the ability to control access and permissions ensures that your work in Nanome is both productive and secure.\n\n**Experience the enhanced collaboration features today**! 
Get started with the Nanome AI Early Access Package: [https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F](https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F)\n","2024-10-29T05:13:24.035Z","2026-03-24T17:53:29.175Z","2024-10-29T05:16:14.034Z","2024-10-31","Explore Nanome 2.0’s New Permissioning System for Secure Collaboration – Discover how Nanome 2.0’s advanced Permissioning System helps scientists securely manage workspaces with Viewer, Editor, and Owner roles, fostering structured collaboration in drug discovery workflows.\n","Nanome 2.0, permissioning system, molecular collaboration, drug discovery, drug hunter, secure workspaces, computational chemistry, med chem, structure-based drug design, scientific collaboration, molecular visualization, Viewer Editor Owner roles, molecular research security, workspace management, collaborative science, Nanome AI, scientific permissions, VR in drug discovery, AI in drug discovery, secure data sharing","understanding-nanome-2.0's-new-permissioning-system:-a-guide-to-secure-and-collaborative-workspaces",{"id":323,"attributes":324},33,{"title":325,"content":326,"createdAt":327,"updatedAt":328,"publishedAt":329,"date":330,"description":331,"keywords":332,"slug":333,"category":91},"Exploring the New Scene Menu in Nanome 2.0: A Comprehensive Guide","**Exploring the New Scenes Menu in Nanome 2.0: A Guide**\n\nNanome has always been at the forefront of immersive molecular visualization, and with the release of Nanome 2.0, we’re excited to introduce a host of new features designed to enhance your structure-based drug discovery workflows. 
One of the standout additions is the **Scenes Menu**, a powerful tool that revolutionizes how you manage and interact with molecular structures within Nanome.\n\nIn this blog post, we’ll delve into the intricacies of the new Scenes Menu, explaining how it works and how you can leverage it to let the structures themselves guide  your research and strengthen your collaborative  efforts.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscenes_f1807da781.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscenes_f1807da781.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n**What is the Scenes Menu?**\n\nThe Scenes Menu in Nanome 2.0 serves as a central hub for managing different views and arrangements of your molecular structures. Think of scenes as snapshots or layouts that capture specific representations, orientations, and visual settings of your molecules. This feature allows you to switch between different scenes seamlessly, making it easier to compare structures, highlight specific interactions, or present findings to collaborators in memorable, narrative ways.\n\n**Key Features of the Scenes Menu**\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscenes_ae04470688.png\">\n    \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fscenes_ae04470688.png\" style=\"width:25%; height:auto; float:left; margin: 0 30px 15px 0;\">\n\u003C\u002Fa>\n\u003Cp>\n\n\n**1. Easy Navigation and Management**\n\n\n* **Accessible from All NavBar Options**: The Scene Menu is conveniently located and remains visible regardless of which option you have selected in the navigation bar. This constant accessibility ensures that you can switch scenes or manage them without interrupting your workflow.\n* **Create, Duplicate, and Delete Scenes**: You can create new scenes from scratch or duplicate existing ones. 
Duplicated scenes inherit all the settings from the original, allowing you to make minor adjustments without altering the original scene.\n\n**2. Independent Representations**\n\n\n\n* **Fresh Representations in New Scenes**: When you create a new scene, you’ll have a clean slate to work with.\n* **Customized Views**: Each scene can have its own unique representations, orientations, and visibility settings. This customization enables you to tailor each scene to highlight specific features, while your sequence of scenes becomes an organized informational story.\n\n**3. Seamless Collaboration**\n\n\n\n* **Shared Scenes in Collaborative Sessions**: When collaborating with others, scenes become a powerful way to guide discussions. You can spotlight a scene, allowing others to follow your perspective and see exactly what you’re seeing.  \n* **Pointers and Avatars**: Nanome 2.0 features updated, individual pointers that are relative to the structures. These pointers have scene indicators that everyone can see. Even when users are in different scenes, pointers help maintain context by indicating where others are focusing.\u003C\u002Fp>\n\n**How to Use the Scenes Menu**\n\n**Creating a New Scene**\n\n**Access the Scene Menu**: From any navigation bar option, locate the Scene Menu on the side panel.\n\n**Create a New Scene**: Click on the “New Scene” button. A fresh scene will be created.\n\n**Customize Your Scene**:\n\n\n\n* **Load Structures**: Import the molecular structures you wish to include.\n* **Set Representations**: Choose how you want to represent different parts of the molecule (e.g., ribbons, sticks, surfaces).\n* **Arrange Structures**: Move and scale structures as desired. For multiple structures, arrange their relative positions by briefly toggling the “Arrange Structures” feature on, and independently moving the structures to the desired positions.\n\n**Duplicating an Existing Scene**\n\n\n\n1. 
**Select the Scene**: In the Scene Menu, choose the scene you want to duplicate.\n2. **Duplicate**: Click on the “Duplicate Scene” option. A copy of the scene will appear in the next position in the Scenes Menu.\n3. **Modify as Needed**: Adjust the representations or orientations without affecting the original scene.\n\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fduplicate_scenes_00bc6dc027.gif\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fduplicate_scenes_00bc6dc027.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n\n**Switching Between Scenes**\n\n\n\n* **Simple Selection**: Click on the scene you want to view. The workspace will update to reflect the selected scene’s settings.\n* **Spotlight Feature**: If you’re leading a collaborative session, you can spotlight yourself while in a scene to ensure all participants are viewing the same scene and perspective as you. Likewise, you may choose to Follow another user to move to the scene and perspective of their current focus.\n\n**Tips for Maximizing the Scene Menu**\n\n\n\n* **Organize Your Scenes**: Give each scene a descriptive name to easily identify its purpose (e.g., “Active Site Close-up,” “Ligand Binding Comparison”).\n* **Utilize the Spotlight**: When presenting, use the spotlight feature to guide your audience through your scenes smoothly.\n* **Coordinate with Collaborators**: Encourage team members to create and share scenes, fostering a collaborative environment where everyone can contribute visual insights.\n\n**Conclusion**\n\nThe Scenes Menu in Nanome 2.0 is more than just a feature; it’s a paradigm shift in how molecular visualization and collaboration are conducted. 
By allowing users to create, manage, and share customized scenes, Nanome empowers researchers to explore complex molecular structures in a more organized and intuitive manner.\n\nWhether you’re analyzing protein-ligand interactions, comparing conformational changes, or presenting findings to your team, Scenes provide the flexibility and control you need to enhance your workflow.\n\n**Ready to experience the new Scene Menu?** Get started with the Nanome AI Early Access Package Today! [https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F](https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F)\n","2024-10-29T00:42:06.673Z","2026-03-24T17:53:28.943Z","2024-10-29T00:53:38.135Z","2024-10-29","Discover Nanome 2.0’s New Scenes Menu for Enhanced Molecular Visualization – Dive into Nanome 2.0’s innovative Scenes Menu, a groundbreaking feature designed to streamline molecular structure management, foster seamless collaboration, and improve your drug discovery workflows. Explore how the Scenes Menu can elevate your scientific research.\n","Nanome 2.0, Scenes Menu, molecular visualization, drug discovery, drug hunter, structure-based drug design, computational chemistry, medicinal chemistry, virtual reality, collaborative research, protein-ligand interactions, scientific collaboration, molecular structures, structure management, molecular modeling, comp chem, med chem, AI in drug discovery, VR in science, Nanome AI, early access, drug development tools","exploring-the-new-scene-menu-in-nanome-2.0:-a-comprehensive-guide",{"id":335,"attributes":336},31,{"title":337,"content":338,"createdAt":339,"updatedAt":340,"publishedAt":341,"date":342,"description":343,"keywords":344,"slug":345,"category":91},"MARA and Jupyter Notebook Integration","\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F1_b3eadafe77.png\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\nWe are thrilled to announce a major 
new feature for our users: you can now harness the power of MARA directly within Jupyter Notebooks! This integration opens up exciting possibilities for users who want to leverage Jupyter's interactive environment while calling MARA to enhance data exploration, automation, and computational workflows.\n\n\n\n## What are Jupyter Notebooks?\n\nJupyter Notebooks have become a staple in data science, machine learning, and academic research due to their flexibility and interactive nature. At their core, Jupyter notebooks are interactive computational environments that let users execute code, visualize outputs, and iterate on that code until they achieve their desired results.\n\n\n## MARA Installation Process for Jupyter Notebooks\n\n\n### Installation of the Python package\n\nThe first step is to install the necessary Python package. This can be done by following the GIF below.\n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fpip_install_c76a7cf9c2.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n### Obtain your API key\n\nNext, you need to obtain your API key so you can securely call MARA from your Jupyter notebook. 
This can be done by following the steps below, or watch the GIF to see how to do it.\n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fget_api_key_blurred_ca7ab5098b.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\nGo to the MARA main page, then:\n\n\n* Click on your user name in the lower left corner\n* Go to Settings\n* Switch to the System tab\n* Click Create in the API Keys field\n* Copy your API key\n\n\n### Run commands in your Jupyter notebook\n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FJupyter_notebook_8481ea6756.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n\n## Why Use MARA with Jupyter?\n\nIntegrating MARA with Jupyter Notebooks elevates your data workflow in several ways:\n\nSeamless Integration - You no longer need to switch between environments or interfaces to run complex MARA tasks.\n\nEnhanced Productivity - MARA takes over repetitive and computationally heavy processes, allowing you to focus on the insights and decisions that matter most.\n\nInteractive Exploration - Jupyter’s notebook format allows you to tweak and rerun your code while MARA handles the backend processes, giving you immediate feedback and flexibility.\n\nCollaboration and Sharing - Jupyter notebooks are a fantastic tool for collaboration. Your team can work together, see the code, the commands you’ve executed, and the results—all in one place.\n\nWant to try this for yourself? Sign up [here](https:\u002F\u002Fmara.nanome.ai) and receive 50 free credits!\n","2024-10-10T15:16:45.971Z","2026-03-24T17:53:27.682Z","2024-10-21T23:08:18.568Z","2024-10-21","We’re excited to announce the integration of MARA within Jupyter Notebooks, allowing users to enhance data exploration, automation, and computational workflows seamlessly. 
Jupyter Notebooks provide an interactive environment for coding and visualization, now with MARA's powerful backend processes. This integration boosts productivity, streamlines tasks, and enhances collaboration. Ready to try it? Sign up and get 50 free credits!","MARA, Update, Public Beta, Jupyter, Jupyter Notebook, Nanome, scientific informatics, computational chemistry, cheminformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry,","mara-and-jupyter-notebook-integration",{"id":347,"attributes":348},32,{"title":349,"content":350,"createdAt":351,"updatedAt":352,"publishedAt":353,"date":354,"description":355,"keywords":356,"slug":357,"category":66},"Announcing the Early Access Program for Nanome AI: The Next Generation of Molecular Design","We’re excited to introduce [Nanome AI](https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F), the next revolutionary step in molecular exploration! Nanome AI brings together the power of Nanome 2.0 and [MARA](https:\u002F\u002Fnanome.ai\u002Fmara\u002F), a culmination of our vision that has been in the making for over 9 years. Since 2015, we’ve been on a mission to redefine how scientists interact with data and molecules, and Nanome AI is the breakthrough that will drive the next wave of scientific discovery. 
This marks a major milestone on our path toward creating the ultimate interface for science—one that accelerates breakthroughs and helps advance humanity.\n\n### Nanome AI is available for Early Access Today\n\nWe’re launching Nanome AI in early access, giving users the chance to be among the first to experience its groundbreaking capabilities. This is just the beginning—by getting your hands on Nanome AI now, you’ll have the opportunity to explore its cutting-edge tools, provide valuable feedback, and help shape its future. Over the next year, we’ll be rolling out a ton of exciting new features and enhancements, expanding its potential even further. Your insights will be critical as we fine-tune Nanome AI to push the boundaries of molecular exploration and scientific discovery. [Request early access today](https:\u002F\u002Fnanome.ai\u002Fnanome-ai\u002F) to be among the first to explore Nanome AI and help shape the future of molecular exploration and scientific discovery!\n\n# Introducing Native AI Integration\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMzg4OGwzY3RuazJ2ZmhqbmdtNjVzemFlZWgyNHp0MWIzdDZkaXBhaiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FVLdKwmbuf2HO0thI9P\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExMzg4OGwzY3RuazJ2ZmhqbmdtNjVzemFlZWgyNHp0MWIzdDZkaXBhaiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FVLdKwmbuf2HO0thI9P\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\nNanome AI includes Nanome 2.0, which we [began developing over two years ago](https:\u002F\u002Fblog.matryx.ai\u002F1-23-2-patch-nanome-roadmap-for-the-rest-of-2022-and-nanome-2-0-2206c2e1a7ab) with the goal of amplifying the impact Nanome has delivered over the years. Nanome 2.0 is designed to seamlessly integrate with MARA, our Molecular Analysis and Reasoning Assistant. 
This integration allows users to set up and navigate their workspaces using natural language, making the platform more intuitive than ever.  Long-time users might recall our initial foray into voice commands—this feature was immensely popular among many of you. Consider the MARA integration a spiritual successor to those voice commands, but significantly enhanced and *much* more extensible. Whether you’re setting up complex experiments or need to switch tasks, just say the word—MARA will handle the rest. Please note that this is an early access release, and at present, MARA supports loading a single structure into Nanome 2.0. However, we are actively working on extending this functionality to include more comprehensive workflows, deeper integrations, and enhanced interactions in future updates.\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZ3pna2d4eTY2ZHVoNjNrcGdjeDZlcHNnb2ttbHhnNDZqMzZjcnA5aiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FkmDm2TPotA3HzHZ8EI\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZ3pna2d4eTY2ZHVoNjNrcGdjeDZlcHNnb2ttbHhnNDZqMzZjcnA5aiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FkmDm2TPotA3HzHZ8EI\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n# Revamped Collaboration and Visualization\n\nWith Nanome 2.0, we’ve overhauled our collaboration, rendering, and menu systems to be more intuitive and feel more natural when interacting in the 3D environment. This complete redesign means scientists can dive straight into their work with minimal setup. 
Our focus is on enhancing the user interface and interaction modes to ensure that collaborative medicinal chemistry ideation and capture are fluid, intuitive, and productive.\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZXRkMWRvNHdhN2dhbWY3NnBvbWprcXMyMWtvdGlhcXloNG9nM2w5NCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FjsAsuHCu0t4rylhfwB\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExZXRkMWRvNHdhN2dhbWY3NnBvbWprcXMyMWtvdGlhcXloNG9nM2w5NCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FjsAsuHCu0t4rylhfwB\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n\n\n# Optimized for Spatial Computing and all-in-one Headsets\nOur new platform is tailored for the latest in extended reality (XR) hardware, including the Meta Quest 3 and Apple Vision Pro. This ensures that Nanome 2.0 is not just compatible but optimized for the most advanced XR experiences available, providing unparalleled clarity and user experience.\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExYmM3amM5ZnRsbGluMzlxcGRkZThpaXgycnR5YzUyZWJvaDhiZ2NwZCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FNEkJsHZVju31LurVRK\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExYmM3amM5ZnRsbGluMzlxcGRkZThpaXgycnR5YzUyZWJvaDhiZ2NwZCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FNEkJsHZVju31LurVRK\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n# Initial focus: Medchem Ideation & Native AI integration \n\n**Important Notice for Early Access Participants**\n\nAs we embark on the journey to refine the perfect Nanome 2.0, we want our early adopters to be aware of some key differences from the previous version. 
During the initial phase, **not all features from Nanome 1.0 will be available.** This includes popular functionalities such as spatial recording and the EDM maps .\n\nOur primary focus for Nanome 2.0 is to enhance the fundamental features that support collaborative design sessions. This involves leveraging the power of native AI integration with MARA to streamline and enrich your scientific workflows. By concentrating on these core aspects, we aim to provide a robust and efficient tool that meets the specific needs of our users in the scientific community.\n\nWe appreciate your understanding and are excited to have you join us in this critical phase of development. Your feedback will be invaluable in shaping a tool that not only meets but exceeds the expectations of the drug discovery communities.\n\n\n# What’s New in Nanome 2.0?\n\nOur initial release includes several innovative features designed to enhance your medchem & structure viewing workflow:\n\n\n### **Spotlight and Focus Collaboration Mode:** \n\nEngage in a multiuser environment where you can follow and interact with colleagues’ work without a designated presenter.\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExaGk2bDRzd3RjOTFhczNsNzVqamRjd3d4dnZ6bmlrZDVuc2Y3ZnVqcSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FpuSyD3CpOfgUBicoC4\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExaGk2bDRzd3RjOTFhczNsNzVqamRjd3d4dnZ6bmlrZDVuc2Y3ZnVqcSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FpuSyD3CpOfgUBicoC4\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n\n### **Enhanced Visualization and Navigation:** \n\nCustomize your viewing experience with advanced rendering styles and navigate through virtual environments effortlessly.\n\n\u003Ca 
href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExeGtnYjEzZW00dzRrbmFib2NhZ3Q2NmRic3ZyYTJsdzk1eXgxaXdpZiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FyQjXpPVkqaRgdY1UtM\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExeGtnYjEzZW00dzRrbmFib2NhZ3Q2NmRic3ZyYTJsdzk1eXgxaXdpZiZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FyQjXpPVkqaRgdY1UtM\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n\n### **Scenes as a Narrative Tool:** \n\nOrganize and present your molecular structures in scenes that tell the story of your research, enhancing understanding and collaboration.\n\n### **Enhanced Web Portal:** \n\nImport and manage molecular data with ease via our improved web interface.\n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fwebportal_9222eebf7c.png\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n### **Assisted Ligand Building:** \n\nExperience a streamlined process for constructing ligands with intelligent guides and snapping features.\n\n\u003Ca href=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExbjkxN2lrc2NwOGU2d3Y1bHBjYWg3enRpNHZvNnR4bGN4YWc4ZWF1aSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FFw7urnNLYYZjet8vyz\u002Fgiphy.gif\">\u003Cimg src=\"https:\u002F\u002Fi.giphy.com\u002Fmedia\u002Fv1.Y2lkPTc5MGI3NjExbjkxN2lrc2NwOGU2d3Y1bHBjYWg3enRpNHZvNnR4bGN4YWc4ZWF1aSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw\u002FFw7urnNLYYZjet8vyz\u002Fgiphy.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n# **Join the Early Access Program**\n\nAre you interested in being at the forefront of scientific interfaces? We are looking for passionate users to try Nanome 2.0. 
Sign up here to request access and help shape the future of scientific discovery.\n\nWe are excited to roll out access over the coming months, with plans to expand as we continue to refine Nanome 2.0.\n\nStay tuned for more updates, and prepare to experience the cutting edge of collaborative scientific discovery with Nanome 2.0.\n","2024-10-16T21:25:07.042Z","2026-03-24T17:53:30.094Z","2024-10-16T21:25:16.233Z","2024-10-16","Discover the future of molecular exploration with Nanome AI, the next evolutionary leap in scientific informatics. Combining the advanced capabilities of Nanome 2.0 with MARA, our Molecular Analysis and Reasoning Assistant, Nanome AI revolutionizes how scientists interact with data and molecules. Available now in early access, you can explore groundbreaking tools, provide feedback, and shape the future of molecular research. Dive into intuitive AI-driven workflows, enhanced collaboration features, and cutting-edge visualization tailored for the latest in XR technology. Request early access and join us in accelerating scientific discovery.","NanomeAI, MolecularExploration, ScientificDiscovery, Nanome2, MARA, AIinScience, EarlyAccess, MolecularResearch, XRTechnology, AIIntegration, ScientificInformatics, DrugDiscovery, MedChem, CompChem, DrugHunter, CollaborativeScience, ComputationalChemistry, MolecularVisualization, PharmaResearch, ChemInformatics, MolecularModeling, StructureBasedDesign","announcing-the-early-access-program-for-nanome-ai:-the-next-generation-of-molecular-design",{"id":359,"attributes":360},29,{"title":361,"content":362,"createdAt":363,"updatedAt":364,"publishedAt":365,"date":366,"description":367,"keywords":368,"slug":369,"category":91},"MARA Cheminformatics Workflow","**MARA is here to aid in your cheminformatics workflows**\n\nWe are thrilled to showcase the incredible capabilities of this powerful tool. In this example, we will highlight a daily workflow for a cheminformatics scientist. 
These scientists aim to understand how the physical properties of a molecule can promote or inhibit a given chemical interaction, wanted or unwanted. While there are many possible tasks for a cheminformatician, today we focus on creating a library of molecules by specifically varying the R-groups of an .SDF file we obtain by querying the ligand of an RCSB PDB entry. We will then create an .SDF file for each molecule, add hydrogens, and minimize each structure for later use in docking studies. \n\nHere, we outline a potential MARA workflow for a cheminformatician, illustrating how MARA can revolutionize your research and elevate your work to new heights.\n\n**Overall Workflow**\n\n\n\n* Download a ligand from a PDB ID\n* Show an atomistic colored 2D representation of the ligand\n* Label each atom with the proper index\n* Generate new molecules as SMILES containing various groups at position 5: \n    * methyl\n    * ethyl\n    * methoxyl\n* Convert all new SMILES to .SDF files\n* Show structures\n\nFor this example we will use the RCSB PDB as the source of our molecular structure, and we will obtain the structure using natural language. Any common source of molecular structure (SMILES, XYZ, SDF) can be used instead. Position 5 refers to carbon atom #5 as defined by the atom indices in the labeled 2D image. Users can easily define the chemical R-groups they would like to incorporate into the study and the location at which they should be substituted. MARA can then perform the operation and export a list of SMILES as well as .SDF structure files.   
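The enumeration step above (generate new SMILES, then convert each to an .SDF file) can be sketched in a few lines of plain Python. The benzene scaffold, the {R} placeholder, and the R-group SMILES below are illustrative assumptions, not the actual 5CEO ligand; a production workflow would use a cheminformatics toolkit such as RDKit to handle attachment points, add hydrogens, minimize, and write the .SDF files:

```python
# Toy sketch of R-group enumeration on a SMILES scaffold via string
# templating. The benzene scaffold and the {R} attachment point are
# illustrative placeholders, not the 5CEO ligand.
scaffold = 'c1ccc({R})cc1'
r_groups = {'methyl': 'C', 'ethyl': 'CC', 'methoxyl': 'OC'}

# Build one substituted SMILES per requested R-group.
library = {name: scaffold.format(R=smiles) for name, smiles in r_groups.items()}
for name, smiles in library.items():
    print(name, smiles)  # e.g. methyl c1ccc(C)cc1
```

MARA performs the equivalent substitution, conversion, and export automatically from the natural-language request.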
\n\n**Download and annotate a ligand SMILES formula from an RCSB PDB entry via natural language**\n\nMARA uses reasoning to perform requested tasks via natural language. When prompted “download ligand from pdb 5CEO and annotate the atom indices”, MARA will skip downloading the full PDB file from 5CEO and instead download only the SMILES string, which is then passed through another tool to produce a 2D drawing of the molecule with all atoms labeled according to the ligand entry in 5CEO. \n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FChem_Part_1_f4f923b4ae.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\n**Modification of a molecular scaffold using common terms for R-groups**\n\nMARA understands the basic conversions of standard chemical compounds to SMILES. This makes using natural language to create a library of R-groups at a molecular scaffold incredibly easy and less time-consuming. Completing tasks at the batch level allows many SMILES strings to be converted to molecular structures automatically. \n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FChem_Part_2_edf9d0e025.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nWant to try MARA out yourself? Click [here](https:\u002F\u002Fmara.nanome.ai\u002F) to start with some free credits.","2024-09-24T10:41:38.640Z","2026-03-24T17:53:28.435Z","2024-09-24T18:58:13.298Z","2024-09-24","MARA Public Beta showcases its powerful capabilities in cheminformatics workflows. In this example, a scientist creates a molecular library by modifying R-groups in a ligand obtained from a PDB file, with natural language input. The workflow involves downloading a ligand, displaying its labeled 2D structure, and generating new molecules by substituting specific R-groups (e.g., methyl, cyano, methoxyl) at a designated position. 
MARA then converts the modified SMILES strings to .SDF files for docking studies. This process simplifies batch modifications of molecular structures using natural language commands.","MARA, Update, Public Beta, Nanome, scientific informatics, computational chemistry, cheminformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry,","mara-cheminformatics-workflow",{"id":371,"attributes":372},28,{"title":373,"content":374,"createdAt":375,"updatedAt":376,"publishedAt":377,"date":378,"description":379,"keywords":380,"slug":381,"category":91},"MARA Bioinformatics Workflow","## **MARA is here to aid in your Bioinformatics research projects**\n\nIn the ever-evolving landscape of bioinformatics, software tools play a pivotal role in translating complex biological data into meaningful insights. In this blog post, we’ll delve into the functionalities and innovations behind MARA, highlighting how it stands out in the crowded field of bioinformatics tool hubs. We’ll explore its capabilities for analyzing your data and the practical applications that make it a valuable asset for researchers. Whether you're a seasoned bioinformatician or just beginning to navigate the world of molecular data, MARA promises to be a game-changer in your toolkit. \n\n\n## **Build custom tools, query databases, easily interconvert files**\n\nLeverage our custom-built tools (_and build your own_) to access and interact with a broad range of public data sources tailored to your research needs. 
Our platform enables you to query comprehensive databases, allowing you to locate relevant papers, explore specific topics, and efficiently retrieve citations from a wealth of published journals. \n\nIn addition, MARA directly imports and exports many file formats and supports the direct interconnection with local databases, including but not limited to PDB (Protein Data Bank) files, SDF (Structure Data File) files, and SMILES (Simplified Molecular Input Line Entry System) strings. Further, you can import data from your own servers directly into MARA, where it will be organized and stored as a data table for in-depth analysis. Contact our sales team at [hello@nanome.ai](mailto:hello@nanome.ai) for more information about Enterprise options. \n\n\n## Workflow\n\n\n\n* Download PDB structure by PDB ID\n* Find similar proteins based on amino acid sequence\n* Return table with sequence identity of results\n* Perform multiple sequence alignment\n* View variations in sequence\n* Download homologue and introduce mutation \n* Apply rotamer library to a key residue\n* View rotamers and change representation\u002Fcolor with natural language\n\nThe workflow below is completed without any prior data existing. MARA is employed to string together multiple tools and processes to allow you more time in the lab or focusing on other tasks.\n\n**Find Similar Proteins Based on AA Sequence**\n\nMARA uses reasoning to decide which paths to take with the tools available. This enables MARA to use a prompt such as “use the amino acid sequence of 2ONH to find similar proteins” to decide that a pdb file needs to be downloaded from RCSB PDB, converted to fasta, and that we want to find proteins with a similar sequence.  
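The PDB-to-FASTA conversion MARA performs behind the scenes is conceptually simple. A minimal stdlib-only sketch of the core step, collecting SEQRES records into a one-letter sequence (the residue table is truncated and the sample record is made up, not taken from entry 2ONH):

```python
# Minimal sketch of a PDB -> FASTA step: collect SEQRES residue names
# for one chain and map them to one-letter codes. The residue table is
# truncated and the sample record below is illustrative (not from 2ONH).
THREE_TO_ONE = {'ALA': 'A', 'GLY': 'G', 'HIS': 'H', 'ILE': 'I',
                'LEU': 'L', 'LYS': 'K', 'MET': 'M', 'SER': 'S'}

def seqres_to_sequence(pdb_text, chain='A'):
    residues = []
    for line in pdb_text.splitlines():
        fields = line.split()
        # SEQRES fields: record name, serial number, chain ID, length, residues...
        if fields and fields[0] == 'SEQRES' and fields[2] == chain:
            residues.extend(fields[4:])
    return ''.join(THREE_TO_ONE.get(r, 'X') for r in residues)

print(seqres_to_sequence('SEQRES   1 A    4  MET ALA GLY HIS'))  # MAGH
```

MARA chains this kind of conversion together with the database download and the similarity search, so none of it has to be written by hand.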
\n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FBio_Info_GIF_1_a563568592.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nMARA lets users interact with many file types directly, reducing the number of applications you need to toggle between and saving you valuable time. \n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FAlignment_Made_with_Clipchamp_3fe77c3c54.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n**Download PDB files for most similar sequences**\n\nMARA directly reads data tables and extracts important information to complete tasks. A prompt of “download pdb for the top two hits with similarity &lt; 1” is then followed by MARA reading the data table, internally sorting it, and downloading the pdb files for the top two hits with a sequence identity &lt; 1%. \n\n**Introduce mutations and produce rotamer libraries using natural language**\n\nSince MARA understands which tools require specific files, a simple prompt such as “download pdb for 2ONH and make a mutation at residue 345 to ILE” is all that is required to produce a new pdb file with the required mutation. Rotamer libraries are easily added by simply requesting MARA to create a new file with a rotamer library at a key residue. \n\n\u003Ca href=\"https:\u002F\u002Fmara.nanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMutation_and_Rotamer_Made_with_C_2_51dd5295cd.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n## **Summary**\n\nIn summary, using MARA for a bioinformatics workflow transcends mere data handling; it is the alchemy of transforming raw information into profound knowledge. This knowledge can ignite groundbreaking scientific discoveries and foster advancements in medicinal chemistry, agriculture, environmental science, and energy storage. 
Within this intricate process, MARA emerges as an indispensable tool: it streamlines data interpretation and accelerates the journey from raw data to actionable insights, making it an essential component in the modern bioinformatician’s arsenal.\n\nWant to try MARA out yourself? Click [here](https:\u002F\u002Fmara.nanome.ai\u002F) to begin with 50 free credits.\n","2024-09-11T13:29:54.638Z","2026-03-24T17:53:29.826Z","2024-09-17T16:54:36.351Z","2024-09-17","MARA is a powerful bioinformatics tool that streamlines data analysis, offering custom-built functionalities for researchers to query databases, interconvert file formats, and organize data for in-depth analysis. Its intuitive workflows allow users to find similar protein sequences, perform multiple sequence alignments, introduce mutations, and create rotamer libraries using simple commands. MARA reduces the need for multiple applications, making it a valuable tool for both experienced bioinformaticians and newcomers. It transforms complex biological data into actionable insights, accelerating discoveries in fields such as medicinal chemistry.","MARA, Update, Public Beta, Nanome, scientific informatics, computational chemistry, cheminformatics, bioinformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry,","mara-bioinformatics-workflow",{"id":383,"attributes":384},27,{"title":385,"content":386,"createdAt":387,"updatedAt":388,"publishedAt":389,"date":390,"description":391,"keywords":380,"slug":392,"category":66},"MARA Update: 
v0.12.10","We are thrilled to announce update 0.12.10 for MARA. This update delivers new features tailored to optimize enterprise deployments and enhance the public beta experience. Explore MARA’s evolving platform and share your insights with us.\n\nTry the public beta yourself [here](https:\u002F\u002Fmara.nanome.ai\u002F)!\n\n## Enterprise Features\n\nFor enterprise deployments that use a private GPT4 instance, customers can now use an API URL for all aspects of leveraging that LLM. The requirements for LLMs that can be leveraged by MARA are the following: \n\n- State-of-the-art\u002Ffrontier models that score very high on reasoning and function-calling benchmarks.\n- Advanced reasoning is the primary need, so, in line with current standards, high-parameter LLMs are recommended (for example, Llama v3.1 405B \u002F GPT4 [1.4T params]).\n\nBehind your organization's firewall, you can bring your own LLM as long as it meets state-of-the-art (SOTA) benchmarks. Because there are different types of foundation models, not all LLM approaches are the same. 
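Bring-your-own-LLM setups like this typically assume an OpenAI-compatible API surface. As a rough illustration of what pointing at a self-hosted endpoint involves (the URL, key, and model name below are placeholders, not MARA configuration values; consult your enterprise deployment docs for the actual settings):

```python
# Hypothetical illustration of targeting an OpenAI-compatible endpoint
# hosted behind your firewall. BASE_URL, API_KEY, and the model name
# are placeholders, not MARA configuration values.
import json
import urllib.request

BASE_URL = 'http://llm.internal:8000/v1'   # assumed self-hosted endpoint
API_KEY = 'sk-local-placeholder'           # assumed key

payload = {
    'model': 'llama-3.1-405b-instruct',
    'messages': [{'role': 'user', 'content': 'Summarize this assay table.'}],
}
req = urllib.request.Request(
    BASE_URL + '/chat/completions',
    data=json.dumps(payload).encode(),
    headers={'Authorization': 'Bearer ' + API_KEY,
             'Content-Type': 'application/json'},
)
# urllib.request.urlopen(req) would send the request; omitted here
# because the endpoint is a placeholder.
print(req.full_url)
```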
\n\n\n## New Features\n\nAbility to drag and drop CSV and SDF files to load directly as tables.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimport_SDF_Files_dba5b3cc8a.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nAdded the ability to export contents of the molecular viewer as STL|USDZ|GLTF, set a style, and spin fast or slow.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FNew_Molecular_Visualization_Commands_b7b3e77e36.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n\nAdded a “Convert mmCIF\u002FCIF to PDB”  as a native tool.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmm_CIF_to_PDB_8646e6f011.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\nWe have improved the performance of the Synthetic Accessibility Scorer (SAScorer) tool.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FSA_Scorer_Tool_7992bdbee9.gif\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n## Improvements\n\nIn this update, we’ve made several improvements to MARA’s UI for a more intuitive experience. 
Users will also notice enhanced system performance, including more accurate tool recognition and streamlined tool creation, making it easier to work efficiently.\n\n\n## Complete List of Changes, Features, and Improvements\n\n\n### Enterprise changes\n\n\n\n* Added support for Azure LLM\u002Fembeddings endpoints\n* Allow any LLM model name instead of restricting to a known set\n* Fix the native synthetic accessibility tool (SAScorer) for enterprise deployments\n* Better tool logging for debugging tools\n\n\n### New features\n\n\n\n* Ability to drag and drop CSVs\u002FSDFs to load as a Data Table\n* Added ability to turn a CSV into a Data Table via an Edit button\n* Added a general feedback mechanism in the sidebar\n* Updated the Settings page to include Profile Info\n* API Keys and Dark mode moved into the System tab\n* Added “Convert mmCIF to PDB” as a native tool\n* Added functionality to Structure Viewer Commands: Download as STL|USDZ|GLTF, Set style, Spin faster\u002Fslower\n* Right-click Data Table headers for action menus (add\u002Fremove columns)\n* Added native tools for creating Data Tables in chat from CSV and SDF\n* Python Tools now accept list and dict args\n\n\n### Improvements\n\n\n\n* The system now searches for existing tools more meticulously\n* Creating a tool redirects you directly to the tool details page\n* Improved argument\u002Fheader editing in the tool creation and editing form\n* Default tool type is now Python when creating tools\n* Improved tool permissions user experience\n* Chat info now lives in a split view\n* Miscellaneous UI improvements\n* Improved Ligand Extractor tool\n* Improved ChEMBL Search tool by prioritizing exact matches and including compound names\n","2024-09-04T13:15:32.327Z","2026-03-24T17:53:28.651Z","2024-09-11T07:48:01.272Z","2024-09-11","MARA Update 0.12.10 introduces new features to enhance both enterprise and public beta user experiences. 
Key updates include expanded Structure Viewer commands and a new tool for converting mmCIF to PDB. Additionally, the Synthetic Accessibility Scorer (SAScorer) tool has been improved for enterprise users. The update also brings various UI improvements and system enhancements to streamline workflows and improve research capabilities.","mara-update:-v0.12.10",{"id":394,"attributes":395},25,{"title":396,"content":397,"createdAt":398,"updatedAt":399,"publishedAt":400,"date":401,"description":402,"keywords":403,"slug":404,"category":91},"Introduction to MARA Tool Creation","\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F1_b3eadafe77.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nIn MARA, tools act as the driving force behind creating workflows that turn your lab into a cutting-edge informatics hub. Our freemium release offers over 170 tools, with new ones updated and added weekly. The true power of MARA is its extensibility; if you have an API endpoint, MARA can interact with it!\n\nIn our current public beta, you get access to 50 credits to run tools and the ability to create up to 10 different tools! Try it for yourself [here](https:\u002F\u002Fmara.nanome.ai\u002F).\n\nFor our enterprise customers, you have the flexibility to create custom tools tailored to your specific needs, allowing you to fully customize your workflows. With an enterprise deployment, you can easily develop or integrate your existing tools using various methods. These tools become immediately accessible across your entire organization. Additionally, you can tap into our extensive library of built-in tools to enhance your workflows. To enquire about enterprise deployments, contact us at [hello@nanome.ai](mailto:hello@nanome.ai). \n\n\n## Tool Creation\n\nThere are currently three ways of developing tools in MARA: Python snippets, HTTP endpoints, and SQL queries. 
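To make the first of those concrete, a Python-snippet tool is essentially a small, self-contained function. The sketch below is purely illustrative (MARA's actual snippet contract isn't shown in this post, so the function shape and the tiny mass table are assumptions):

```python
import re

# Hypothetical Python-snippet tool: one function with typed arguments.
# The atomic masses are a small illustrative subset, not a full periodic table.
ATOMIC_MASS = {"H": 1.008, "C": 12.011, "N": 14.007, "O": 15.999, "S": 32.06}

def molecular_weight(formula: str) -> float:
    """Compute the molecular weight of a simple formula such as 'C6H6'."""
    weight = 0.0
    # Match element symbols (one uppercase letter, optional lowercase)
    # followed by an optional count, e.g. 'C6' -> ('C', '6').
    for symbol, count in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        weight += ATOMIC_MASS[symbol] * (int(count) if count else 1)
    return round(weight, 3)

print(molecular_weight("C6H6"))  # benzene -> 78.114
```

In MARA, the prompts and arguments you declare for the tool would map onto parameters like `formula` here.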
Alternatively, if needed, you can also write up your tool definition directly as JSON. \n\nEvery tool you develop becomes a REST API endpoint; once it has been fully developed and tested, it can be made available to your organization. \n\n#### Python Snippets\n\nWrite the Python snippet you want to bring into MARA, or use AI to generate the code for you. MARA automatically creates and manages a sandbox environment to service your tools.\n\nWith AI-generated code, you only have to worry about the prompts and the arguments for your tool.\n\n\n#### HTTP Endpoints\n\nWithin MARA, you can quickly create a tool that interacts with any API endpoint, making existing infrastructure or external software readily available to MARA.\n\n\n#### SQL Queries\n\nMake SQL queries accessible to everyone in your organization with reusable templates that allow you to create and share common queries. Generate SQL queries on the fly to answer specific questions, and construct SQL queries using natural language, making it easy for anyone to get the data they need.\n\n\n## Tool Creation Walkthrough\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTool_Creation_Open_and_Add_Info_53835ecae2.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n### Tool Information\n\nThe first step in creating your tool is to name and define it. Give it a description so others in your organization can easily see what task your tool performs. 
You can also define which category your tool belongs to, such as data retrieval or informatics.\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTool_Creation_Add_Arguments_1fa098360a.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n### Prompts \u002F Arguments\n\nWhen entering the prompts or arguments for the tool, it's crucial to clearly define the parameters and inputs that the tool requires to function effectively.\n\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTool_Creation_Add_Code_5b4dbcc585.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\n### Code\n\nThe next step in tool creation is generating your code, which involves developing the logic and functionality required for the tool to perform its intended tasks. For Python, you can write the code manually, leveraging Python's vast libraries and frameworks, or use the built-in AI generative system to assist in writing the code, which can speed up development.\n\nWhen working with HTTP endpoint requests, you need to design and implement the appropriate API endpoints that will handle various types of HTTP requests (e.g., GET, POST, PUT, DELETE). These endpoints should be coded to process requests efficiently and return the appropriate HTTP status codes and JSON responses.\n\nWhen dealing with SQL queries, you'll need to write and optimize SQL statements to interact with databases, ensuring data is retrieved, updated, inserted, or deleted correctly.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTool_Creation_Tool_Test_1ad814bc7d.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n### Test Environment\n\nOnce you have developed a tool, it's important to test it in isolation to ensure it functions correctly and meets all performance standards. 
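As a hedged illustration of what testing in isolation can look like for a SQL-template tool (the `compounds` table and its columns are invented for this sketch), a reusable, parameterized template can be exercised against an in-memory SQLite database before it is shared:

```python
import sqlite3

# A reusable, parameterized query template. The 'compounds' schema is
# hypothetical, invented purely for this illustration.
FIND_LIGHT_COMPOUNDS = "SELECT name FROM compounds WHERE mol_weight < ? ORDER BY name"

def run_template(conn: sqlite3.Connection, max_weight: float) -> list[str]:
    """Execute the template with a bound parameter and return compound names."""
    return [row[0] for row in conn.execute(FIND_LIGHT_COMPOUNDS, (max_weight,))]

# Isolated test against an in-memory database -- no external services touched.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compounds (name TEXT, mol_weight REAL)")
conn.executemany(
    "INSERT INTO compounds VALUES (?, ?)",
    [("aspirin", 180.16), ("benzene", 78.11), ("caffeine", 194.19)],
)
print(run_template(conn, 190.0))  # -> ['aspirin', 'benzene']
```

Binding parameters with `?` placeholders, rather than string formatting, also keeps the shared template safe from SQL injection.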
Isolated testing allows you to identify and address any bugs or issues without external interference, ensuring the tool operates as intended before it is released into your organization.\n\nWant to try out tool development for yourself? Sign up to the public beta [here](https:\u002F\u002Fmara.nanome.ai\u002F) and get 50 free credits!","2024-08-23T14:15:49.798Z","2026-03-24T17:53:27.845Z","2024-09-04T15:57:11.104Z","2024-09-04","Learn how MARA's extensibility allows you to create custom tools tailored to your lab's specific needs, making complex workflows accessible to your entire organization. Whether you're a coding expert or new to tool development, MARA offers solutions to streamline your lab's data management and analysis processes.","MARA, CDD Vault, Public Beta, Nanome, scientific informatics, computational chemistry, cheminformatics, bioinformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry,","introduction-to-mara-tool-creation",{"id":406,"attributes":407},26,{"title":408,"content":409,"createdAt":410,"updatedAt":411,"publishedAt":412,"date":413,"description":414,"keywords":415,"slug":416,"category":53},"Collaborative Drug Discovery & Nanome Partnership Announcement","\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fcdd_mara_banner_5a3236a34b.png?updated_at=2024-08-27T22:44:24.912Z\" style=\"width:100%; height:auto;\">\u003C\u002Fa>\n\n## Nanome and Collaborative Drug Discovery (CDD) are excited 
to announce a partnership that streamlines cheminformatic workflows for drug discovery\n\nTogether, Nanome and Collaborative Drug Discovery (CDD) are committed to enhancing the efficiency and innovation of drug discovery teams by providing an integrated solution for molecular visualization, design, and data analysis. \n\n**[CDD Vault](https:\u002F\u002Fwww.collaborativedrug.com\u002F)** is a hosted database solution for securely managing and sharing biological and chemical data. It lets you intuitively organize chemical structures and biological study data, and collaborate with internal or external partners through an easy-to-use web interface and API.\n\n**[MARA](http:\u002F\u002Fnanome.ai\u002Fmara)** is a secure enterprise platform that can orchestrate scientific workflows through natural language. It facilitates a conversational system between your cheminformatic, bioinformatic, and other internal tools and databases, becoming your scientific co-pilot for drug discovery.\n\n\n>**\"By combining the deterministic execution from MARA and the scientific data handling of CDD Vault, this partnership empowers scientists to make more informed decisions, leverage their data more effectively, and ultimately make more groundbreaking discoveries.\"**\n>-Edgardo Leija, CXO & Co-Founder of Nanome\n\n>**“This partnership expands on our mission to help our customers succeed with their data management workflows. With the rise in machine learning, the innovative use of both of our technologies aligns with our desire to help scientists globally collaborate more effectively.”**\n>-Sylvia Ernst, PhD, Senior Manager, Commercial Operations\n\nWant to try MARA out yourself? 
Click [here](https:\u002F\u002Fmara.nanome.ai\u002F) to begin with 50 free credits.","2024-08-27T15:55:51.452Z","2026-03-24T17:53:27.885Z","2024-08-27T22:43:16.000Z","2024-08-28","Nanome and Collaborative Drug Discovery (CDD) have partnered to enhance drug discovery workflows by integrating molecular visualization, design, and data analysis. This collaboration combines CDD Vault's secure management of biological and chemical data with Nanome's MARA platform, which uses natural language to streamline scientific workflows. Together, they aim to improve data-driven decision-making and foster global collaboration in drug discovery.","Nanome, Collaborative Drug Discovery (CDD), Drug discovery, Cheminformatics, Molecular visualization, Data analysis, CDD Vault, MARA, Scientific workflows, Data management, Machine learning, Global collaboration, Biological data, Chemical data, Drug design, Integration, Natural language processing, Innovation, Partnership","collaborative-drug-discovery-and-nanome-partnership-announcement",{"id":418,"attributes":419},24,{"title":420,"content":421,"createdAt":422,"updatedAt":423,"publishedAt":424,"date":425,"description":426,"keywords":403,"slug":427,"category":91},"MARA & CDD Integration Example","### Introduction\n\nWe are excited to showcase an integration between MARA by Nanome and Collaborative Drug Discovery (CDD Vault).\n\nIn this blog, we delve into a common workflow for drug discovery researchers. We break down the steps to leverage an online hosted database and the conversational interface MARA provides. \n\n### An example workflow \n\nWhen handling large data sets, choosing the right tools is essential for efficient data management and analysis. CDD Vault is a prime example of powerful data management software tailored for the purpose of aiding in drug discovery research. It serves as a comprehensive, centralized repository for all your research data, offering seamless integrations through their API. 
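Programmatically, that API is reached over authenticated HTTPS. The sketch below only builds a request rather than sending one; the `X-CDD-Token` header and `/api/v1` path follow CDD's published REST API, but treat the details (and the placeholder vault id and token) as assumptions to verify against the current docs:

```python
# Build (but do not send) a CDD Vault API request. The vault id and token
# below are placeholders; verify header and path against CDD's current docs.
BASE = "https://app.collaborativedrug.com/api/v1"

def vault_molecules_request(vault_id: int, token: str) -> dict:
    """Return the URL and headers for listing molecules in a vault."""
    return {
        "url": f"{BASE}/vaults/{vault_id}/molecules",
        "headers": {"X-CDD-Token": token},
    }

req = vault_molecules_request(4711, "YOUR-API-TOKEN")
print(req["url"])  # -> https://app.collaborativedrug.com/api/v1/vaults/4711/molecules
```

MARA's CDD integration wraps calls of this shape behind natural language, so in practice you never construct them by hand.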
As illustrated in the GIF below, CDD Vault excels at intuitively organizing chemical structures and biological study data, making it an invaluable resource for researchers.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG1_CDD_Vault_Workings_04eb745ed6.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\n### Import CDD Vault data into MARA\n\nA standout feature of this integration is the seamless flow that lets you access data from CDD Vault directly within MARA and apply your custom workflows to it. By simply writing a natural language prompt in MARA, you can securely retrieve data from your saved workspace in CDD Vault—provided you specify the file name, project name, and vault name. This streamlined process ensures that your research data is always accessible and ready for further analysis.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG2_MARA_x_CDD_Vault_Workings_25ee5344c1.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n### Modify your specified data through simple commands\n\nOnce your data is within the MARA ecosystem, you can continue to use natural language to have MARA run specific, deterministic tools and subsequent workflows on your data.\n\nBreak down the molecules into R groups by decomposing them around a common core fragment, usually represented as a SMILES string. 
This process generates data tables that organize your molecules based on their distinct R groups, enabling easier analysis and comparison.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG3_Decomposing_SMILES_Strings_da8db58920.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nOnce you have separated your fragments, search through them to easily isolate your molecules of interest.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG4_Br_Fragment_Search_3de9e8373b.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nWhen you have found your fragments of interest, connect them to the SMILES string scaffold of your molecules.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG5_Br_Fragment_Combination_32859be780.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n### Export updated data from MARA back into CDD Vault\n\nOnce you have completed your work in MARA, you can export content right back to CDD Vault for further sharing and analysis!\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FG6_Export_from_MARA_to_CDD_Vault_656636ed2e.gif\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\n### Summary\n\nMARA helps simplify enterprise scientific workflows and integrates easily with internal tools and databases. This blog showcases just one example of what you can do when you combine the power of MARA and CDD Vault. 
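For readers who want a feel for the table side of this workflow, here is a minimal sketch of the grouping step only. The R-group labels below are invented; real decomposition around a core would come from a cheminformatics toolkit such as RDKit, not from this snippet:

```python
from collections import defaultdict

# Grouping step only: the molecule names and R-group labels are invented
# for illustration. In practice the r1 values would come from an R-group
# decomposition tool, e.g. RDKit's RGroupDecompose.
rows = [
    {"molecule": "mol-1", "r1": "Br"},
    {"molecule": "mol-2", "r1": "Cl"},
    {"molecule": "mol-3", "r1": "Br"},
]

# One small data table per distinct R group, ready for comparison.
tables = defaultdict(list)
for row in rows:
    tables[row["r1"]].append(row["molecule"])

for r_group in sorted(tables):
    print(r_group, tables[r_group])
```

In MARA the equivalent grouping happens behind a natural language prompt, with the resulting tables rendered in chat.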
\n\nLearn more about CDD Vault [here](https:\u002F\u002Fwww.collaborativedrug.com\u002F) and sign up to try MARA yourself [here](https:\u002F\u002Ft.dripemail2.com\u002Fc\u002FeyJhbGciOiJIUzI1NiJ9.eyJhdWQiOiJkZXRvdXIiLCJpc3MiOiJtb25vbGl0aCIsInN1YiI6ImRldG91cl9saW5rIiwiaWF0IjoxNzIzNjU4NDE2LCJuYmYiOjE3MjM2NTg0MTYsImFjY291bnRfaWQiOiI0MjkwNjc2IiwiZGVsaXZlcnlfaWQiOiJxZ2dnNTNpNTBwZzltMDY0eW84ayIsInVybCI6Imh0dHBzOi8vbWFyYS5uYW5vbWUuYWkvP3V0bV9zb3VyY2U9ZW1haWwmdXRtX21lZGl1bT1lbWFpbCZ1dG1fY2FtcGFpZ249bWFyYS1sYXVuY2gmX19zPXFuampoaXYzMnJiMXdraWZ3aHNtIn0.BelzzoFHDQLOcglRaNJTp45PZxo57S-MEujZkV1JEWA). \n\nFor more information about MARA, or to enquire about enterprise deployment, talk to our team ([hello@nanome.ai](mailto:hello@nanome.ai)).\n\n","2024-08-21T12:31:35.930Z","2026-03-24T17:53:27.965Z","2024-08-23T14:09:36.641Z","2024-08-23","We are excited to showcase an integration between MARA by Nanome and Collaborative Drug Discovery (CDD Vault). In this blog, we delve into a common workflow of drug discovery researchers, breaking down the key steps in the process of leveraging an online hosted database and the conversational interface MARA provides. ","mara-and-cdd-integration-example",{"id":429,"attributes":430},23,{"title":431,"content":432,"createdAt":433,"updatedAt":434,"publishedAt":435,"date":436,"description":437,"keywords":438,"slug":439,"category":91},"Computer-Aided Drug Discovery & Design through MARA","# Computer-Aided Drug Discovery & Design through MARA \n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F1_b3eadafe77.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n**Reduce the steps you take to complete your daily tasks**\n\nWith the release of MARA Public Beta, we are thrilled to showcase the incredible enterprise-ready capabilities of this powerful tool. 
Workflows for Computer-Aided Drug Discovery & Design (CADDD) and Structure-Based Drug Design (SBDD) projects require an investigation of the three-dimensional structures of biological macromolecules, such as proteins, nucleic acids, and carbohydrates, to understand their function and interactions at the molecular level. The scientists completing these tasks are central to molecular modeling, data analysis, experimental design, and structure determination. All of this work is essential for drug discovery, understanding disease mechanisms, and developing new therapeutic strategies.\n\nHere, we outline a simple enterprise-ready MARA workflow for a CADDD\u002FSBDD scientist, illustrating how MARA can save you valuable time and lighten your workload.\n\n**Workflow Overview**\n\n\n\n* Load AlphaFold structure\n* Prepare protein structure for docking (including a PDB file fixer tool)\n* Define potential pockets\n* Draw a molecule in 2D and convert to 3D as an SDF file\n* Minimize SDF file and add hydrogens\n* Dock the minimized molecule at the prepared AlphaFold structure at a desired pocket\n* Send to Nanome app Quick Drop to interact with in XR\n\nAlthough ESMFold can be queried directly from MARA, and a tool can be made to query AlphaFold directly, users may already have AlphaFold CIF files they want to use. An entire workflow to go from a predicted protein structure to a finished docking experiment can be completed using just a few simple natural language prompts. The final product can be downloaded or conveniently sent to a Nanome app Quick Drop for further analysis.\n\n**Load an AlphaFold CIF Structure and Prepare for Docking**\n\nA prompt of “Prepare for docking” was used when loading the CIF file. MARA then handled the file conversions necessary to produce a PDBQT file and also ran the tools necessary to define pockets within the protein scaffold. 
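Conceptually, that one prompt expands into a deterministic chain of tool calls. The sketch below models the chain with stub functions that return canned values; every name here is a hypothetical stand-in rather than MARA's actual API:

```python
# Hypothetical stand-ins for the chained tools MARA runs from one prompt;
# each stub returns a canned value so the chain can be followed end to end.
def convert_cif_to_pdbqt(cif: str) -> str:
    """Stand-in for the CIF -> PDBQT conversion step."""
    return cif.replace(".cif", ".pdbqt")

def find_pockets(pdbqt: str) -> list[dict]:
    """Stand-in for pocket detection; returns pocket ids and centers."""
    return [{"pocket_id": 1, "center": (1.0, 2.0, 3.0)}]

def dock(ligand_sdf: str, receptor_pdbqt: str, pocket_id: int) -> str:
    """Stand-in for the docking step; returns a handle to the result."""
    return f"docked:{ligand_sdf}@{receptor_pdbqt}#pocket{pocket_id}"

# One "prompt" expands into a deterministic chain of steps.
receptor = convert_cif_to_pdbqt("alphafold_model.cif")
pockets = find_pockets(receptor)
result = dock("anthracene_minimized.sdf", receptor, pockets[0]["pocket_id"])
print(result)
```

The value of MARA is that this chaining is inferred from natural language, so the user never wires the steps together by hand.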
For illustrative purposes, anthracene was drawn using the molecular drawing tool and imported into MARA as a SMILES string. MARA combines the use of multiple tools from a single prompt, “_minimize and add hydrogens_”, to produce an SDF file for docking experiments. \n\n![Using MARA to Prepare for Docking](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FT1_Prepare_for_Docking_GIF_85b514e90c.gif)\n\n**Define Pockets for Docking from a Tabular Output Created During Docking Preparation**\n\nDuring the docking preparation, MARA outputs a structure with docking locations and a matching table with pocket IDs and x, y, z coordinates. A prompt of “_use the sdf file to dock at the pdbqt file in pocket_id 1_” directed MARA to run a series of tools to dock the hydrogenated and minimized anthracene molecule at the prepared predicted protein structure. \n\n![Using MARA to Dock Based on Pockets](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FT2_Dock_Based_on_Pockets_found_GIF_817fad26ae.gif)\n\n\n**Send Docking Products to Local Drive or Nanome App Quick Drop**\n\nMARA integrates seamlessly into both your existing data infrastructure and the Nanome data infrastructure; easily send your MARA-produced structures to Nanome to interact with in XR. All files produced by MARA are also available to download directly to any device. \n\n![Using MARA to Send Docked Results to Nanome Vault](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FT3_Send_Docking_Products_to_Nanome_Vault_Directly_GIF_5181338e9f.gif)\n\n\n**Summary**\n\nMARA is here to help guide your enterprise scientific workflows. MARA is adaptive and can iterate through logical steps to find the best possible path. What tools will you create? How much time can you save?\n\nWant to try MARA out yourself? 
Click [here](https:\u002F\u002Fmara.nanome.ai\u002F) to begin with 50 free credits.\n","2024-08-20T09:48:31.891Z","2026-03-24T17:53:28.916Z","2024-08-20T16:44:22.684Z","2024-08-21","This blog post introduces an example workflow in the MARA Public Beta, a powerful tool designed to streamline Computer Aided Drug Design (CADD) and Structural Based Drug Design (SBDD) workflows. MARA simplifies the process of analyzing three-dimensional structures of biological macromolecules, which is crucial for drug discovery, understanding disease mechanisms, and developing therapeutic strategies. The post outlines a step-by-step MARA workflow, from loading AlphaFold structures and preparing proteins for docking to drawing and docking molecules, and sending results to the Nanome app for further analysis. MARA's natural language prompts significantly reduce task complexity, making it a valuable asset for CADD\u002FSBDD scientists.","CADD, SBDD, MARA, Public Beta, Nanome, scientific informatics, computational chemistry, cheminformatics, bioinformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry,","computer-aided-drug-discovery-and-design-through-mara",{"id":441,"attributes":442},22,{"title":443,"content":444,"createdAt":445,"updatedAt":446,"publishedAt":447,"date":448,"description":449,"keywords":450,"slug":451,"category":66},"Introduction to MARA Freemium","## An Introduction to MARA Freemium\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg 
src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F1_b3eadafe77.png\" style=\"max-width:50%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\nWe're thrilled to announce the public beta launch of MARA, a groundbreaking scientific informatics platform designed to leverage natural language to help you manage and optimize your workflows.\n\nIn this blog, we’ll take a deep dive into a few MARA tools currently in place, ready for you to explore and implement in your own projects. Whether you're looking to streamline your project management, automate repetitive tasks, or enhance team collaboration, MARA has a suite of solutions tailored to meet your needs.\n\nOver the coming weeks, we’ll be rolling out additional blog posts, tips, and best practices to help you make the most of MARA’s capabilities, from setting up your first project to customizing workflows that fit your unique processes.\n\n## What’s Included in MARA Public Beta?\n\n* 50 Free Credits: Each credit gives you access to one of MARA’s powerful tools.\n* Review & Create Tools: Use your free credits to run a variety of different tools or create up to 10 custom tools.\n\nHow to Get Started:\n1. Register: Create an account \u003Ca href=\"https:\u002F\u002Ft.dripemail2.com\u002Fc\u002FeyJhbGciOiJIUzI1NiJ9.eyJhdWQiOiJkZXRvdXIiLCJpc3MiOiJtb25vbGl0aCIsInN1YiI6ImRldG91cl9saW5rIiwiaWF0IjoxNzIzNjU4NDE2LCJuYmYiOjE3MjM2NTg0MTYsImFjY291bnRfaWQiOiI0MjkwNjc2IiwiZGVsaXZlcnlfaWQiOiJxZ2dnNTNpNTBwZzltMDY0eW84ayIsInVybCI6Imh0dHBzOi8vbWFyYS5uYW5vbWUuYWkvP3V0bV9zb3VyY2U9ZW1haWwmdXRtX21lZGl1bT1lbWFpbCZ1dG1fY2FtcGFpZ249bWFyYS1sYXVuY2gmX19zPXFuampoaXYzMnJiMXdraWZ3aHNtIn0.BelzzoFHDQLOcglRaNJTp45PZxo57S-MEujZkV1JEWA\">here\u003C\u002Fa>.\n2. Explore Tools: Use MARA’s state-of-the-art reasoning and planning capabilities to assist with data analysis and workflow automation, and discover how MARA can streamline your research processes.\n3. 
Maximize Efficiency: Let MARA automate your repetitive tasks, allowing you to focus on high-impact activities.\n4. Leave Feedback: Help us improve MARA.\n\n## Enterprise Integration\n\nEasily connect MARA with your existing tools and internal databases, all while keeping your data secure. Contact our team (hello@nanome.ai) to inquire about an Enterprise deployment.\n\n## Tools Available in MARA Public Beta\n\nIn this section, we’re excited to showcase some of the tools and features available in the public beta release of MARA. Our goal is to give you a comprehensive overview of the resources at your disposal, helping you to make the most of your experience with MARA from day one. \n\n\n### Download and Visualize a PDB file\n\nThis tool will allow you to download a file from public databases, such as the RCSB PDB, PubMed, UniChem, Materials Project, and ChEMBL. Just ask MARA for your molecule of interest and a structure viewer window will open showing the molecular structure.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FRetrieve_PBD_5_CEO_GIF_1_08c9617f62.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n\n### Create a Data Table with Chemical Properties of the Ligand\n\nYou can also ask MARA to generate the chemical properties of a molecule from either a SMILES string search or from a downloaded PDB file. This will create a data table that can be manipulated with natural language. \n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FDefine_Chemical_Properties_GIF_197a2b7dd3.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n### Show Molecules Similar to this Ligand\n\nAnother great tool within MARA is the ability to search for molecules similar to your ligand using natural language. 
This can be used with previous tools to create a data table of similar ligands.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FSimilar_Molecules_Search_GIF_1_0b66692a6a.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n\n### Change the Color of the Protein in the Structure Viewer Using Natural Language\n\nUsing natural language, you can alter the view of a protein within the Structure Viewer window. You can change the color of individual residues or entire chains depending on the structure you are viewing.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMol_Star_Viewer_Edit_GIF_1_ba5fd27c81.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n\n### Visualize a SMILES String in 2D\n\nMARA offers a powerful feature that lets you draw a molecule and seamlessly convert it into a SMILES string. Once you've created your molecule, you can visualize it in 2D and also retrieve its chemical properties, providing a comprehensive understanding of the structure you’ve designed.\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FSMILES_String_Properties_GIF_1_de9f0a22a0.gif\" style=\"max-width:100%; height:auto;\" class=\"center\">\u003C\u002Fa>\n\n## Future Blogs\n\nStay tuned and keep this space bookmarked! We have an exciting lineup of blog posts on the horizon that will delve into specialized workflows tailored to meet the needs of enterprise professionals across various domains. In the coming weeks, we'll be rolling out additional blog posts specifically designed for bioinformaticians, cheminformaticians, computer-aided drug design (CADD) experts, and structure-based drug design (SBDD) practitioners. 
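As a small technical footnote to the download-and-visualize tool above: fetching a public structure boils down to one HTTPS GET against RCSB's file service. The URL pattern below is RCSB's public download endpoint; the helper only builds the URL so it can be checked without network access:

```python
# RCSB's public file-download pattern. The helper only constructs the URL;
# the actual fetch is left commented out since it needs network access.
def rcsb_pdb_url(pdb_id: str) -> str:
    """Build the public RCSB download URL for a PDB entry."""
    return f"https://files.rcsb.org/download/{pdb_id.upper()}.pdb"

url = rcsb_pdb_url("5ceo")
print(url)  # -> https://files.rcsb.org/download/5CEO.pdb

# To fetch for real:
#   from urllib.request import urlopen
#   pdb_text = urlopen(url).read().decode()
```

MARA performs this kind of retrieval for you when you simply ask for a molecule of interest by name or ID.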
\n\n\n\n","2024-08-12T22:43:30.677Z","2026-03-24T17:53:27.922Z","2024-08-14T17:15:47.088Z","2024-08-14","We're thrilled to announce the public beta launch of MARA, a groundbreaking scientific informatics platform designed to leverage natural language to help you manage and optimize your workflows.\n\nIn this blog, we’ll take a deep dive into a few MARA tools currently in place, ready for you to explore and implement in your own projects. Whether you're looking to streamline your project management, automate repetitive tasks, or enhance team collaboration, MARA has a suite of solutions tailored to meet your needs.","MARA, public beta, science, informatics, LLMs, AI,","introduction-to-mara-freemium",{"id":453,"attributes":454},21,{"title":455,"content":456,"createdAt":457,"updatedAt":458,"publishedAt":459,"date":460,"description":461,"keywords":462,"slug":463,"category":66},"MARA Public Beta Coming Soon ","### Introducing the MARA Public Beta\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F1_b3eadafe77.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\n**Exciting Developments with MARA**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F2_be83909738.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nSince our last update, we’ve been hard at work refining our newest product – a scientific discovery copilot called MARA (Molecular Analysis and Reasoning Assistant). For a closer look at how MARA can aid your scientific research, check out our new [informational page](https:\u002F\u002Fnanome.ai\u002Fmara).\n\n**MARA Public Beta: Coming Soon**\n\nWe're thrilled to announce that MARA will soon be available in Public Beta. 
To get everyone started, each user who signs up **will receive free credits.** Currently, each tool run will cost one credit, though this is expected to evolve based on user feedback and data analysis. Additional credits will be available for purchase.\n\n**Credits and Tools: Transparent and User-Centric**\n\nUnderstanding how many credits each prompt will consume is crucial. We're working on a feature to make this information clear and accessible, helping you plan your usage efficiently.\n\n**Engage with MARA: Try, Feedback, and Build**\n\nThe open beta phase is not just about testing; it's about interaction. We encourage you to:\n\n\n\n* **Try MARA**: Explore its capabilities firsthand.\n* **Give Feedback**: Help us refine and perfect MARA.\n* **Build Your Own Tools**: Customize and extend MARA to suit your needs.\n\n\n### **MARA's Proficiency**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F3_19b68b10be.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F4_199764f043.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nMARA showcases robust capabilities in various scientific informatics workflows, which will be further explored in our forthcoming blog series. A key area to explore is the workflows and tools section on our [informational page](https:\u002F\u002Fnanome.ai\u002Fmara\u002F), which outlines supported functionalities. While computationally intensive tools like molecular dynamics and docking are not yet available, the ability to fully prepare molecules for these tools is! MARA offers the unique opportunity to develop custom tools and workflows, and we encourage everyone to leverage this flexibility to tailor solutions that meet their specific research needs. 
Stay tuned for more insights on how MARA can revolutionize your scientific processes.\n\nFun Fact: there are already 170+ tools and we’re adding more EVERY week!\n\n**Beta Considerations and Privacy**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F5_07f4162f48.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\nPlease remember, as this is a beta version, occasional hiccups are expected. Our privacy disclosure, available [here](https:\u002F\u002Fnanome.ai\u002Fterms\u002F), outlines how we handle your data with care.\n\n**Secure and Offline Deployments**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F6_2d31cfc8a5.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\nFor those requiring secure or offline deployments of MARA, options are available. Click [here](https:\u002F\u002Fhome.nanome.ai\u002Fquote) for more details on setting up MARA in a secure environment.\n\n**Upcoming Resources**\n\nWe're preparing more guides and tips to help you get the most out of MARA. Whether you're integrating MARA into your existing workflow or starting fresh, these resources will provide invaluable insights into utilizing MARA effectively.\n\n\nCan a bench scientist wrangle data like a database expert?\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F7_94769d8f69.png\" style=\"max-width:100%; height:auto;\">\u003C\u002Fa>\n\n\n[Join us for a live webinar](https:\u002F\u002Fmeet.nanome.ai\u002Fdata-democratization-with-nanome-mara?source=social1&utm_content=302567283&utm_medium=social&utm_source=twitter&hss_channel=tw-3279899875) on August 22nd and find out how! 
Quillon Simpson, PhD Chemistry, will talk about Data Democratization with Nanome MARA.\n\nStay tuned for more updates and get ready to explore the possibilities with MARA in Public Beta!\n","2024-08-05T11:07:27.369Z","2026-03-24T17:53:27.805Z","2024-08-05T11:15:50.039Z","2024-08-05","Discover the groundbreaking MARA Public Beta: Nanome's latest innovation in scientific informatics! Join us as we unveil the enhanced capabilities of MARA, offering a unique blend of customizability and user-centric tools for scientists. Dive into our blog to explore how MARA can streamline your research workflows, get insights on securing free credits for early users, and learn about our commitment to privacy and secure deployments. Don't miss out on the opportunity to shape the future of scientific computing—try MARA, provide feedback, and customize your tools. Sign up now to transform how you approach scientific challenges with MARA!","MARA, Public Beta, Nanome, scientific informatics, computational chemistry, cheminformatics, bioinformatics, drug discovery, drug hunter, molecular design, structural biology, user feedback, customizable tools, scientific workflows, molecular dynamics, molecular docking, scientific computing, data democratization, scientific research tools, beta testing, secure deployments, virtual research environment, chemistry software, biology software, data analysis tools, AI in science, enterprise scientific platform, VR in science, lead optimization, target identification, pharmacophore modeling, QSAR, ADMET, drug design, medicinal chemistry","mara-public-beta-coming-soon",{"id":465,"attributes":466},20,{"title":467,"content":468,"createdAt":469,"updatedAt":470,"publishedAt":471,"date":472,"description":473,"keywords":474,"slug":475,"category":66},"v1.24.6 Patch Release","In scientific discovery, collaboration is one of the most essential tools in a scientist's arsenal. 
Whether your field is structural biology, protein engineering, or computational chemistry, the ability to visualize and share your work and ideas with peers is invaluable. With the release of Nanome 1.24.6, configuring the platform for seamless meetings is now easier than ever. This new version gives you greater control over participants in your virtual room, ensuring everyone sees exactly what you want them to see. This enhanced functionality will revolutionize how you demonstrate your work in XR, opening new avenues for research and collaboration.\n\nIn addition to various bug fixes, we are excited to announce added support for additional MOE fields, including grouping, residue serial numbers, BFactor, and ribbon style. (Please note: MOE file support is in beta)\n\nIn this blog post, we’ll explore the key improvements in the latest Nanome patch and how they will benefit users in the future.\n\n\n## Room Control\n\n\n### Room Positioning\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_d142cb90bd.gif)\n\nThis feature allows the presenter, or host, to control where users load in, ensuring a seamless transition into your meetings. No longer will you need to spend time making sure all your colleagues can see the structure from the correct angle or visualize the binding pocket you're discussing. Moreover, you can configure it so that every time you load into a room, all guests teleport to your location. This way, you can instantly start and synchronize your meetings with everyone in the same position, visualizing the same content.\n\n\n### Hide all users on entry\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_65c0a3581c.gif)\n\nWe’ve also added a great feature to hide users on entry, which works well in conjunction with the new room positioning tools. This allows the presenter to gather all guests in the room to one specific location once they load into the XR session and make their avatars invisible. 
This ensures that everyone can see the molecule or binding site under discussion without any interference from other users' avatars. Once again, this makes conferences or presentations flow even better, letting you demonstrate your work clearly.\n\n","2024-07-26T19:37:25.010Z","2026-03-24T17:53:29.661Z","2024-07-26T19:37:33.660Z","2024-07-26","Discover the latest enhancements in scientific collaboration with Nanome V1.24.6. Our new patch simplifies virtual meetings with improved room control, ensuring seamless participant positioning and visibility. Explore how these updates, including support for additional MOE fields and various bug fixes, revolutionize structural biology, protein engineering, and computational chemistry. Elevate your research and demonstrations in XR like never before. Read our blog for an in-depth look at these game-changing features.","Nanome V1.24.6, scientific collaboration, structural biology, protein engineering, computational chemistry, computer-aided drug design (CADD), molecular visualization, virtual reality (VR) for science, extended reality (XR), MOE fields support, grouping in MOE, residue serial numbers, BFactor, ribbon style, virtual room control, room positioning, virtual meetings, hide users on entry, binding pocket visualization, molecule demonstration, medicinal chemistry, drug discovery, molecular modeling, structural analysis, scientific research tools, virtual conferences, XR presentations, real-time collaboration, scientific demonstrations, research synchronization, virtual avatars in science, seamless transition in XR, teleportation in virtual meetings, enhanced scientific workflows, visualization tools for scientists, Nanome platform updates, structural bioinformatics, protein-ligand interactions, drug design software, XR technology in biopharma, molecular graphics, structural bioinformatics tools, structure-activity relationship (SAR), SAR discovery, SAR analysis, SAR modeling, SAR tools, SAR optimization, ligand-based drug design, 
receptor-based drug design, molecular docking, pharmacophore modeling, QSAR analysis, bioisosteric replacement, lead optimization, high-throughput screening (HTS).","v1.24.6-patch-release",{"id":477,"attributes":478},19,{"title":479,"content":480,"createdAt":481,"updatedAt":482,"publishedAt":483,"date":484,"description":485,"keywords":486,"slug":487,"category":104},"Nanome on Apple Vision Pro Roadmap","Written by Sam Hessenauer and Keita Funakawa\n\n_Summary: Awesome device but it’s still very new, Nanome v2.x aiming to support EOY2024 but there are external dependencies since the device is so new and the standard frameworks aren’t yet fully supported._\n\n## Prototypes for AVP \n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fnanospin\u002Fid6478713911\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F2_copy_f10a70a2f9.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_6b62b7f9c8.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nJust a few months ago, we introduced two exciting prototypes on the Apple Vision Pro—NanoSpin and Calcflow. These applications showcase the innovative possibilities of integrating spatial computing with molecular visualization and 3D mathematics. If you haven’t had the chance to experience these groundbreaking tools, we highly encourage you to explore them. Both NanoSpin and Calcflow are available now on the Vision App Store, offering a unique glimpse into the future of immersive scientific and mathematical exploration. 
Don’t miss out on the chance to interact with molecules and math in a completely new and dynamic way!\n\n## \"When Nanome on AVP?\"\n\nIn the meantime, we’ve been flooded with questions about when the full version of Nanome and Nanome 2.x will be available on the Apple Vision Pro. We appreciate your enthusiasm and are eager to bring you our comprehensive platform in this advanced format. With this blog, we wanted to share a bit more about our plans for a full port of Nanome on Apple Vision Pro. Our team is focused on tailoring our applications to align with the unique features of the Apple Vision Pro, ensuring that our launch meets our users’ expectations for seamless integration and enhanced functionality. \n\nThe Vision Pro (AVP) is a very cool device. It’s the first of its kind and has some very different approaches to the spatial computing paradigm. Overall, it’s a really great headset, and coupled with the Apple ecosystem, it offers unparalleled integration and flow between devices, making workflows seamless. It’s also an extremely personal device. 
It heavily relies on calibration for eyes and hands as well as custom prescription inserts for people with glasses, and ultimately connects to your iCloud for maximum utility.\n\n## Divergent UX from Industry Standards \n\u003Ca href=\"https:\u002F\u002Fblog.matryx.ai\u002F1-23-2-patch-nanome-roadmap-for-the-rest-of-2022-and-nanome-2-0-2206c2e1a7ab\">\u003Cimg src=\"https:\u002F\u002Fwww.cnet.com\u002Fa\u002Fimg\u002Fresize\u002F2edd7881176d6b2e7cb0d74e15afff85e3098a27\u002Fhub\u002F2016\u002F05\u002F04\u002F06c1edc7-ab4f-49ab-8f15-82436c82589d\u002Fhtc-vive-vs-oculus-4060-013.jpg?auto=webp&precrop=2092,1467,x108,y0&width=1200\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nWith Vision Pro’s unique and new approach to both user experience and how they envision the spatial computing operating system, there are many divergent aspects of the technology compared to what’s been essentially standardized in the current XR industry (Meta Quest, Vive, Pico, etc). This means that it is not at all a simple 1:1 port of the Nanome 1.x software to support this new and interesting device.\n\nNot only do many typical frameworks and libraries not directly translate over to the AVP, but major components of the user experience and even molecular rendering are impacted.\n\nMany of you know that we’ve been building Nanome 2.x, the second generation of our XR platform, which will have its first publicly available versions in Q2 2024. 
We are working on supporting the AVP with Nanome 2.x, but it still has many external dependencies that will ultimately drive the timeline of full availability.\n\n## Journey to a full port with Tech Demos\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FpGz2uFsoU7ADygLQns\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-pGz2uFsoU7ADygLQns\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\nThat being said, we are developing a few tech demos that we will make available (though they will most likely not have full enterprise product value), and we highly encourage everyone to actually try out the Vision Pro at an Apple Store when it becomes available in your region. It truly is a fascinating device, and yet it is only generation 1.\n\nThe future is bright and we are excited to see technological progress and excitement alike from many of the world’s most curious scientists. ","2024-05-15T23:22:54.814Z","2026-03-24T17:53:29.629Z","2024-05-15T23:22:56.043Z","2024-05-21","Discover the future of molecular visualization and computational chemistry with Nanome on Apple Vision Pro. Explore our groundbreaking prototypes, NanoSpin and Calcflow, now available on the Vision App Store. Learn about our roadmap for the full integration of Nanome v2.x by the end of 2024, despite the new device’s unique challenges and external dependencies. 
Experience a new era of immersive scientific exploration with Nanome’s advanced applications tailored for the innovative Apple Vision Pro.","Nanome, Apple Vision Pro, molecular visualization, computational chemistry, drug discovery, medicinal chemistry, structural biology, drug hunter, spatial computing, XR platform, NanoSpin, Calcflow, Vision App Store, immersive science, technology integration, Apple ecosystem, tech demos, AVP support","nanome-on-apple-vision-pro-roadmap",{"id":489,"attributes":490},18,{"title":491,"content":492,"createdAt":493,"updatedAt":494,"publishedAt":495,"date":496,"description":497,"keywords":498,"slug":499,"category":66},"MARA Q2 2024 Update Blog","Written by Sam Hessenauer, CTO & Co-founder of Nanome\n\nSince the announcement of MARA in December 2023, we’ve made some incredible progress developing MARA. \n\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmara_e6548e3_8d73f2d806.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n## Overcoming Initial Challenges in MARA’s Development: Addressing LLM Limitations and Enhancing Functional Integration\n\n\nIn Q1 2024, there were a few pieces of the puzzle that needed to be solved to bridge the gap between our vision of MARA and the prototypes we’ve made so far. LLMs have intrinsic biases from pre-training data, their context windows (input length) are limited, they are trained only to a snapshot in time, and hallucination problems blurred the line between fact and fiction. 
There is also no ability for the LLMs to actually use tools on their own; you essentially needed to use software engineering methods to convert outputs into a rigid structure (known as output parsing) and handle triggering the appropriate functionality.\n\n\u003Ca href=\"https:\u002F\u002Fwww.langchain.com\u002F\">\u003Cimg src=\"https:\u002F\u002Fmiro.medium.com\u002Fv2\u002Fresize:fit:4800\u002Fformat:webp\u002F1*1DBe4cCQYfpM0oNXl_kH2w.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\n## Enhancing LLM Efficacy with RAG and Agents: Pioneering Solutions for Bias Reduction and Tool Utilization\n\n\nLangChain and others started to address these issues head-on, most notably with Retrieval Augmented Generation (RAG) and Agents. RAG enables you to chunk and vectorize content and then inject relevant chunks\u002Fsnippets into the prompt, reducing reliance on the LLM’s pre-trained biases and hallucinations and letting it act as a flexible summarizer of authentic, correct data that you determined. Agents essentially equip the LLMs themselves with deterministic tools: we can write code functions that have logic, describe their use, and the LLMs will use reasoning to decide which tool to use. \n\n## Scalable Solutions and Reliable Execution: Advanced Problem-Solving from Prototype to Platform\n\nWe ran into (and solved) many challenges throughout last year, everything from scaling from a handful of cheminformatics tools to hundreds, to reliable planning and execution, to workflow creation, hallucination mitigation, and much more. We go as far as evaluating what’s being done on the fly and flagging if something doesn’t look right. That, alongside very clear communication of what tools are being used and what the inputs and outputs are, gives our scientist users the confidence they need when evaluating the legitimacy of the outputs of our new platform. 
This is something every other software package falls short on.\n\nMARA essentially brings a completely internal ChatGPT-like interface to your internal scientific databases and informatics tools. It massively drops the barrier to entry for scientists to rapidly answer questions, expand their hypotheses, and trigger advanced informatics workflows like never before. \n\n## The Core of MARA: Tools, Knowledge, and Workflow Integration: Building a Versatile and Dynamic Scientific Environment\n\n\nMARA is primarily driven by 3 concepts: Tools, the Knowledge Base, and Workflows. MARA is only limited by the tools it has at its disposal; you can bring your internal tools in via:\n\n* REST API: Create a tool that can hit any API endpoint and return any type, from values to images to files.\n* SQL Query Templates: Create a tool with a pre-set SQL query and its inputs, or describe the query with natural language. This is great for repeatable SQL queries that only have input parameter variations.\n* Python Snippets: Quickly type out your latest idea in Python code using our embedded editor and make it a tool to be triggered in a sandboxed container (lambda-style). \n\nOf course, for every conversation, MARA will automatically curate a dataset to be used immediately for data analysis. Users can easily chat with their data or even make modifications to the dataset via natural language. This makes it easy for more users to focus on the science and not the menial parts of data science. \n\nWhile MARA is primarily a web-based platform, it can also be interacted with via API, making it easy to trigger tools and workflows from your company’s internal tools or even from directly within an interactive Jupyter notebook. 
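The three tool types described above all reduce to the same pattern: a plain function paired with a description that an agent can read, pick, and call by name. Here is a minimal, hypothetical sketch of that pattern in plain Python; the `tool` decorator, `TOOLS` registry, and `run_tool` dispatcher are illustrative names, not MARA's actual API:

```python
# Hypothetical sketch of "register a Python snippet as a described tool".
# Nothing here is MARA's real API; it only illustrates the pattern the
# post describes: a tool is a function plus a description an agent reads.
TOOLS = {}

def tool(description):
    """Decorator that registers a function as a named, described tool."""
    def register(fn):
        TOOLS[fn.__name__] = {"fn": fn, "description": description}
        return fn
    return register

@tool("Mass in g/mol of n water formula units (n * 18.015)")
def water_mass(n):
    return round(n * 18.015, 3)

def run_tool(name, **kwargs):
    """Dispatch a call the way an agent would after choosing a tool."""
    return TOOLS[name]["fn"](**kwargs)

print(run_tool("water_mass", n=3))  # 54.045
```

A REST or SQL-template tool would slot into the same registry, with the function body making an HTTP call or filling query parameters instead of computing directly.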
In the web interface, we’ve included a few quality-of-life features for scientists:\n\n\n\n* Directly upload molecular structural files such as PDB, MMCIF, SDF, and more\n* Draw a small molecule using a 2D Chemical Drawing tool\n* Embedded Molecular Viewer for viewing molecular structural files such as MMCIF, PDB, and SDF files\n\n## Ensuring Security and Control with MARA: An Enterprise-Ready Platform for Scientific Inquiry\n\n\nMARA is designed to be an enterprise-ready platform deployed completely behind your organization’s IT firewall and using the latest open-source Large Language Models to power it. That means you can sleep easy knowing that all questions and conversations scientists ask MARA will be completely controlled by your organization. \n\nThis Scientific Co-pilot is part of an entirely new class of tools called LLM-enabled applications, and we expect to see many of the world’s industries get rapidly transformed by these tools over the next decade.\n\n## Validating MARA’s Impact: Early Pilots and Positive Feedback: Showcasing Future Possibilities and Real-World Applications\n\nWe’ve been in an early pilot with several customers and the feedback has been very positive. Now that we’ve made some incredible progress, we’d like to start sharing what’s possible with the world. Keep an eye out on our socials as we showcase impressive use cases that are now incredibly simple to achieve with the MARA platform.\n\nIf you or your organization are interested in using MARA, even if just for a taste of the future, please reach out to us. We’d love to hear your feedback and make rapid improvements, which will accelerate our journey to our ultimate goal of equipping every scientist with their own Jarvis for Molecules.\n","2024-05-15T22:48:37.252Z","2026-03-24T17:53:29.585Z","2024-05-15T22:59:43.523Z","2024-05-15","Explore the Q2 2024 update on MARA, the transformative platform reshaping scientific research with advanced LLM technologies. 
Since its introduction in December 2023, MARA has evolved rapidly, enhancing its ability to bridge the gap between traditional scientific inquiry and automated, intelligent data analysis. This blog post delves into the recent advancements and solutions to challenges such as bias mitigation, hallucination issues, and workflow optimization. Discover how MARA’s integration of Retrieval Augmented Generation and tool-equipped Agents empowers scientists to engage with internal databases and execute complex informatics tasks with unprecedented ease. Learn about the new functionalities added to the MARA platform, including enhanced SQL query templates, Python snippet integration, and direct interactions via API. Perfect for enterprises, MARA ensures data security behind your organization’s IT firewall, offering a scalable, reliable solution for driving scientific innovation. Stay tuned for use cases and feedback from early adopters that underscore MARA’s role as a game-changer in the world of scientific research","drug discovery, medicinal chemistry, cheminformatics, structure-based design, LLM technologies, scientific research, data analysis, enterprise IT security, Retrieval Augmented Generation, tool-equipped Agents, API integration, Python snippets, SQL query templates, workflow optimization, scientific databases, advanced informatics workflows, large language models, web-based platform, data interaction, scientific innovation","mara-q2-2024-update-blog",{"id":501,"attributes":502},17,{"title":503,"content":504,"createdAt":505,"updatedAt":506,"publishedAt":507,"date":508,"description":509,"keywords":510,"slug":511,"category":66},"Nanome Apple Vision Pro & 2.0 Dev Blog April 2024","Thank you for being a vital member of the Nanome community! 
Today, we're excited to give you the first look at Nanome through the Apple Vision Pro, and unveil the much-anticipated Nanome 2.0 update.\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FpGz2uFsoU7ADygLQns\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-pGz2uFsoU7ADygLQns\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\n\nYou can see in this brief preview that not only is Nanome a bounded volume app within visionOS, but the primary way in which the contents of the volume are controlled is through our web app. That’s right, currently, home.nanome.ai allows you to manage license details and vault.nanome.ai helps with file management. With Nanome 2.0, we're merging these two web portals into one grand home.nanome.ai, to offer unparalleled customization of your Nanome experience, all accessible via the web. Nanome 2.0 will still have in-app menus in XR to be able to do all of this too, but this new web app means you're not confined to in-app menus; you can tailor your sessions from anywhere, even outside of XR. And while we could definitely hint at a deeper integration with MARA, our AI-powered research assistant, consider this a sneak peek, and we’ll have more to share when we’re ready. We’re so incredibly excited about the possibilities of Nanome 2.0, Apple Vision Pro, and MARA. We can’t wait to share more in the future!\n\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002F0JYzwPT9JLLcqGOZ21\" width=\"480\" height=\"480\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-0JYzwPT9JLLcqGOZ21\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\nIn this sneak peek of Nanome 2.0 on the Quest 3, you can see our much revamped menu system and project organization. 
\n\nWe'll have more to share in the coming months, so stay tuned for more! ","2024-04-12T21:58:12.844Z","2026-03-24T17:53:29.293Z","2024-04-12T21:58:13.928Z","2024-04-12","Explore the cutting-edge advancements in computational chemistry and drug discovery with Nanome 2.0, now seamlessly integrated with Apple Vision Pro. This post offers an exclusive first look at how Nanome, a pioneer in in silico techniques and CADD (Computer-Aided Drug Design) modeling, is transforming medicinal chemistry. Discover the enhanced capabilities of our web application that merges home.nanome.ai and vault.nanome.ai for superior file and license management, accessible both within and outside of XR environments. Learn about the potential of deeper integration with MARA, our AI-powered research assistant, and stay tuned for more updates on how Nanome continues to empower drug hunters and researchers globally. Join us to see how Nanome 2.0 and the Apple Vision Pro are setting new standards in the quest for innovative drug discovery solutions.","computational chemistry, medicinal chemistry, drug discovery, in silico, CADD modeling, computer-aided drug design, Nanome 2.0, Apple Vision Pro, XR technology, web application integration, file management, license management, AI-powered research, drug hunter, pharmaceutical innovation","nanome-apple-vision-pro-and-2.0-dev-blog-april-2024",{"id":513,"attributes":514},16,{"title":515,"content":516,"createdAt":517,"updatedAt":518,"publishedAt":519,"date":520,"description":521,"keywords":522,"slug":523,"category":66},"Calcflow comes to the Apple Vision Pro","We’ve come a long way.\n\nAlmost eight years ago, we began a journey to make 3D mathematics more accessible and more fun. Working alongside a summer Vector Calculus course led by Dr. John Eggers of UC San Diego, we developed the initial pilot of what would become Calcflow, the 3D graphing calculator for virtual reality. Originally released (and still available!) 
for PCVR through Steam and Oculus, Calcflow brought tools to dive into vector calculus: you could plot two-variable functions and see the geometric interpretation of double integrals; you could plot 3D vector fields and examine flow lines; and the hallmark tool, you could plot parametrized surfaces and gain intuition behind the concept of a “2-variable input, 3-variable output” function.\n\n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_86a24f682f.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n_Dr. John Eggers prototyping a 3D graphing calculator with Google’s Tilt Brush circa 2016._\n\nNow, eight years later, we’re excited to announce that we’re entering the new era of spatial computing by bringing you Calcflow on the Apple Vision Pro! [Available now](https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933) on the Vision App Store as “Calcflow Vision”, our initial v1.0 release brings you a calculator to plot two-variable functions. (We’re working on the rest!)\n\n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_6b62b7f9c8.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nWe’ve come a long way - and yet, we still have so much more to offer. We hope you’ll continue to follow us as we realize our “vision” for a platform that brings 3D mathematics to life.\n\n**What is Calcflow Vision?**\n\nCalcflow Vision is our current app available for the Apple Vision Pro, and we’ll eventually be adding in all the great features from the original PCVR Calcflow - and more! Currently, Calcflow Vision provides you with an immersive interface for visualizing two-variable functions, of the form z=f(x,y). 
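To make the “2-variable input, 3-variable output” idea concrete, here is a small worked example of our own (not from the original course material): the paraboloid, written both as a graph and as a parametrized surface:

```latex
z = f(x, y) = x^2 + y^2
\qquad\Longleftrightarrow\qquad
\mathbf{r}(u, v) = \left(u,\; v,\; u^2 + v^2\right), \quad (u, v) \in \mathbb{R}^2
```

Plotting \(\mathbf{r}\) over a square domain reproduces the same surface as the two-variable plotter; the payoff of the parametrized form is that it also covers surfaces that are not graphs of any f(x, y), such as spheres and tori.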
You can grab the graph and rotate it, or physically walk around and inspect it. Our calculator includes common constants and functions found in science and engineering, and we even have some preset graph models to get you started.\n\n\n\nWhether you're a student striving to grasp complex mathematical concepts or a seasoned mathematician looking to visualize intricate functions, our Calcflow Vision’s 3D graphing calculator offers invaluable insights and enhances the learning experience. By providing a visual representation of mathematical functions in a spatial context, users can better comprehend the relationships between variables and identify patterns that may not be immediately apparent in traditional 2D representations. The immersiveness of Calcflow Vision brings users a whole new level of interactivity with mathematics, giving you a fresh perspective on mathematical concepts and allowing you to gain a deeper understanding of 3D mathematics.\n\n\n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_05c3497151.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n**A big Thank You to everyone who supported the initial Calcflow project**\n\nThe initial development of Calcflow could not have been possible without the support of key individuals from UC San Diego. As previously mentioned, Dr. Eggers and his Math 20E class of Summer 2016 gave us the initial ideas and testing opportunities to build out what would eventually become Calcflow. Even before then, we had help from the Physical Sciences Division through funding and mentorship.\n\nOne of our biggest champions early on was the late [Dr. Jeff Remmel](https:\u002F\u002Fmathweb.ucsd.edu\u002F~mathconf\u002Fmemorials\u002Fjeffrey-remmel\u002F), who was interested in the development of a virtual reality tool to support mathematics education. While he may no longer be with us, Dr. 
Remmel’s passion and commitment for mathematics education continues to drive Calcflow.\n\nProviding the funding for the grant we received in 2016 was [Dr. Richard Libby](https:\u002F\u002Fwww.linkedin.com\u002Fin\u002Frichard-a-libby-ph-d-a23535\u002F), who also saw a vision for both a VR mathematics tool and a VR molecular visualizer. Not only did he help with the initial project, but Dr. Libby continues to support Nanome Inc as one of our valued investors.\n\nWithout the support of all these individuals, Calcflow would not be where it is today. Thank you to all of our investors, advisors, and users for your support of our product and our organization.\n\n**Calcflow and Nanome - Two Peas in a Pod**\n\nYou might be familiar with Nanome Inc’s flagship product - [Nanome](https:\u002F\u002Fnanome.ai\u002F)! Nanome (the software) is our enterprise-grade XR suite built for drug discovery and design, used by both industry pharmaceutical companies and universities all over the world. And in case you haven’t seen yet, we’ve begun our journey to bring the core Nanome platform to the Apple Vision Pro - learn more about our demo app, NanoSpin, [here](https:\u002F\u002Fnanome.ai\u002Fblog\u002Fnanospin:-a-glimpse-into-nanome's-vision-for-the-future-with-apple-vision-pro).\n\nNow, you might be wondering… why do we have a math software and a chemistry software? Mathematics lays the theoretical foundations for much of the physical sciences, and the physical sciences then allow us to describe the various phenomena we see everyday. As such, Nanome’s core vision is to build the ultimate interface for engaging with STEM and providing a platform to experience math and science on a deeper level. 
From holding a molecule in your hand or looking at a simple paraboloid, to designing the next life-saving drug or studying flux through a vector field, Nanome seeks to make STEM more intuitive, more immersive, and more accessible to researchers and learners alike.\n\nIf you have Apple Vision Pro, be sure to [download Calcflow](https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fcalcflow-vision\u002Fid6479180933) today! \n\nWe will continue to bring over more of the features you love from the original Calcflow, like parametrized surfaces, vector fields, and even a taste of linear algebra. Stay tuned!\n","2024-03-13T03:26:03.459Z","2026-03-24T17:53:28.595Z","2024-03-13T03:31:48.829Z","2024-03-12","Introducing Calcflow on Apple Vision Pro by Nanome, a breakthrough in 3D graph visualization for both educational and industrial applications. Now available on the Vision App Store, Calcflow Vision offers an immersive platform for exploring complex two-variable functions with ease. Whether you're a student mastering mathematical concepts or a professional in need of advanced visualization tools, this app facilitates a deeper understanding of spatial data through interactive, hands-on manipulation of 3D graphs. 
Embrace the future of visualization technology with Calcflow Vision, designed to revolutionize STEM learning and industry practices alike.","Calcflow, Apple Vision Pro, 3D graph visualization, immersive technology, spatial data visualization, two-variable functions, STEM education, professional visualization tools, interactive 3D graphs, Nanome, educational technology, industry visualization, mathematics visualization, spatial computing, Vision App Store, advanced visualization technology, hands-on learning, STEM applications, 3D mathematical exploration, Calcflow Vision.","calcflow-comes-to-the-apple-vision-pro",{"id":525,"attributes":526},15,{"title":527,"content":528,"createdAt":529,"updatedAt":530,"publishedAt":531,"date":532,"description":533,"keywords":534,"slug":535,"category":104},"NanoSpin: A Glimpse into Nanome's Vision for the Future with Apple Vision Pro","\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fnanospin\u002Fid6478713911\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002F2_copy_f10a70a2f9.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nEarlier this week, we launched NanoSpin for Apple Vision Pro – A fun app that lets you spin a variety of different types of molecules. \n\nAs we’re hard at work on porting our full Nanome platform to Apple Vision Pro, we've created a lite experience that allows you to interact with molecules and chemistry in a way never before possible on any Apple device.\n\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FTUHmoWyW1s3rxlAC4g\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-TUHmoWyW1s3rxlAC4g\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n## **Step into the Sci-Fi Reality**\n\nImagine twirling a DNA molecule with the finesse of Tony Stark, surrounded by the sleek interface of the Apple Vision Pro. 
NanoSpin brings this science fiction dream to life, empowering you to interact with molecules like caffeine or DNA as if you were a superhero in a sci-fi blockbuster.  \n\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FSdM76tSsYnXE0qr89E\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-SdM76tSsYnXE0qr89E\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n## **Our first Bounded Volume App** \n\nUnlike existing AR\u002FVR devices such as the Meta Quest headset, visionOS introduces an innovative approach to spatial computing by offering \"bounded volume apps.\" This feature allows applications to be displayed in a designated bounded cube or space, rather than occupying the full immersive environment. This design choice opens up new possibilities for multitasking and interaction within a virtual environment, making visionOS a groundbreaking spatial computing operating system for immersive applications. As a result, for the first time, Nanome presents a multitasking experience that integrates seamlessly with your daily digital interactions within Apple Vision Pro. Whether you’re browsing Safari, watching spatial videos, or reading your favorite sci-fi novel, NanoSpin allows you to have spinning molecular models at your fingertips, enhancing your digital landscape in a dynamic, interactive way.
This multi-layered experience is a first-of-its-kind departure from our existing applications.\n\n\n## **A New Era for Nanome**\n\nNanoSpin is a testament to several firsts for Nanome:\n\n\n\n* Our first product compatible with any Apple device.\n* The first Nanome experience that no longer requires controllers, embracing the intuitive power of hand & eye-tracking technology.\n* A pioneering experience that signifies a new era for Nanome, one that harmonizes with Apple’s vision of seamless integration and intuitive design of visionOS.\n\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002Fb4OaQ2pyXf4fa4eGHB\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-b4OaQ2pyXf4fa4eGHB\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\n## **The first step with Apple Vision Pro**\n\nWe’d like our community to think of NanoSpin as the enticing appetizer to the full course that is yet to come with our complete app port to Apple Vision Pro. It's a taste of the potential, a demonstration of our commitment to innovative design, and a peek into the future of scientific exploration and molecular design.\n\nBe sure to check out and [download NanoSpin](https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fnanospin\u002Fid6478713911) from the App Store! It's a small but significant step into a world where the nanoscale is as familiar to us as the devices we use every day.\n\nWe’ll have more to share when it comes to porting our full Nanome platform to visionOS later this year. Stay tuned! 
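The "bounded volume" idea described earlier can be pictured as clamping content into a designated cube instead of letting it spread through the full immersive space. A minimal sketch of that geometric idea (illustrative only; this is not the visionOS API):

```python
# Illustrative sketch: a bounded-volume app keeps its content inside a
# designated cube. Clamping a 3D point into that cube is the simplest
# version of this constraint. Names and defaults here are invented.

def clamp_to_bounded_volume(point, center=(0.0, 0.0, 0.0), half_size=0.5):
    """Clamp a 3D point into a cube of side 2*half_size around `center`."""
    return tuple(
        min(max(p, c - half_size), c + half_size)
        for p, c in zip(point, center)
    )

# A point poking out of the cube gets pulled back to its surface.
print(clamp_to_bounded_volume((0.2, 1.3, -0.9)))  # (0.2, 0.5, -0.5)
```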
\n\n\u003Ca href=\"https:\u002F\u002Fapps.apple.com\u002Fapp\u002Fnanospin\u002Fid6478713911\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FScreenshot_03_08_2024_15_54_46_919cc25ff1.jpeg\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n","2024-03-08T21:33:30.975Z","2026-03-24T17:53:29.484Z","2024-03-08T21:36:23.495Z","2024-03-08","Discover NanoSpin by Nanome, a groundbreaking app launched for Apple Vision Pro, designed to revolutionize the way we interact with molecules in the fields of medicinal chemistry, computational chemistry, drug discovery, pharmaceutical R&D, and structural biology. NanoSpin offers a unique 'lite' experience, setting the stage for the full integration of Nanome's platform with visionOS. This app allows users to engage with molecular structures, like caffeine or DNA, in an unprecedented, intuitive manner, leveraging the advanced spatial computing capabilities of the Apple Vision Pro. By introducing the concept of bounded volume apps, NanoSpin facilitates a seamless multitasking experience within a virtual environment, enhancing digital interactions beyond traditional AR\u002FVR confines. A first in many regards, NanoSpin is not only compatible with all Apple devices but also introduces hands-free and eye-tracking interaction, marking a new era of immersive scientific exploration. As we look forward to expanding our Nanome platform on visionOS, NanoSpin serves as an exciting preview into the future of molecular design and scientific discovery. 
Download now from the App Store and step into the nanoscale world with ease and sophistication.","Nanome, NanoSpin, Apple Vision Pro, molecular, chemistry, Med chem, Comp chem, Drug discovery, Pharma R&D, Structural biology, Spatial computing, chemistry, apple, research, science","nanospin:-a-glimpse-into-nanome's-vision-for-the-future-with-apple-vision-pro",{"id":537,"attributes":538},14,{"title":539,"content":540,"createdAt":541,"updatedAt":542,"publishedAt":543,"date":544,"description":545,"keywords":546,"slug":547,"category":66},"Introducing the Nanome Scene Viewer Plugin","Written by Sheila Zipfel \n\n\u003Ca href=\"https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fsceneviewer.html\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_0f9e918602.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nYour ability to understand and communicate structural narratives just got a beautiful boost!  Now, you can arrange your molecules into a custom gallery with Nanome’s new Scene Viewer.  Built right into Nanome’s Vault, it lets you take control of your protein models like never before.  \n\n\u003Ca href=\"https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fsceneviewer.html\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_4521631017.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nHere's how it works:\n\n\n\n* Arrange your molecules in 3D space just how you want them.  Show or hide, rotate, scale, and position them as you like for each scene you create.  \n* Create a new scene from scratch or from an existing one.\n* Add informative flair by highlighting specific regions with color and rendering choices, labels, and chemical interactions. \n* Your Scene titles and descriptions are a great way to make sure your key points are captured.  When shown, they will appear as a movable placard that changes as you change scenes. \n* Easily rearrange your scenes in the Scene Deck.
\n* Save and share your Scene Deck via Vault for easy presentation or self-guided asynchronous touring by your colleagues.\n* In Presentation mode, move from scene to scene without risk of changing the Scenes Deck, yet still be able to launch a new exploration from any scene.\n\n\u003Ca href=\"https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fsceneviewer.html\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_518f0489b0.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nNanome’s Scene Viewer lets you organize and present your findings, insights, hypotheses, and designs as an impactful interactive 3D presentation.  Share your Scene Decks with colleagues and explore structures together for a truly collaborative experience.  \n","2024-01-12T00:12:08.524Z","2026-03-24T17:53:29.341Z","2024-01-12T00:12:09.743Z","2024-01-11","Enhance your molecular storytelling with 'The Nanome Scene Viewer' by Sheila Zipfel. This blog introduces Nanome's latest innovation, allowing you to craft custom 3D galleries of molecular structures. Discover how to manipulate and present protein models like never before using Nanome’s Vault. Learn to arrange molecules in 3D space, add colors, labels, and highlight chemical interactions for impactful presentations. Save and share your creations for collaborative exploration and effective communication of structural narratives. Ideal for professionals in structural biology, cheminformatics, and drug discovery, the Nanome Scene Viewer offers a new dimension in molecular visualization and collaboration. 
Experience the power of interactive 3D presentations and transform how you share your scientific insights.","Nanome Scene Viewer, molecular storytelling, 3D molecular gallery, structural biology, protein models, molecular manipulation, cheminformatics, drug discovery, 3D molecular visualization, interactive presentations, collaborative molecular exploration, structural narratives, chemical interactions, molecular structure arrangement, presentation tools in science, Sheila Zipfel, Nanome’s Vault, molecular scale and rotation, scene creation in chemistry, molecule labeling, 3D presentation in research, scientific communication, bioinformatics, molecular hypotheses presentation, molecular designs, molecular insights sharing, molecular findings organization","introducing-the-the-nanome-scene-viewer-plugin",{"id":549,"attributes":550},13,{"title":551,"content":552,"createdAt":553,"updatedAt":554,"publishedAt":555,"date":544,"description":556,"keywords":557,"slug":558,"category":91},"Revolutionize Your Research with the Nanome Jupyter Cookbook Plugin","**Revolutionize Your Research with the Nanome Jupyter Cookbook Plugin**\n\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_95a67a778a.gif)\n\n\n**Step Into a New Era of Molecular Exploration**\n\nWelcome to a groundbreaking fusion of Python's versatility and Nanome's immersive mixed reality: the Nanome Jupyter Cookbook Plugin. This innovative platform is not just a tool; it's a gateway to transforming how you interact with molecular structures, data and algorithms by allowing you to control your collaborative XR sessions and bring in the computational techniques you care about.\n\n**What’s on the Menu?**\n\n\n\n* Limitless Python Notebooks: Open as many Python notebooks as you need, each seamlessly integrated with your active Nanome XR sessions.\n* Instant Access: Connect to your Nanome session with just a 5-digit code. 
It's quick, simple, and gets you right into the action.\n* Collaborative Power: Work alongside colleagues in real-time. Share code, ideas, and control the VR environment together, breaking down barriers of distance and time.\n* Unleash your scientific creativity and take control of your algorithms.\n\nThe Nanome Jupyter Cookbook Plugin is more than just a collection of tools; it's a canvas for your scientific ingenuity. Mix and match code snippets and templates to create custom plugins that cater exactly to your research needs, whether it's new ways to highlight CDR loops or to give feedback to your in-house AI models. \n\n**Here's a Sneak Peek at What You Can Create:**\n\n\n\n* Pocket Finder: Easily identify and highlight potential ligand binding pockets on proteins.\n* Molecular Dynamics Simulation: Initiate simulations and watch the magic unfold in XR.\n* Showtime!: Annotate regions, change colours, and modify surfaces with a few clicks.\n* DIY Plugin: Integrate your internal tools seamlessly into Nanome.\n\n**Your Laboratory, Without Limits**\n\nThe Nanome Jupyter Cookbook Plugin invites you to push the frontiers of molecular research. Experiment, innovate, and discover in ways you've never imagined. Ready to transform your research? Dive into the Nanome experience and cook up some molecular magic today.\n\nWhat do our researchers say:\n\n“_This cookbook allows me to connect the pythonic tools I use in molecular simulation to XR in a seamless way so that I can literally reach out, touch and manipulate algorithms in a way that has never been possible before._” – Simon J. Bennie, PhD.\n","2024-01-12T00:04:38.053Z","2026-03-24T17:53:29.419Z","2024-01-12T00:06:07.396Z","Revolutionize your molecular research with the Nanome Jupyter Cookbook Plugin. This blog delves into the innovative fusion of Python and Nanome's mixed reality, transforming the approach to molecular exploration.
Discover the limitless possibilities of Python notebooks, collaborative XR sessions, and the power to create custom plugins for scientific research. Explore features like ligand binding pocket identification, molecular dynamics simulation, and seamless integration with internal tools. Ideal for professionals in drug discovery, computational medicinal chemistry, and structural biology, the Nanome Plugin Cookbook is your key to a new era of molecular discovery. Read insights from Simon J. Bennie, PhD., and step into the future of molecular research.","Nanome Jupyter Cookbook Plugin, molecular research, Python notebooks, mixed reality, XR sessions, computational medicinal chemistry, drug discovery, structural biology, ligand binding, molecular dynamics simulation, cheminformatics, bioinformatics, structural activity, SAR data, custom plugins, collaborative research, virtual reality in science, Python in molecular simulation, molecular manipulation, scientific innovation, algorithm visualization, 3D molecular structures, interactive molecular studies, computational techniques in research","revolutionize-your-research-with-the-nanome-jupyter-cookbook-plugin",{"id":560,"attributes":561},12,{"title":562,"content":563,"createdAt":564,"updatedAt":565,"publishedAt":566,"date":567,"description":568,"keywords":569,"slug":570,"category":104},"Our thoughts on the up & coming Apple Vision Pro ","By Keita Funakawa\n\nAs we step into 2024, the tech world is abuzz with anticipation for Apple's [imminent launch](https:\u002F\u002F9to5mac.com\u002F2024\u002F01\u002F08\u002Fvision-pro-launching-on-february-2-pre-orders-begin-next-week\u002F) of its Vision Pro headset. At Nanome, we're not just waiting eagerly; we're preparing for a transformative shift. We wanted to share our thoughts on the Apple Vision Pro, exploring how we believe this groundbreaking headset will revolutionize the way Nanome and our users achieve more scientific breakthroughs. 
\n\n\u003Ca href=\"https:\u002F\u002Fwww.apple.com\u002Fapple-vision-pro\u002F\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_ded55a9a60.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n# **Passthrough Mixed Reality First Device**\n\n\u003Ca href=\"https:\u002F\u002Fvarjo.com\u002Fproducts\u002Fvarjo-xr-3\u002F\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FVarjo_products_header_mobiili_v2_8869d42ade.jpg\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nAt Nanome, we first tried passthrough mixed reality with the Varjo XR3 in May of 2021. Ever since then, we have seen the huge potential for passing through mixed reality. We were fortunate enough to launch the Meta Quest Pro (the first all-in-one passthrough color mixed reality device) in October of 2022 and have been a huge believer that MR is a crucial step for us to get the promised land that is lightweight AR glasses. On top of this, many of us at Nanome are huge fans of Apple, and use Apple products for every computing device, and we’ve been paying very close attention to all of the rumors these years. \n\nSo, saying that “we saw this coming” and “we’re glad Apple is taking this direction” are both an understatement. We were 99% certain Apple would go this route, and we’ve been preparing for this moment since we started the company in 2015.\n\n# **No controllers: eye, hand, and voice-based controlling and input device**\n\u003Ca href=\"https:\u002F\u002Fwww.uploadvr.com\u002Fapple-vision-pro-gesture-controls\u002F\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_3aa82edd67.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\nAt Nanome, we’ve been a huge believer in voice-based input since we added voice commands to Nanome in 2019. We’re glad Apple feels the same!\n\nHand tracking has historically been unreliable or fatiguing (users have to keep their wrists\u002Farms up). 
Based on initial impressions from the press, it seems like Apple has solved this problem by combining hand tracking with eye tracking.\n\nWe’ve been paying close attention to eye tracking and have felt that eye tracking has been underutilized and has a lot of potential. Research papers like [these](https:\u002F\u002Fwww.reddit.com\u002Fr\u002Fvirtualreality\u002Fcomments\u002F10o6o86\u002Fdemo_of_eye_and_hand_tracking_working_together\u002F) have piqued our interest in the past, but the bottleneck has always been devices that incorporate this technology that’s readily available AND an SDK that lets us easily build experiences around this type of user interaction. With Vision Pro and VisionOS, it seems to be finally here, and we’re thrilled!\n\n# **Cost: the least surprising thing about AVP**\n\u003Ca href=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage5_621b80b977.png\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage5_621b80b977.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\nApple has consistently introduced new product categories at the higher echelon of the market price point (the iPhone when it first launched, Pro Display XDR, Apple Watch, AirPods, M-series laptops, the list goes on).\n\n_Price is actually competitive when it is compared to its appropriate competition_\n\nGiven the chip that’s inside the headset, it’s actually closer to a desktop computer + VR system ($2k+)\n\n\n\n* Approximately 5 to 9x more powerful than current all-in-one headsets\n* (comparing Apple M2 chip vs XR2 chip GeekBench scores[[1](https:\u002F\u002Fbrowser.geekbench.com\u002Fv6\u002Fcpu\u002F1649934)][[2](https:\u002F\u002Fbrowser.geekbench.com\u002Fv6\u002Fcpu\u002F1460964)])\n\nIn addition to the chip, there’s literally only one other passthrough mixed reality solution that’s PC powered, which is the Varjo XR3, and it costs $8k alone for the headset, with a $2k\u002Fyr subscription, and requires a $2k+ PC (totaling a $12k
starting point)\n\n# **On VisionOS, the Software:**\n\n## **Actual Spatial Computing vs Passthrough as a Skybox**\n\nAlthough Passthrough Mixed Reality is available on Quest devices, it is not “true” spatial computing. As a simple example, when a real-life object comes between the user and a virtual object, the physical object doesn’t occlude the virtual object. This creates a weird sensation and dissonance to the user as it’s unclear which objects ought to be where. \n\nBased on the WWDC keynotes, it [seems](https:\u002F\u002Ftwitter.com\u002FDylanMcD8\u002Fstatus\u002F1670958710401671168) like Apple will be supporting actual occlusion and taking into consideration the dynamic between physical objects and virtual elements. We strongly believe this will lead to richer and more comfortable mixed reality experiences. \n\n## **No avatars is the best avatars**\n\u003Ca href=\"https:\u002F\u002Fwww.teamblind.com\u002Fpost\u002FApple-vision-pro-VS-meta-render-SysTtnXR\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_fc9e6733cf.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\nUsers tend to get decision fatigue when they have to choose which exact nose angle, eyebrow shape, and hairstyle best represents them out of thousands of possible combinations\n\nVisionOS and the headsets completely sidestep this issue by using the headset to scan a virtual version of yourself. \n\n## **No more manual guardian setup**\n\nBefore the Quest 3 came along, anyone using Meta devices had to manually sketch out a safety zone to keep from bumping into stuff in the real world. With the launch of the Quest 3, Meta now has an automatic boundary drawing feature. But, if your room's a bit messy, it only marks out a small safe area. 
So, you might find yourself getting these constant reminders that you're about to step out of your virtual safety net.\n\nThis has been a huge point of feedback from our users that we haven’t been able to address, so we’re very glad hardware manufacturers like Apple and Meta are making progress on this. \n\nexample [here](https:\u002F\u002Ftwitter.com\u002FShapesXR\u002Fstatus\u002F1671872786237911046)\n\n## **All the productivity apps, finally!**\n\u003Ca href=\"https:\u002F\u002Fwww.imore.com\u002Fapps\u002Fbest-apple-vision-pro-apps\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_f0cd510c4d.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nI’ve personally been disappointed with Meta when it comes to the lack of default\u002Fout-of-the-box productivity apps (like calculator, calendar, email, slack, etc). They’ve announced they’re adding 2D apps like[ slack and instagram](https:\u002F\u002Fwww.androidcentral.com\u002Foculus-adds-2d-apps-slack-and-instagram-quest-2-today) but it requires the user to search for it on the store or sideload it. 
On launch, Apple Vision Pro is slated to be compatible with all iPad and iOS apps, which is a huge leap forward in this aspect, AND visionOS looks to ship with default apps similar to those in iOS.\n\n## **\"[SharePlay](https:\u002F\u002Fdeveloper.apple.com\u002Fvideos\u002Fplay\u002Fwwdc2023\u002F10087\u002F)\" ?**\n\u003Ca href=\"https:\u002F\u002Fdeveloper.apple.com\u002Fvideos\u002Fplay\u002Fwwdc2023\u002F10087\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage6_46e535ba54.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nMeta is calling this feature [“shared spatial anchors”](https:\u002F\u002Fdeveloper.oculus.com\u002Fblog\u002Fbuild-local-multiplayer-experiences-shared-spatial-anchors\u002F) and is slated to make it available on Meta devices soon.\n\nSharePlay, or “shared spatial anchors,” basically ensures that when multiple users share the same physical location, the positions of virtual avatars and objects stay synced, so you don’t have to manually line things up, and when you physically point at something, your virtual avatar isn’t pointing at something else.\n\nApple hasn’t indicated that SharePlay will be unavailable at launch, so unlike with Meta’s devices, it seems developers will be able to build SharePlay experiences at or near the device’s launch. \n\n## **iCloud, AirDrop, iMessage, all the things!**\n\nGetting files onto Quest devices is a cumbersome process.
At Nanome we had to build features around this like [QuickDrop](https:\u002F\u002Fdocs.nanome.ai\u002Fintegration\u002Fquickdrop.html) and the [Vault](https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fvault.html) Plugin in order to enable our users to seamlessly leverage their files and data from their normal work environments (PC\u002FCloud\u002Fetc).\n\nAs a heavy Apple user, I can’t wait to have all the things from my iCloud auto-sync, send and receive AirDrops to others nearby, and reply to iMessages in the app. This alone will get both our users and myself in XR far more often. Although there hasn’t been confirmation of the extent to which iCloud will be compatible, having iMessages, notes, and mail in their launch screenshots makes me extremely hopeful that most of iCloud will be available within the headset at launch. \n\n## **Things I look forward to evaluating upon launch:**\n\n## **Comfort:** \nThe materials used make it _look_ very heavy. Were they able to properly balance the weight? This has been an issue with some of our users' Quest devices.\n\n## **EyeSight & Reverse passthrough:** \nThe unique feature where other people in the same physical location as the user are able to see the user’s eyes is something I’m excited to try. It could be a miss, but I’m hopeful that it’ll “just work” as Apple usually does.\n\n## **2024: The Year of Apple Vision Pro**\nApple's Vision Pro marks a significant leap forward in the realm of mixed reality, aligning with our vision at Nanome of a more immersive and intuitive future. Its blend of cutting-edge technology and user-friendly features, like seamless integration with Apple's ecosystem, will hopefully set a new benchmark in the industry. \n\nKeep an eye on Nanome as we're gearing up to unveil a thrilling new update, designed to leverage the latest advancements in mixed reality technology.
While we don’t have too much to share for now, future updates are poised to significantly enhance user experience, offering deeper immersion and more intuitive interactions. Be sure to stay connected for upcoming announcements – we're excited to elevate your experience with the newest innovations in the MR landscape to unlock more scientific breakthroughs!\n","2024-01-06T01:35:10.040Z","2026-03-24T17:53:29.119Z","2024-01-10T19:51:04.217Z","2024-01-10","Discover the revolutionary impact of Apple's Vision Pro on mixed reality (MR) and scientific innovation in our latest blog post at Nanome. Dive into our detailed analysis of the Vision Pro's advanced features, including eye, hand, and voice-based controls, enhanced spatial computing, and seamless integration with the Apple ecosystem. Explore how this groundbreaking headset is set to redefine the boundaries of technology and productivity in MR, and how Nanome plans to leverage its capabilities for scientific breakthroughs.\n\n","Apple Vision Pro, Mixed Reality Headset, Nanome and Apple Vision Pro, Spatial Computing, VisionOS, MR Technology, Advanced Mixed Reality, Apple Ecosystem Integration, Passthrough Mixed Reality, Vision Pro Features, Scientific Breakthroughs in MR, Hand Tracking Technology, Eye Tracking in VR, Voice-Based Controls, Apple Mixed Reality Experience, Virtual Reality Innovations, Productivity Apps in MR, Future of Mixed Reality, Vision Pro and Scientific Research, Nanome Updates, Drug Discovery, Computational Medicinal Chemistry, Structural Biology, SAR Data Analysis, Cheminformatics, Bioinformatics, Structure-Activity Relationship, Molecular Modeling, Drug Design, Protein-Ligand Interactions, Pharmaceutical Research, Virtual Screening, Molecular Docking, Biological Data Analysis, Medicinal Chemistry Innovations, Biotechnology in MR, Computational Biology, Virtual Lab Technology, Drug Development, Protein Structure Analysis, Genomic Data Analysis, Biomedical Research in 
MR","our-thoughts-on-the-up-and-coming-apple-vision-pro",{"id":572,"attributes":573},11,{"title":574,"content":575,"createdAt":576,"updatedAt":577,"publishedAt":578,"date":579,"description":580,"keywords":581,"slug":582,"category":104},"Reflecting on 2023, and looking forward to 2024","By Keita Funakawa\n\nWhat a year this has been! If 2020 - 2021 were the pandemic lockdown years and 2022 was the transition year, 2023 definitely felt like the first “normal” year when it came to lockdowns, travel, and other things we took for granted before the lockdowns in 2020. With the national emergency for COVID-19 ending in April, this was the first year that I was able to look back at the pandemic and think that it was a “past” event. \n\n**2023: The first full year of MR and LLMs**\n\n**The changing landscape of XR:**\nThis was also the first full year in which a prosumer-grade passthrough mixed reality headset, the Quest Pro, was commercially sold. At Nanome, we were excited to participate in the launch with Meta. Although, admittedly, press reviews of the device were mixed, it marked a key transition year from purely VR-focused to Mixed Reality. With the introduction of the first consumer-grade Mixed Reality headset, the Meta Quest 3, and the seemingly opposite reaction from the press (excellent reviews), mixed reality is here to stay.
\n\n\u003Ca href=\"https:\u002F\u002Fwww.theverge.com\u002F23451629\u002Fmeta-quest-pro-vr-headset-horizon-review\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage5_025cf82265.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n\u003Ca href=\"https:\u002F\u002Farstechnica.com\u002Fgaming\u002F2023\u002F10\u002Fmeta-quest-3-review-mixed-reality-version-0-5\u002F#:~:text=The%20Quest%203%20offers%20distinct,charging%20dock%2C%20etc.\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_e8cf24fdfc.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nOf course, it’s impossible to mention mixed reality and 2023 without the announcement of the Apple Vision Pro. Apple, the largest company ever to exist, is getting into the XR space in a serious way. I’m thrilled for the device to arrive in users' hands in 2024. \n\n\u003Ca href=\"https:\u002F\u002Fwww.cnbc.com\u002F2023\u002F06\u002F06\u002Fapple-vision-pro-hands-on-first-impressions-from-wwdc-2023.html\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage7_94c4d4abcf.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n**As a large language model…**\n2023 also marked the first full year in which generative AI models such as ChatGPT and Stable Diffusion were readily accessible to anyone. At Nanome, we’ve always been passionate about building the ultimate interface for science and have been huge believers that AI will help make this happen (hence the nanome.ai domain name since 2015). We’ve always had a few R&D projects that we’ve experimented with, and one of them evolved into [MARA](https:\u002F\u002Fnanome.ai\u002Fmara\u002F). If you haven’t yet, be sure to check out our landing page and webinar!
\n\n\u003Ca href=\"https:\u002F\u002Fwww.businessinsider.com\u002Fchatgpt-openai-sam-altman-ai-year-review-2023-12\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_2debb49b05.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\n**Looking forward to 2024**\n\nAt Nanome, we’re looking forward to three main things in 2024: \n\n**1) Early Access Testing for Nanome 2.0**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_000ab3a289.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n(Concept Design)\n\nIn the past two years, we’ve been hard at work on Nanome 2.0. We have an exciting vision for Nanome 2.0, and we can’t wait to get this into more users for feedback and testing. Let us know if you’d like to participate in the early access testing.\n\n**2) Launch of Apple Vision Pro**\n\u003Ca href=\"https:\u002F\u002Fwww.macrumors.com\u002F2023\u002F12\u002F25\u002Fvision-pro-mass-shipments-timing-kuo\u002F\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage6_996cd1f9ce.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nWhenever there’s a groundbreaking new headset, we get excited about how we can delight our users with the latest tech available in these headsets. Whether it’s eye-tracking-based input, seamless integration with other Apple devices and the rest of the Apple ecosystem, or incredibly high-resolution displays, we’re thrilled that Apple is finally entering the XR industry. This is the moment that the entire XR industry has been waiting for, and it’s happening in 2024!!\n\n**3) Rolling out MARA to more users**\n\u003Ca href=\"https:\u002F\u002Fnanome.ai\u002Fmara\">\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_493312198b.png\" style=\"max-width:50%; height:auto;\">\u003C\u002Fa>\n\nWe’re currently in early alpha testing with select users for MARA. 
We aim to add many features and prepare MARA for a broader audience in 2024. With advances in LLMs showing no signs of slowing down, we’re thrilled about what’s in store for MARA in 2024!\n\nOf course, it’s not just these three things that excite us for 2024; it's the fact that all three are coming together in 2024, and the Nanome experience will be completely redefined by their combination. It’s always been a part of our vision to bring a real-life JARVIS to actual scientists, and 2024 feels like a pivotal year to bring all of this to life. Thank you to our users, our team, and our community. We appreciate your continued trust in our company and look forward to realizing our mission of pushing humanity forward with more scientific breakthroughs.\n","2023-12-27T22:20:33.183Z","2026-03-24T17:53:28.517Z","2023-12-27T22:20:58.876Z","2023-12-27","Explore Nanome's 2023 reflections and 2024 projections in this insightful blog post. Discover the advancements in Mixed Reality (MR), the launch of groundbreaking devices like Meta Quest Pro and Apple Vision Pro, and the integration of Large Language Models (LLMs) in Nanome's innovative projects. Learn about the exciting developments in Nanome 2.0, the anticipated launch of Apple Vision Pro, and the expansion of MARA to a broader audience.
Join us in looking forward to a transformative 2024, as we merge cutting-edge technology with science to create a real-life JARVIS for scientists.\n\n","Nanome 2023, Mixed Reality, MR, Meta Quest Pro, Apple Vision Pro, Large Language Models, LLMs, Nanome 2.0, XR industry, Virtual Reality, AI in science, MARA, Apple ecosystem, XR space, 2024 technology trends, scientific breakthroughs, prosumer-grade headset, Nanome blog, ChatGPT, Stable Diffusion, XR advancements, scientific interface, tech innovations, XR devices, early access testing.","reflecting-on-2023-and-looking-forward-to-2024",{"id":584,"attributes":585},10,{"title":586,"content":587,"createdAt":588,"updatedAt":589,"publishedAt":590,"date":591,"description":592,"keywords":593,"slug":594,"category":66},"Introducing MARA: A Paradigm Shift in AI-assisted Scientific Informatics Workflow Orchestration","# Nanome's Groundbreaking Enterprise Platform: MARA\nIn case you didn’t get a chance to join our [webinar](https:\u002F\u002Fmeet.nanome.ai\u002Fwebinars), we're excited to unveil MARA (Molecular Analysis and Reasoning Assistant), a revolutionary tool in the realm of biopharma research and development. MARA is an AI-driven platform that enhances how scientists engage with data and conduct research. \n\n![mara.e6548e3.png](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fmara_e6548e3_8d73f2d806.png)\n\n# Why MARA is a Game-Changer \nMARA combines the best of both worlds - advanced AI capabilities and the flexibility to suit every laboratory's unique needs.\n\n\n\n* AI-Powered Scientific Informatics: Leveraging large language models, MARA offers insightful, conversational responses and is adept at executing complex cheminformatics tasks. Its integration capabilities allow it to work with custom molecular simulation tools, databases, and more.\n* Designed for Enterprise: MARA is an entirely offline solution that can safely be deployed and monitored behind any life science organization’s IT firewall. 
It is designed to integrate internal informatics tools with minimal effort, giving every company its own AI platform, with its knowledge and tools at every scientist’s fingertips.\n* Scientific Accuracy and Trust: MARA stands out for its accuracy and reliability because the organization can define the knowledge of the system and even augment the reasoning and workflows without relying on the biases and inaccuracies of pre-trained LLMs.\n* Future Integration and Expansion: MARA's future integration into Nanome’s XR (AR\u002FVR) platform and expansion to cover other scientific disciplines like genetics and material science are a testament to its potential and adaptability.\n\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FipB48n6TvU82TpBrlp\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-ipB48n6TvU82TpBrlp\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\n### **Looking ahead**\n\nOne of the Nanome XR platform's most celebrated features is its voice command capabilities, which have significantly enhanced the user experience by enabling intuitive and efficient interaction within the virtual environment. Looking ahead, the integration of MARA with the Nanome XR platform's advanced features, including voice commands, eye\u002Fgaze tracking, and gesture recognition, is an exciting development we're eagerly anticipating. This combination will enhance the platform's intuitiveness and enrich the user experience with a deeper level of interaction and contextual understanding. By leveraging the full spectrum of XR capabilities, MARA will respond to researchers more dynamically and informatively, offering a highly immersive and interactive environment for scientific exploration and discovery.
This synergy promises to unlock new possibilities in how scientific data is interacted with and analyzed, pushing the boundaries of research and development.\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002Fb8JeRzqTQoiaAcn1Co\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-b8JeRzqTQoiaAcn1Co\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n\n### **MARA's Impact and Reception**\n\nWe’re currently in early talks to get MARA into labs at leading pharma companies, with initial feedback highlighting its potential ability to amplify the use of data and informatics tools and ultimately better inform scientific research. Initial reception has been overwhelmingly positive. MARA enables every scientist to have data scientist expertise alongside the much more accessible methods to interact with real-time data or informatics systems. Since knowledge and reason can be augmented, organizations can see the potential of encoding their scientists’ context and decision-making into an intelligent system. With that, the possibilities are endless.\n\n\n### **Join the MARA Revolution**\n\nEmbrace the future of scientific research with MARA. Request early access to this trailblazing platform and equip your team with a powerful AI co-pilot in your journey of discovery and innovation.\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002F0EXwY6uaCkPfxfrN6X\" width=\"480\" height=\"480\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-0EXwY6uaCkPfxfrN6X\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n### **Creating Jarvis, For Real**\nAt Nanome, the vision of creating a real-life JARVIS for scientists has always been a driving force behind our innovations. The introduction of MARA marks a significant step towards realizing this ambitious goal. 
By embodying the capabilities of a highly intelligent, responsive, and interactive assistant, MARA represents the next leap forward in our journey. It's not just about providing advanced tools; it's about crafting an intuitive, AI-powered companion that can anticipate and respond to the unique needs of researchers. This exciting development is a major milestone in our commitment to revolutionizing the scientific research landscape, bringing the once-futuristic idea of a personalized scientific JARVIS into today's reality.\n\nVisit the official landing page: [https:\u002F\u002Fnanome.ai\u002Fmara\u002F](https:\u002F\u002Fnanome.ai\u002Fmara\u002F)\n\nCoverage by VentureBeat: [https:\u002F\u002Fventurebeat.com\u002Fai\u002Fmolecular-science-vr-startup-nanome-launches-ai-copilot-mara\u002F](https:\u002F\u002Fventurebeat.com\u002Fai\u002Fmolecular-science-vr-startup-nanome-launches-ai-copilot-mara\u002F)","2023-12-07T21:57:47.717Z","2026-03-24T17:53:29.730Z","2023-12-07T21:59:21.233Z","2023-12-07","Explore Nanome's MARA, a groundbreaking AI platform revolutionizing drug discovery and medicinal chemistry. This blog introduces MARA (Molecular Analysis and Reasoning Assistant), a cutting-edge tool designed to enhance the scientific informatics workflow in biopharma research. With its advanced AI and computational chemistry capabilities, MARA is a game-changer in drug discovery, offering personalized, conversational insights for complex cheminformatics tasks. As an adaptable, offline enterprise solution, it ensures secure, tailored integration in each lab. MARA's precision and reliability stand out, free from biases of pre-trained LLMs, and it's set for expansion in XR (AR\u002FVR) platforms and diverse scientific fields. Anticipate a new era of interactive, immersive scientific research and development with MARA, embodying Nanome's vision of a real-life JARVIS for scientists.
Dive into this transformative platform on Nanome's official page and explore its features, as reported by VentureBeat.","LLM, Drug Discovery, Computational Chemistry, Chemistry, Medicinal, ChatGPT, Enterprise, MARA, JARVIS","introducing-mara:-a-paradigm-shift-in-ai-assisted-scientific-informatics-workflow-orchestration",{"id":596,"attributes":597},9,{"title":598,"content":599,"createdAt":600,"updatedAt":601,"publishedAt":602,"date":603,"description":604,"keywords":605,"slug":606,"category":153},"Exploring GABA in VR - Common therapeutics can reverse pathological sleep conditions","### GABA, Valium, and Xanax\n\nGABA (Gamma-Aminobutyric Acid) is the main inhibitory neurotransmitter in the mammalian central nervous system. It is a small molecule that reduces the activity of the neurons it binds to. \n\nGABA binds two types of receptors, [GABA-A and GABA-B](https:\u002F\u002Fwww.eurekaselect.com\u002Farticle\u002F70342). GABA-A receptors act by directly controlling the flow of ions across the cell membrane. GABA-B receptors activate intracellular signaling pathways that ultimately lead to changes in the cell's function.\n\nMany drugs that affect GABA signaling have important medical applications. For example, [benzodiazepines](https:\u002F\u002Fwww.ingentaconnect.com\u002Fcontent\u002Fascp\u002Ftcp\u002F2013\u002F00000028\u002F00000009\u002Fart00001) like Valium and Xanax enhance the activity of GABA-A receptors, leading to anxiolytic, sedative, and anticonvulsant effects. [Barbiturates](https:\u002F\u002Fwww.eurekaselect.com\u002Farticle\u002F66259) also enhance GABA-A receptor activity but have a higher risk of overdose and dependence. 
GABA-B agonists like Baclofen are used to treat spasticity and muscle rigidity in conditions like multiple sclerosis and spinal cord injury.\n\n![GABA receptor visualized in Nanome, PDB 6DW0.png](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FGABA_receptor_visualized_in_Nanome_PDB_6_DW_0_fa0768edd1.png)\n\u003Cbr>\n \n### An unexpected effect\nGABA-A receptor negative allosteric modulators (GABAaRNAMs) are a class of compounds that bind to a specific site on the GABA-A receptor and can increase arousal and seizures.  \n\nScientists at the University of Saint Joseph (CT) found out that pathological sleep in humans can be reversed with a [benzodiazepine antagonist and a macrolide antibiotic](https:\u002F\u002Fonlinelibrary.wiley.com\u002Fdoi\u002F10.1002\u002Fana.24459), which led them to wonder: what other drugs normally associated with sleep disruption can modulate arousal?  \n\nThey screened [11 compounds](https:\u002F\u002Fpubmed.ncbi.nlm.nih.gov\u002F36830736\u002F) that have been reported to influence arousal. The goal was to see if these compounds shared a GABA-related mechanism, and if it could be leveraged to improve arousal in patients suffering from diseases associated with excessive daytime sleepiness.\n\nThe computational modeling of modulator–receptor interactions predicted drug action at canonical binding sites and novel orphan sites on the receptor. \n\nThe 11 compounds had different chemical structures, yet they all modulated GABA-A receptor activation and had the ability to impair arousal in humans. From a structure–activity perspective, this may seem counterintuitive. \n\nModeling data provided an insight into this paradox, and showed how the GABA-A receptor can act as a connection for these compounds by binding them at multiple sites that in turn impair receptor activation, translating the binding of disparate molecular structures into changes in a common pathway. 
\n\nThe findings of the study suggest that multiple avenues are now open to investigate large, brain-penetrant molecules for the treatment of patients with an Excitation\u002FInhibition imbalance.\n\n![Common therapeutics involved in arousal inhibit GABAaR activation](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FCommon_therapeutics_involved_inarousal_inhibit_GAB_Aa_R_activation_42e4777b4f.png)\n\u003Cbr>\n\n### Exploring the GABA receptor in VR\nProfessor Asher Brandt from the University of Saint Joseph took Nanome’s scientist Daniel Gruffat on a journey through the GABA-A receptor structure, a first in the history of Nanome’s YouTube channel!\n\nIn [this video](https:\u002F\u002Fyoutu.be\u002FEsLZceJ4CmQ) they go through the oligomeric structure of the receptor and then focus on different potential binding sites. Prof. Brandt shows us that GABA shares the same binding site as small antibiotics, while larger ones are likely to target a different region.\n\nDuring the session, Nanome’s [docking plugin](https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fdocking.html) came in handy for exploring several docking poses of pentylenetetrazol, one of the compounds evaluated in the study.\n\n![Scientists visualize the GABA receptor in Nanome](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FScientists_visualize_the_GABA_receptor_in_Nanome_789949b0b5.jpg)\n\n[Watch the full video here: https:\u002F\u002Fyoutu.be\u002FEsLZceJ4CmQ](https:\u002F\u002Fyoutu.be\u002FEsLZceJ4CmQ)\n\u003Cbr>\n\n#### References:\n_Kaplan et al. (2023). Commonly Used Therapeutics Associated with Changes in Arousal Inhibit GABAaR Activation. Biomolecules. 15;13(2):365.
doi: 10.3390\u002Fbiom13020365._\n","2023-05-17T09:04:14.418Z","2026-03-24T17:53:28.475Z","2023-05-17T09:04:18.207Z","2023-05-17","Important drugs such as Valium and Xanax function by affecting GABA (Gamma-Aminobutyric Acid), the main inhibitory neurotransmitter in the mammalian central nervous system.\n\nProfessor Asher Brandt from University of Saint Joseph met Nanome scientists in virtual reality for a guided journey through the latest research on how the GABA-A receptor impacts pathological sleep in humans.\n\nTheir session is now available on Youtube!","GABA, receptor, GABA-A, Nanome, neurotransmitter, pathological sleep","exploring-gaba-in-vr-common-therapeutics-can-reverse-pathological-sleep-conditions",{"id":608,"attributes":609},8,{"title":610,"content":611,"createdAt":612,"updatedAt":613,"publishedAt":614,"date":615,"description":616,"keywords":617,"slug":618,"category":66},"v1.24.4 Patch Release Blog and Data Table Plugin Update","Collaborative work has become increasingly important in a variety of fields, from scientific research to design and engineering. However, working together remotely or in hybrid settings can be challenging, especially when it comes to XR environments. With the latest Nanome v1.24.4  patch (coming soon), the platform has become even more user-friendly and streamlined for collaborative work. The new patch focuses on reducing the friction involved in setting up group sessions and loading content, making it easier and quicker than ever before to get started with collaborative work in Nanome. This improved functionality is set to revolutionize the way teams work together in XR, opening up new possibilities for research, design, and innovation. 
In this blog post, we'll explore the key improvements in the latest Nanome patch and how they will benefit users.\n\n**Features that make room setup dramatically faster and easier**\n- **Quick Drop for Enterprise**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FYui7R7cw8HUNoIo4zB\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-Yui7R7cw8HUNoIo4zB\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nMany of you may have tried the new home.nanome.ai feature, Quick Drop. To recap: Quick Drop allows you to drag and drop files onto your web browser outside of Nanome to instantly load workspaces or structures in Nanome when you put your headset on. This feature is now available to all enterprise users with on-prem or private cloud hosted instances of Nanome. To learn more, be sure to read our blog post that covers all the details about new features for home.nanome.ai here. This requires a new service to be installed for enterprise customers; please contact our support team to schedule a deployment time.\n- **Pymol Plugin: Send to Nanome**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FSj9FdjNRtV3i8b7kyB\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-Sj9FdjNRtV3i8b7kyB\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nAnother major update we announced in our blog post last week was a Pymol Plugin for Nanome. We cannot overstate what a game changer this is for a seamless workflow between Pymol and Nanome. With Quick Drop now available to enterprise users, they can take advantage of this as well.
To learn more, be sure to read our blog post that covers all the details about new features for home.nanome.ai here.\n\n**Features that make running a group session much smoother**\n- **Room Auto-Join**\nThrough a license configuration, users can now be sent to a specific room whenever they start Nanome. This is great for project teams or dedicated XR conference rooms: attendees get into meeting spaces tailored to their needs without navigating the Lobby. The auto-join feature eliminates the need for attendees to spend time navigating menus and loading content. This not only saves time but also reduces the likelihood of users becoming disengaged or choosing the wrong room by accident. Additionally, it ensures that sessions aren't disrupted by users joining or leaving, which can be a major source of frustration and distraction. With auto-join, sessions can flow smoothly, allowing participants to focus on the task at hand and collaborate more effectively. The result is a more efficient, productive, and enjoyable collaborative work experience that can benefit teams across a wide range of fields. This requires our support team to make a custom configuration, so please contact them to schedule setting up this feature.\n\n- **Room Auto-Rejoin**\nBefore this update, if you took off an all-in-one headset during a group project, you might have lost all your work and had to start over. Now the headset can automatically put you back in the same room with all the content you had before. So you can take a break, grab a coffee, and come back to pick up where you left off without any hassle. This is a huge win for all-in-one devices like the Vive Focus 3.
It's a particular game changer for the Vive Focus 3 and other all-in-one devices that go into an idle mode.\n\n**Fixes that make the shared space more comfortable**\n- Users can now have a default setting when joining a room to have the locomotion\u002Fedit position state unlocked, enabling them to easily move and get settled in amongst their colleagues in the space. Or, they may choose to keep it locked by default, to avoid accidental teleportation. This is a license configuration and needs to be set by the support team for your organization.\n- Fixed an auto-height calibration problem where some users were located in or below the floor level.\n\n**The Nanome Data Table Plugin**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FK8WzcmoIUHSgP9NjwZ\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-K8WzcmoIUHSgP9NjwZ\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nWith the latest update, the Data Table Plugin now contains a 2D chemical drawing tool embedded into the table, opening up many new options and simplifications for building new molecules!\n- Create new molecules by simply editing a molecule in the table, for fast analog ideation\n- Use rapid 2D copy\u002Fpaste\u002Fedit tools to sketch multiple molecules in the sketcher and see them all as new rows in the table\n- View new molecules instantly in XR with 3D conformations\n- Align new molecules in 3D space with any molecule in the table for easy spatial organization into groups\n- Optimize the creative flow with 2D drawing aids like clean up structure, fixed lengths and angles, and templates, or with 3D Nanome builder tools like libraries, the mover tool, and importing files\n- Quickly create three-dimensional stereoisomers with a quick hash\u002Fwedge in 2D\n- Make and break bonds in the 2D sketcher to quickly generate new, single-entry molecules\n- Add multiple entries to the Data Table, so additional molecules can be readily added to the table\n\nv1.24.4 makes it even easier to collaborate remotely in XR environments. With these improvements, working together on projects is smoother and more comfortable than ever before. Get ready to revolutionize the way you make breakthroughs with Nanome!\n\n","2023-05-04T21:32:40.170Z","2026-03-24T17:53:28.095Z","2023-05-04T21:33:54.569Z","2023-05-04","Collaborative work has become increasingly important in a variety of fields, from scientific research to design and engineering. However, working together remotely or in hybrid settings can be challenging, especially when it comes to XR environments. With the latest Nanome v1.24.4 patch, the platform has become even more user-friendly and streamlined for collaborative work. The new patch focuses on reducing the friction involved in setting up group sessions and loading content, making it easier and quicker than ever before to get started with collaborative work in Nanome. This improved functionality is set to revolutionize the way teams work together in XR, opening up new possibilities for research, design, and innovation.
In this blog post, we'll explore the key improvements in the latest Nanome patch and how they will benefit users.\n","Nanome, drug discovery research, Pymol, plugin, CryoEM, Antibody Representation, Conformer Generator, protein engineering, drug discovery, small molecule, macromolecule, SMILES, Quick Drop, virtual reality, augmented reality, mixed reality, molecular visualization, collaboration, molecular modeling, maestro, ccg, moe, dotmatics","v1.24.4-patch-release-blog-and-data-table-plugin-update",{"id":620,"attributes":621},7,{"title":622,"content":623,"createdAt":624,"updatedAt":625,"publishedAt":626,"date":627,"description":628,"keywords":617,"slug":629,"category":66},"April 2023 Nanome Plugins & home.nanome.ai Updates","At Nanome, we're committed to providing cutting-edge tools that enable researchers and scientists to explore their data in unprecedented ways. As we prepare for the highly anticipated release of Nanome 2.0, we're thrilled to unveil some of the powerful new plugins and features that will be available in our current v1.24. Whether you're working on drug discovery, protein engineering, or any other scientific project, these tools are designed to help you unlock new insights and drive breakthrough discoveries.
You may have already seen this in Nanome, but in case you haven’t, here are some of the amazing new plugins and new home.nanome.ai features:\n\n1. **Pymol Plugin For Nanome**\n2. **Nanome Plugins:**\n- CryoEM\n- Antibody Representation\n- Conformer Generator\n- Quality of life plugins (Coordinate Align, Merge as Frames, SMILES Loader)\n\n3. **home.nanome.ai Features**\n- Chat Function (Beta)\n- Quick Drop\n\n**Pymol Plugin For Nanome:**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FSj9FdjNRtV3i8b7kyB\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-Sj9FdjNRtV3i8b7kyB\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\nA first of its kind, this plugin lets Pymol users send their current session to Nanome directly from Pymol.\n\nHow to:\n- Install the Pymol plugin (Pymol Plugin Manager → Install New Plugin tab → Install from PyMOLWiki or any URL → https:\u002F\u002Fnanome.ai\u002FPymolToNanome.py → Fetch)\n- Open the plugin (Plugin → View in Nanome XR), input Nanome login credentials\n- Click the “Send to Nanome” button and view the result in Nanome or in https:\u002F\u002Fhome.nanome.ai\u002Fdashboard\n\nCode:\nhttps:\u002F\u002Fgithub.com\u002Fnanome-ai\u002FPymol-plugin-Nanome-Quickdrop\n\n**CryoEM:**\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage4_3c5833c2b6.png\" width=\"60%\" height=\"30%\">\n- Load a map file from EMDB and align it to its corresponding RCSB model\n- Adjust color schemes and opacity of the map\n- Use a viewport editor to show only certain portions of the map\n- We’re thrilled to finally enable Nanome users to analyze cryo-EM maps\n\n**Antibody Representation:**\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_a2f405a547.png\" width=\"60%\" height=\"30%\"> \u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage1_58a77bb6ef.png\" width=\"60%\" height=\"30%\">\n- The Antibody Representation plugin uses visual schemes to help users see the antibody regions and features of interest, alone or in complex with antigens.\n- Adds the IMGT color scheme, labels antibody regions, and highlights CDR loops.\n- A table of Antibody Regions allows independent selection of CDR loops for calculations or representational changes.\n- See what’s what in your Antibody-Antigen complex. Only antibodies and fragments of antibodies are modified; the representation of antigens in the complex is unchanged.\n\n**Conformer Generator:**\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage9_39bf160d8a.png)\n- Generate conformers for small molecules and sort them by energy\n- Select 50, 100, or 200 conformers\n- Sort by minimized energy or RMSD to the original molecule\n- Optionally lock entry and result\n\n**Quality of life plugins (Coordinate Align, Merge as Frames, SMILES Loader):**\n- Coordinate Align\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage8_05c799dbaf.png\" width=\"60%\" height=\"30%\">\n\u003Cimg src=\"https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage11_fcf30cd8aa.png\" width=\"60%\" height=\"30%\">\nOverride the 3D coordinates of a complex to be relative to a reference complex\n\n- Merge as Frames\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage6_9691c2bdc3.png)\nA Nanome plugin to merge multiple small-molecule ligand entries into a single entry with multiple frames. Frames use the small molecule's local coordinate space.
Ideal for creating multi-model SDFs.\n\n- SMILES Loader\n![](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage12_8a0a92c4c3.png)\n![image3.png](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage3_f52dc4062c.png)\nA Nanome plugin to load molecules from SMILES strings using RDKit\n\n**Chat Function (Beta):**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002Fj4Y6zAu68CrsbTsAt8\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-j4Y6zAu68CrsbTsAt8\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n- Nanome Chat allows you to chat with accepted contacts on home.nanome.ai.\n- You can send images and files, and create group chats with multiple users.\n- Eventually this will be incorporated into the Nanome app itself.\n\n**Quick Drop:**\n\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FYui7R7cw8HUNoIo4zB\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003Cp>\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-Yui7R7cw8HUNoIo4zB\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\n- Quick Drop allows you to drag and drop files onto your web browser outside of Nanome to instantly load workspaces or structures in Nanome when you put your headset on.\n- Available on home.nanome.ai\u002Fdashboard\n- Upload up to 100MB of files\n- Auto-create a room with your files loaded when you open Nanome\n","2023-04-27T19:16:04.607Z","2026-03-24T17:53:27.727Z","2023-04-27T19:27:15.033Z","2023-04-27","Discover the power of Nanome with its new innovative plugins and features designed to enhance drug discovery research. Explore Pymol integration, CryoEM, Antibody Representation, Conformer Generator, and more, alongside a seamless chat function and Quick Drop capabilities.
Unlock new insights and drive breakthroughs in small molecule drug discovery, protein engineering, macromolecular design, and beyond with Nanome's cutting-edge tools.","april-2023-nanome-plugins-and-home.nanome.ai-updates",{"id":631,"attributes":632},6,{"title":633,"content":634,"createdAt":635,"updatedAt":636,"publishedAt":637,"date":638,"description":639,"keywords":640,"slug":641,"category":153},"Diabetes treatment is about to change forever","# The FDA approved an anti-CD3 antibody that delays type 1 diabetes onset!\n\u003Cbr>\n\n## A milestone for managing type 1 diabetes: Tzield (teplizumab) \nThe treatment of type 1 diabetes (T1D), a chronic autoimmune disease that affects millions worldwide, is about to undergo a revolution. Thanks to Tzield (teplizumab), a recently FDA-approved drug developed by [Provention Bio](https:\u002F\u002Fproventionbio.com\u002F), managing the disease and delaying symptom onset by nearly three years is now possible.\n\n## Tzield (teplizumab) – A Game-Changing Therapy for Type 1 Diabetes\nTzield (teplizumab) is the first CD3-directed IgG1 monoclonal antibody to receive marketing authorization for an autoimmune condition. Tzield delays the onset of T1D by targeting the CD3 molecule on T cells and depleting autoreactive T cells (which attack beta cells in the pancreas) from circulation. While Teplizumab does not cure T1D, it offers patients at risk a much-needed treatment option.\n\n![Teplizumab Mechanism of Action, adapted from Provention’s Investor Presentation](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTzield_s_mechanism_of_action_67b403c8b4.png)\n\u003Cbr>\n\n## The Journey of Anti-CD3 Antibody Therapeutics\nAnti-CD3 antibodies have been studied as a potential treatment for type 1 diabetes since the 1960s and 1970s. The first clinical approval of a monoclonal antibody (mAb), Orthoclone OKT3, led to severe cytokine release syndrome and eventual withdrawal from clinical use in 1985. 
Technological advancements in the 1990s led to the development of modified CD3-specific mAbs showing potential for treating autoimmune diseases. Clinical trials of CD3-specific mAbs have shown slower deterioration of β-cell function with no evidence of severe systemic inflammation.\n\u003Cbr>\n\n## Over 20 years in the making \nTeplizumab had a long and tumultuous journey to approval. In the mid-1990s, animal studies showed that anti-CD3 antibodies could prevent or induce remission of diabetes by exhausting autoreactive T cells. Eli Lilly partnered with MacroGenics in 2007 for a phase III trial in 500 patients with recent-onset type 1 diabetes. However, in 2010, the trial failed to meet its primary endpoint, and Lilly dropped the drug.\n\nIn 2019, a small NIH-sponsored phase II trial gave Teplizumab new life. Researchers randomized 76 at-risk participants to a 14-day course of the antibody or placebo and found that the median time to the diagnosis of type 1 diabetes was longer in the antibody arm. The results were promising enough for Provention, which acquired the antibody in 2018, to submit it for FDA approval. In 2021, the FDA narrowly voted in favor of approval, but ultimately requested further study. It has since been approved to delay the onset of stage 3 type 1 diabetes in adults and pediatric patients with stage 2 type 1 diabetes.\n\n![FDA Approves a Drug That Can Delay Type 1 Diabetes, source: The New York Times](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FTzield_64ae10e7e7.jpg)\n\n## What's Next for Tzield (teplizumab)?\nThe journey of teplizumab highlights the challenging path from concept to market for new therapeutics. Despite the setbacks, researchers and companies persisted in their efforts to develop a treatment that could delay or prevent type 1 diabetes. 
Ongoing phase III trials will provide further insight into the potential impact of teplizumab, and the evolving landscape of diabetes research offers opportunities for continued innovation in the field.\n\nAs the field of biologics research continues to expand, companies like [Nanome](https:\u002F\u002Fnanome.ai\u002F) are taking steps to make drug discovery more efficient and effective. Nanome offers a software platform for drug discovery, enabling researchers to visualize, design, and simulate molecules in 3D. To support biologics research, Nanome is developing features specifically for large molecules - the [Antibody Representation plugin](https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fantibodies.html) and the Cryo-Em plugin - making the software more versatile and better suited to the needs of researchers working with complex biologics. By leveraging virtual reality technology, Nanome helps accelerate the discovery and development of novel biologics, which have the potential to transform treatment of a wide range of diseases.\n\u003Cbr>\n\n## References\n[Dolgin, E. Drug for delaying diabetes wins landmark approval. 
Nat Biotechnol 41, 6–7 (2023)](https:\u002F\u002Fdoi.org\u002F10.1038\u002Fs41587-022-01633-3)\n\n[Nature Reviews Drug Discovery 22, 6-7 (2023) FDA approves anti-CD3 antibody to delay type 1 diabetes onset](https:\u002F\u002Fdoi.org\u002F10.1038\u002Fd41573-022-00198-9)\n\n[Anti-CD3: the agonist and the ecstasy (2021)](https:\u002F\u002Fwww.nature.com\u002Farticles\u002Fd42859-021-00020-3)\n\n[FDA Approves First Drug That Can Delay Onset of Type 1 Diabetes (2022)](https:\u002F\u002Fwww.fda.gov\u002Fnews-events\u002Fpress-announcements\u002Ffda-approves-first-drug-can-delay-onset-type-1-diabetes)","2023-04-12T08:23:29.917Z","2026-03-24T17:53:27.645Z","2023-04-12T08:23:52.589Z","2023-04-12","The FDA recently approved Tzield, the first CD3-directed monoclonal antibody, which helps manage Type 1 Diabetes, and can delay the onset of symptoms by nearly three years. Nanome’s new VR plug-ins for biologics research make it easy for scientists to visualize, design, and simulate large molecules & complex biologics in 3D.","Diabetes, anti-CD3, type 1 diabetes, Tzield, teplizumab, biologics, biologics research, antibody representation plugin, cryo-em plugin","diabetes-treatment-is-about-to-change-forever",{"id":643,"attributes":644},5,{"title":645,"content":646,"createdAt":647,"updatedAt":648,"publishedAt":649,"date":650,"description":651,"keywords":652,"slug":653,"category":153},"Antibody Prophylaxis for Lyme Disease","**20 years and a vaccine for Lyme Disease is still in the works, but structural biology has another trick up its sleeve: Antibody Prophylaxis!**\n\nTicks are notorious for their pesky bites, but there's a more serious concern associated with them: [Lyme Disease](https:\u002F\u002Fwww.cdc.gov\u002Flyme\u002Findex.html). It's the most common tick-borne infection in the United States, with over 30,000 cases reported each year. If left untreated, it can lead to serious complications in joints, heart, and nervous system.\n\nHow do ticks spread Lyme Disease? 
Blacklegged ticks feed on the blood of mammals, birds, reptiles, and amphibians. They hide in grass, sensing body heat and waiting for the chance to climb aboard a passing host. Then, like a vampire, the tick will pierce its victim’s skin and begin drinking its blood. Any bloodborne infections carried by the host will pass into the tick, so when the tick feeds on its next victim, it can transmit the infection. Lyme Disease is one of the more harmful infections spread by ticks and is caused by bacteria (_Borrelia burgdorferi_ in North America; _B. garinii_ and _B. afzelii_ in Europe and Asia).\n\n![Reported cases of Lyme Disease in the USA, 2018. Source CDC.](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FLyme_Incidence_Dot_Map_e1f06746e1.jpg)\n\nUnfortunately, there's no cure for Lyme Disease, and no human vaccine or monoclonal antibody therapy is currently available on the market. Efforts to develop a new vaccine have been ongoing for two decades, and several vaccines are currently in development and undergoing clinical trials.\n\n“The only therapeutic option available if you do get bitten by a tick and are infected in the first 48 hours or so, is to get a regimen of antibiotics that can clear the bacteria and mitigate or prevent the disease,” says Dr. Michael Rudolph, a scientist working on Lyme Disease at the [New York Structural Biology Center](https:\u002F\u002Fnysbc.org\u002F). \n\n**Prevention is better than cure: therapeutic antibodies**\n\nHowever, scientists like Dr. Michael Rudolph and his collaborators at MassBiologics and the Wadsworth Center have been exploring a different route than vaccination: antibody prophylaxis. Essentially, this means using antibodies to prevent infection before it occurs.\n\nWhen a tick latches onto someone, it takes at least 36 hours to pass along the bacterium that causes Lyme disease. 
This provides a window of time for antibodies to get into the tick's gut and block transmission of the bacterium.\n\nIf antibodies that specifically recognise _B. burgdorferi_ are administered at the beginning of tick season, they act like a prophylactic to Lyme disease. So, if a subject has antibodies at the moment of the bite, infection can be prevented.\n\n**What’s the difference between vaccines and antibody prophylaxis?**\n\nVaccines use inactivated viruses or bacteria to stimulate an immune response, priming the human body to fight off future exposure. They help prevent a future illness, but do not treat the illness itself. By contrast, antibodies identify and attack a specific disease causing organism. If you already have antibodies in your system at the time of being exposed to a disease organism, then they can prevent disease from taking root. \n\n![Antibodies in VR: Therapeutic Antibodies Against Lyme Disease w\u002F Dr. Mike Rudolph - NYSBC on YouTube](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Flyme_antibodies_cover_1c7bd3d04e.png)_View the full video [here](https:\u002F\u002Fyoutu.be\u002F_Zwi4p_siO0)._\n\n**Secret powers of antibodies explained**\n\nDr. Rudolph's team has been studying the outer surface proteins (Osp) of the bacterium that causes Lyme disease, which are responsible for its virulence and immune evasion. If you have antibodies against those, chances are that you are not going to develop Lyme disease. \n\nDr. Rudolph is using structural biology techniques like X-ray crystallography and cryo-electron microscopy to study the three-dimensional structure of Osp proteins when they are bound to protective antibodies. By understanding how the immune system recognizes these proteins, researchers can design more effective vaccines and therapies for Lyme disease.\n\nWith the help of virtual reality tools like Nanome, Dr. Rudolph's team has explored the structures of Osp proteins in complex with protective antibodies. 
They've discovered the critical regions of Osp proteins recognized by protective antibodies and how they bind tightly together.\n\n**Antibodies in Nanome**\n\nOur team had the pleasure to meet Dr. Rudolph in Nanome and [explore these structures](https:\u002F\u002Fyoutu.be\u002F_Zwi4p_siO0) in virtual reality.\n\nWe started with OspA bound to the human antibody fragment 227-1 that is now undergoing clinical trials (PDBid 7JWG). \n\nWith the ‘[Antibody Representation](https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fantibodies.html)’ plugin we highlighted the Complementarity-determining Regions (CDRs) of the antibody fragment and immediately saw that OspA strands 4-9 are the ones recognised by 227-1.\n\n![OspA bound to the human antibody fragment 227-1 in Nanome](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Flyme_antibodies_2_3b69c62217.png)\n\nThat’s good news! We know that that region is conserved across different Borrelia variants, which means that antibody 227-1 has the potential to be protective against multiple strains. Plus, we visualized the intricate network of interactions between 227-1 and OspA with the [Chemical Interaction](https:\u002F\u002Fdocs.nanome.ai\u002Fplugins\u002Fcheminteractions.html#instructions) plugin, which helped to explain the tight binding. \n\nWe then pulled up another outer surface protein OspC, bound to the antibody fragment B5. Even though B5 was characterized years ago, until now, we didn’t know where it binds to OspC.\n\nDuring our VR session, Dr. Rudolph solved and presented for the first time the characterization of the epitope recognised by the [protective antibody B5](https:\u002F\u002Fwww.biorxiv.org\u002Fcontent\u002F10.1101\u002F2022.11.28.518297v1). 
\n\n![Comparing Lyme structures in Virtual Reality](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Flyme_antibodies_3_89598ab698.png)\n\nWe compared OspC-type A structures to OspC-type B with the Superimpose Proteins plugin, noting how specific residues in the epitope-binding region were not conserved, thus preventing the cross-reactivity of B5.\n\n**Moving Forward**\n\nBy gaining a deeper understanding of the interactions between protective antibodies and Osp proteins, Dr. Rudolph's team hopes to aid in the rational design of Osp-based vaccines and therapeutics. While a vaccine for Lyme disease may not exist yet, structural biology is providing valuable insights and potential solutions for prevention and treatment.\n\nView the full in-VR video of Nanome scientists exploring OspA and OspC antibody complexes [here](https:\u002F\u002Fyoutu.be\u002F_Zwi4p_siO0).\n\n_Additional Resources:_\n_https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fpmc\u002Farticles\u002FPMC8159683\u002F_\n_https:\u002F\u002Fwww.biorxiv.org\u002Fcontent\u002F10.1101\u002F2022.11.28.518297v1_\n_https:\u002F\u002Fpubmed.ncbi.nlm.nih.gov\u002F36000876\u002F_\n\n_Lyme vaccines in clinical trials:_\n_https:\u002F\u002Fclinicaltrials.gov\u002Fct2\u002Fresults?cond=Lyme+Disease&term=vaccine&cntry=&state=&city=&dist=&Search=Search_\n\n\n","2023-04-04T10:38:23.088Z","2026-03-24T17:53:27.417Z","2023-04-04T11:00:30.792Z","2023-04-05","20 years and a vaccine for Lyme Disease is still in the works, but structural biology has another trick up its sleeve: Antibody Prophylaxis! We met with Dr. 
Mike Rudolph, senior scientist at NYSBC, in virtual reality to discuss his team's research on Lyme Disease.","Lyme disease, antibody prophylaxis, virtual reality, nanome, plugins","antibody-prophylaxis-for-lyme-disease",{"id":655,"attributes":656},4,{"title":657,"content":658,"createdAt":659,"updatedAt":660,"publishedAt":661,"date":662,"description":663,"keywords":664,"slug":665,"category":153},"A new era in drug discovery? The first AI-generated drug is going to clinical trial","AI is finally delivering on its promise to revolutionize drug discovery and development: a COVID-19 drug entirely designed by generative AI has been approved for human use and will soon [begin clinical trials](https:\u002F\u002Fwww.eurekalert.org\u002Fnews-releases\u002F980646?adobe_mc=MCORGID%3D242B6472541199F70A4C98A6%2540AdobeOrg%7CTS%3D1678731199) in China! \n\u003Cbr \u002F>\n\nIn 2020, the scientific community was racing to find treatments for the deadly COVID-19 virus. Because lockdowns kept scientists out of the laboratories and glued to their computers, a particular discipline boomed: computational drug discovery.\n\u003Cbr \u002F>\n\nInSilico Medicine, a clinical-stage biotech powered by generative AI, was among the companies that leveraged generative AI to join the race for a COVID-19 cure. Using their generative chemistry platform, InSilico quickly generated their [first set of drug candidates](https:\u002F\u002Finsilico.com\u002Finsiliconanome), which were then optimized inside the Nanome virtual reality platform. 
By April 2020, they had filed a patent application, and further refinement led to the discovery of ISM3312, an orally available 3CLpro covalent inhibitor and the world's first COVID-19 therapeutic created entirely by generative AI.\n\u003Cbr \u002F>\n\n![Scientists use Virtual Reality and AI to combat COVID-19](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FNanome_Insilico_Case_Study_8c888d3734.jpg?updated_at=2023-03-16T06:56:46.851Z) \n\u003Cbr \u002F>\n\n>“The beauty of generative AI (in drug discovery) is instead of searching for a needle in a haystack, it generates a bunch of perfect needles for you,” [Zhavoronkov, InSilico Medicine CEO, said](https:\u002F\u002Fwww.thestar.com\u002Fnews\u002Fcanada\u002F2023\u002F02\u002F23\u002Fits-perfect-worlds-first-generative-ai-designed-covid-drug-to-start-clinical-trials.html). “Then you just have to choose from those perfect needles — if you want the longer one or the one that is more stable.”\n\nWhat makes ISM3312 so groundbreaking? It is the world’s first COVID-19 therapeutic created by generative AI. In preclinical studies, the drug has shown significant potential to reduce viral load in lung tissue, decrease lung inflammation, as well as serve as a broad-spectrum therapeutic for other types of coronavirus. 
Insilico Medicine will soon initiate clinical trials of ISM3312 in China, marking a major milestone in the fight against COVID-19.\n\u003Cbr \u002F>\n\n>“Insilico fully utilized its unique features in this internal COVID R&D program by leveraging AI capabilities with R&D expertise to design an innovative 3CLpro inhibitor with novel structure and molecular backbone.” [said Feng Ren](https:\u002F\u002Fwww.eurekalert.org\u002Fnews-releases\u002F980646?adobe_mc=MCORGID%3D242B6472541199F70A4C98A6%2540AdobeOrg%7CTS%3D1678731199), co-CEO and CSO at Insilico Medicine.\n\nThe trials will explore the tolerability, safety, and pharmacokinetic profile of the drug in humans, as well as its efficacy and safety in different subgroups of patients with COVID-19, providing an alternative option for health management in the post-pandemic era. \n\u003Cbr \u002F>\n\n![Nanome and InSilico scientists look at potential non-covalent SARS-CoV-2 3C-like protease inhibitors](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fyoutube_screenshot_70f0fa9e25.PNG?updated_at=2023-03-16T06:44:46.572Z)\n\u003Cbr \u002F>\n\nHowever, the use of AI in drug discovery does not mean that scientists are becoming obsolete. Human intuition and knowledge remain key aspects of the drug discovery process. In the early stages of COVID-19 inhibitor research, the InSilico team met in [virtual reality](https:\u002F\u002Fwww.youtube.com\u002Fwatch?v=KnQqR6G00sE) with Nanome scientists to work together and fine-tune AI-generated molecules. This collaboration between AI and human experts is the future of drug discovery, and it's why platforms like Nanome are so important.\n\u003Cbr \u002F>\n\nAt Nanome, we believe that the future of drug discovery lies in the combination of artificial intelligence and human expertise. Our platform enables scientists to communicate and collaborate in real time, no matter where they are in the world. 
With the ability to seamlessly upload and securely share proprietary data, Nanome is at the forefront of the virtual reality revolution in drug discovery.\n\u003Cbr \u002F>\n\n![The future of drug discovery lies in the combination of artificial intelligence and human expertise](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fai_image_427de69014.PNG?updated_at=2023-03-16T06:44:45.629Z)","2023-03-16T06:23:01.058Z","2026-03-24T17:53:27.593Z","2023-03-16T06:59:02.942Z","2023-03-16","AI is finally delivering on its promise to revolutionize drug discovery and development: a COVID-19 drug entirely designed by generative AI has been approved for human use and will soon begin clinical trials in China! ","COVID-19, SARS-CoV-2, virtual reality, drug design, AI, generative AI, InSilico, Nanome","a-new-era-in-drug-discovery-the-first-ai-generated-drug-is-going-to-clinical-trial",{"id":667,"attributes":668},3,{"title":669,"content":670,"createdAt":671,"updatedAt":672,"publishedAt":673,"date":674,"description":675,"keywords":676,"slug":677,"category":153},"Beating the Pandemic in Virtual Reality","You can’t catch COVID-19 in virtual reality, but you can certainly help beat it!  Dr. Kovalevsky and his team at Oak Ridge National Laboratory (ORNL), are combining neutron crystallography and VR to create accurate 3D models of SARS-CoV-2 main protease (Mpro), the enzyme that allows the coronavirus to reproduce.\n\u003Cbr \u002F>\n\nHigh resolution VR models of Mpro allow scientists to virtually build and test newly designed compounds to determine how well they can bind to the catalytic site on the enzyme surface. 
This data helps the scientific community design novel drugs that inhibit Mpro.\n\u003Cbr \u002F>\n\n![Scientists at the Oak Ridge National Laboratory use Virtual Reality to create COVID-19 inhibitors](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FORNL_Case_Study_render_30766cfe14.png)\n\u003Cbr \u002F>\n\n## Why Mpro?\n\u003Cbr \u002F>\nSARS-CoV-2 main protease is an indispensable protein enzyme that enables the coronavirus to reproduce. During the process of replication, a polyprotein chain is synthesized and the main protease cleaves this chain in multiple sites, releasing functional proteins. Inhibiting the protease is vital to stopping the virus from spreading in patients with COVID-19. \n\u003Cbr \u002F>\u003Cbr \u002F>\n\nGiven its biological importance, Mpro was identified as a target for drug development early in the pandemic. Scientists modeled the 3D architecture of Mpro and studied the structure of its catalytic site to design molecules that will block the protease’s activity. Thousands of such studies have already been conducted.\n\u003Cbr \u002F>\n\n## Designing Inhibitors in VR\n\u003Cbr \u002F>\n\nUp till now, most studies of Mpro have been based on cryogenically preserved crystals. However, the team at ORNL approached the problem differently by also solving room temperature and neutron structures, which provide a much more detailed picture of the enzyme. \n\u003Cbr \u002F>\n\nDr. Kovalevsky and his team use virtual reality as their method of choice to visually analyze these complex structures and perform virtual reality-assisted small molecule design. Their approach led to the discovery of a novel chemical structure that acts as a non-covalent inhibitor of the main protease:\n\u003Cbr \u002F>\n\n>“This novel chemical structure is different than what has been previously studied by the global community and could open new avenues of research with exciting possibilities for combating SARS-CoV-2,” says first author Dr. 
Kneller. “I would have never come up with that [chemical modification] if we didn't go through VR and look at it.”\n\nDr. Kovalevsky routinely uses Nanome to analyze molecular structures, test ideas, and optimize the design of small molecule compounds. Read [our full case study](https:\u002F\u002Fmeet.nanome.ai\u002Fcase-studies#ornl) to learn how Nanome supports Dr. Kovalevsky’s research.","2022-11-01T23:00:00.855Z","2026-03-24T17:53:27.558Z","2022-11-22T00:12:18.273Z","2022-11-21","You can't catch a cold in virtual reality, but you can figure out how to stop the COVID-19 pandemic! Learn how scientists at the Oak Ridge National Laboratory designed an inhibitor for SARS-CoV-2 using virtual reality and a High Flux Isotope Reactor.","COVID-19, SARS-CoV-2, virtual reality, ORNL, drug design","beating-the-pandemic-in-virtual-reality",{"id":679,"attributes":680},2,{"title":681,"content":682,"createdAt":683,"updatedAt":684,"publishedAt":685,"date":686,"description":687,"keywords":688,"slug":689,"category":66},"Meta Quest Pro & a new version of Nanome (v1.24)","The Meta Quest Pro is a groundbreaking headset. For more information about the new headset, check out Meta's page [here](https:\u002F\u002Fwww.meta.com\u002Fquest\u002Fquest-pro\u002F).\n\n![Meta Quest Pro](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002FMeta_Quest_Pro_Featured_3a56dd399a.webp)\n\nThe new Meta Quest Pro’s passthrough mixed reality is downright exquisite. I’ve always dreamt of the day I could hold a holographic molecule or DNA structure in my hand like Iron Man. And that day is finally here! \n\nThe Quest Pro’s new pancake lenses provide a crisp view, and when you take off the optional side covers, the way your peripheral vision blends in with the headset is magical. It’s truly freeing to have high-fidelity color passthrough mixed reality in a wireless headset. I could move around confidently without worrying about stubbing my toe on a coffee table. 
I could even take notes on my whiteboard and sip my morning coffee without leaving my virtual Nanome workspace. \n\n\n## Nanome 1.24 Release Highlights:\n1. Mixed reality passthrough on all-in-one devices, such as Meta Quest Pro and Meta Quest 2 \n2. Customize your avatar to your heart’s content using 3rd party avatars (Meta Avatars 2.0)\n3. Seamlessly collaborate with automatic presenter changing (manual presenter change no longer required)\n4. Use your real-world desk as a virtual whiteboard (aka BYODesk)\n5. Teleport changes (no more accidental teleportation)\n6. Updates to Wrist Menu\n7. Access new spatial recordings without app updates via the Content Discovery Menu \n\n![Nanome 1.24 Update Banner](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fnanome_1_24_BLOG_f8d1780650.png)\n## Mixed Reality Passthrough Support for All-In-One Devices, Starting with Quest Pro and Quest 2\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FTl2ZNvgy4Po8aUYSmW\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe> \u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-Tl2ZNvgy4Po8aUYSmW\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nWe started supporting passthrough mixed reality with the Varjo XR 3 in Nanome v1.23, and we strongly believe the rise of mixed reality is the next major step for modern spatial computing headsets. As a recap, mixed reality (MR) is a user environment in which physical reality and digital content are combined in a way that enables interaction between real-world and virtual objects. Modern virtual reality headsets have built-in external-facing cameras for inside-out tracking, and these same cameras can stream real-world content to their users. 
\n\nWhile previous generations of virtual reality headsets had low-resolution black-and-white cameras exclusively for controller tracking, newer devices like the Meta Quest Pro headset are paving the way forward with higher-resolution color cameras. This enables users to comfortably see the room around them while keeping all the benefits of virtual reality headsets (e.g., wide field of view, controller tracking).\n\nPlease note: while this Nanome 1.24 update is mostly focused on the Meta Quest Pro headset and its features, some of these features will not be available on other XR devices (such as the Vive Focus 3) until later this year.\n\nWith this latest software update, we will be introducing mixed reality as the default mode for the Meta Quest 2 and Meta Quest Pro devices. \n\n\n## Custom 3rd party avatars (Meta Avatars 2.0)\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002Fsmg4NJyGPBJ61L7UXh\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-smg4NJyGPBJ61L7UXh\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nOur in-house avatars will always hold a special place in our hearts, but we’re massively increasing your avatar customizability by supporting third-party avatar systems. The first avatar system we’re adding is Meta’s Avatars 2.0 offering. \n\nThe new avatars contain tons of dedicated designs and features to amplify your feeling of presence. Using a Meta device, you can customize your avatar with a wide variety of hairstyles, looks, and outfits. Meta has even announced a fashion store for avatars! 
\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FuSMM45pSqoMcrPLhwK\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-uSMM45pSqoMcrPLhwK\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nAlternatively, users can choose from a wide range of preset avatars that Nanome has curated. Can’t find one that represents you? Reach out to us, and we’ll try to include one for you in our future releases or patches. \n\n\nCustomizable avatars will only be available on Meta devices, but you can select one of our new presets from any XR device! \n\nThese avatars also work with Nanome’s spatial recordings! They make an incredible difference in the look and feel of recordings and are sure to change their dynamic. \n\nWe look forward to hearing your impression of these new avatars!\n\n\n## Easier Collaboration through Intuitive Presenter Changes (manual presenter change no longer required)\n\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002F6fjCbQjTg9mHwy4ucI\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-6fjCbQjTg9mHwy4ucI\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nWith passthrough mixed reality, our goal is to allow you to say “here you go” and hand a molecule over to another user. In 1.24, we’re introducing a feature where the Presenter Mode switches to the user who is grabbing a structure. Please note: presenter mode will not change while another user is already grabbing the structure. This makes it significantly more natural to change presenters when you’re collaborating on molecular structures. 
We think you’ll love this new feature and can’t wait to hear how you use it to collaborate with your colleagues! \n\n## Bring your own desk (BYODesk) and use it as a whiteboard\n\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FHhfBRBeeq2NA592VaW\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-HhfBRBeeq2NA592VaW\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\n\nNow that mixed reality is here, you might feel tempted to snap virtual windows to the physical surfaces around you. As a first step in that direction, you can now use your physical desk as a virtual plane\u002Fsurface in Nanome. \n\nWhen you open the whiteboard, you'll be able to set up your desk as a virtual surface. You can grab and position the whiteboard over your desk, snap it to the grid plane, and write on it. It’s a wonderful quality of life feature, and we plan to extend this to other types of Nanome menus and even physical walls. \n\n## Teleport Changes\n\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FudvNAJrD75gbcmtVJd\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-udvNAJrD75gbcmtVJd\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\nWe’ve heard your feedback; it’s easy to accidentally teleport when you’re first getting started with Nanome. To address this, teleporting will be disabled by default. You can always reenable teleporting from the wrist menu. If you’re in passthrough mixed reality, you’ll also notice that this is what enables you to teleport around. We hope this prevents awkward moments from accidentally teleporting on top of  your colleagues! 
\n\n\n\n## Content Discovery Menu\n\n\u003Cp class=\"text-center mb-0\">\u003Ciframe src=\"https:\u002F\u002Fgiphy.com\u002Fembed\u002FA8XQ1bq3O0s37cHWPP\" width=\"480\" height=\"270\" frameBorder=\"0\" class=\"giphy-embed\" allowFullScreen>\u003C\u002Fiframe>\u003C\u002Fp>\u003Cp class=\"text-center\">\u003Ca href=\"https:\u002F\u002Fgiphy.com\u002Fgifs\u002FNanome-A8XQ1bq3O0s37cHWPP\">via GIPHY\u003C\u002Fa>\u003C\u002Fp>\nWith the launch of Nanome on the official Quest Store, we plan to make much more recorded content available. Prior to 1.24, we had to release a new patch\u002Fupdate every time we wanted to release new spatial recordings. We’ve replaced the Advanced Tutorial menu with the new Content Discovery menu, which can display new recordings as soon as we create them. Be sure to check out some of our new spatial recordings! \n\nWe are incredibly excited to bring Nanome v1.24 to the Meta Quest Pro and Meta Quest 2. Try it out as soon as you can to experience the most advanced and breathtaking scientific collaboration tool ever invented.\n\nHave a different XR device that supports mixed reality passthrough? We’re working on a new release that will extend Nanome’s mixed reality passthrough functionality to other XR devices.\n\nYou can download Nanome today and try it for free by clicking [here](https:\u002F\u002Fnanome.ai\u002Fsetup)!\n\n","2022-10-07T07:40:04.668Z","2026-03-24T17:53:27.514Z","2022-10-11T17:43:10.977Z","2022-10-11","We’re thrilled to launch version 1.24 of Nanome alongside the Meta Quest Pro!  Nanome 1.24 takes advantage of the Quest Pro’s hardware to deliver groundbreaking features such as Mixed Reality Passthrough for all-in-one devices, Meta Avatars, and more! 
","Meta Quest Pro, Nanome v1.24, mixed reality, mixed reality passthrough, Meta avatars, first impressions","meta-quest-pro-and-a-new-version-of-nanome-(v1.24)",{"id":691,"attributes":692},1,{"title":693,"content":694,"createdAt":695,"updatedAt":696,"publishedAt":697,"date":698,"description":699,"keywords":700,"slug":701,"category":153},"Spy stories & rational drug design","In 2017, at Kuala Lumpur airport, [a woman was tricked into approaching Kim Jong-un’s half brother](https:\u002F\u002Fwww.businessinsider.com\u002Fkim-jong-nam-accidental-assassin-thought-youtube-star-2019-9) and smearing some liquid on his face. Within 20 minutes, Kim Jong-nam was dead. In 1995, the [Aum Shinrikyo cult released Sarin gas in the Tokyo Metro](https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FTokyo_subway_sarin_attack), killing 13 and severely injuring 50.\n\nToxic nerve agents are the stuff of spy stories and [war crimes](https:\u002F\u002Fen.wikipedia.org\u002Fwiki\u002FGhouta_chemical_attack). But thankfully, researchers like Prof. Zoran Radić (UC San Diego) have dedicated their careers to understanding how nerve agents such as VX and Novichok target the human nervous system and designing antidotes for nerve poisons.\n\n\n## How do nerve agents work?\n\nNerve agents don’t need to be ingested to work. Inhalation of a vapor or exposure to your skin is enough to cause vomiting, incontinence, and asphyxiation. [Nerve agents disrupt your central nervous system](https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fbooks\u002FNBK539735\u002F#:~:text=%5B1%5D%20The%20primary%20role%20of,of%20pesticides%20and%20nerve%20agents.) by interfering with acetylcholinesterase (AChE), [the enzyme responsible for breaking down ACh neurotransmitters](https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fbooks\u002FNBK539735\u002F#:~:text=%5B1%5D%20The%20primary%20role%20of,of%20pesticides%20and%20nerve%20agents.) so that your receptors can receive new messages from your nervous system. 
\n\nWithout AChE, ACh accumulates in your nerve synapses, eventually causing paralysis.\n\n![3D visualization of acetylcholinesterase (AChE) in Nanome](https:\u002F\u002Fnanome-cms.s3.us-west-1.amazonaws.com\u002Fimage2_6ec5ea19ce.png)\n\n## A novel approach to designing antidotes\n\nDr. Radić uses a structure-based approach to develop small molecules for treating nerve agent poisoning – by studying the 3D architecture of biological molecules, he and his team at UCSD gain an understanding of the molecular features of the active site, and use this knowledge to design antidotes. \n\nUntil recently, the challenge was trying to understand 3D structures through 2D visualizations: **“It is like looking at molecules behind a window in 3D, you can’t get in there.”**\n\nThen, in 2017, Dr. Radić chanced upon an early prototype of Nanome, our virtual reality software that allows you to visualize, touch, and interact with molecular structures in 3D! Stepping into 3D space and manipulating and rotating AChE with his hands was a revelation:\n\n“One can get into that molecule and instantaneously recognize what [ligand] fits and what does not fit … for me it works like an instantaneous focus on the topic. I am immediately focused on my atoms, molecules and side chains. Distractions are canceled out.”\n\nProf. Radić immediately became a Nanome power user, and his feedback has helped Nanome evolve into the premier tool for 3D drug design. Read our [case study](https:\u002F\u002Fmeet.nanome.ai\u002Fcase-studies) for an in-depth dive into how Nanome has helped accelerate structure-based drug design for Prof. Radić.\n
Virtual reality allows them to visualize acetylcholinesterase (AChE), the enzyme that nerve agents like sarin or VX attack.","rational drug design, structure-based drug design, acetylcholinesterase, AChE","spy-stories-and-rational-drug-design",{"pagination":703},{"page":691,"pageSize":704,"pageCount":691,"total":705},100,58,1775871859876]