v0.1 (beta)

Guides

Getting started

Overview of the platform and how the pieces fit together.

TurbineX is a hosted platform for running OpenFAST wind-turbine simulations without managing your own compute infrastructure. Upload a turbine model, pick an inflow, queue a run, and pull the results back through the UI or the API.
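That upload, queue, and fetch loop can be scripted against the API. Since the endpoints aren't documented on this page, the base URL, routes, and payload fields below are illustrative assumptions — a sketch of what a client helper might look like, not the platform's actual API:

```python
# Hypothetical sketch of the queue-a-run loop. The base URL, routes,
# and payload schema are assumptions for illustration only.

API = "https://api.turbinex.example/v1"  # placeholder base URL

def study_payload(name, mode, turbine_id, inflow):
    """Assemble the JSON body for creating a study (assumed schema).

    The three modes come from the docs; the field names are guesses.
    """
    if mode not in ("single", "certification", "parametric"):
        raise ValueError(f"unknown study mode: {mode}")
    return {
        "name": name,          # a name is required in the UI
        "mode": mode,
        "turbine_id": turbine_id,
        "inflow": inflow,      # e.g. {"type": "steady", "speed_ms": 11.4}
    }

def run_url(study_id):
    """URL a client would POST to to queue the study's jobs (assumed route)."""
    return f"{API}/studies/{study_id}/run"
```

A real script would then POST `study_payload(...)` to the studies endpoint with an auth header and poll the returned job until results are ready.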

The shape of the platform

Five nouns cover most of what you'll touch:

  • Turbines — a parameter set plus the OpenFAST input files that describe one turbine. Five reference designs are auto-loaded into every org on first visit (NREL-5MW-Land, three IEA-15-240 variants — onshore / monopile / floating semi-sub — and IEA-22-280). Verified accounts can also upload a ZIP of existing OpenFAST inputs or build one from scratch in the UI. Demo rows are read-only; click Clone as New on a demo to get an editable copy.
  • Sites — a location with met-mast wind data and (offshore) wave data. Lives at /site-conditions. Two demo sites are bundled: FINO1 (offshore) and Texas Panhandle (onshore). Upload your own CSV time series or use the demos to drive certification + parametric studies. Wind profile, wind rose, statistics, and wave climate plots are auto-generated.
  • Inflow — the wind condition you want to drive the turbine with. For a single simulation this is a steady speed or a TurbSim box; for certification studies it's an IEC class.
  • DLCs — IEC 61400-1 Design Load Cases. The platform ships with the standard 1.x / 2.x / 6.x set and lets you enable only the ones that apply to your campaign.
  • Studies — a container that ties the others together for a specific analysis. Three modes: single simulation (one run), certification (one run per DLC × wind speed × seed), and parametric (sweep any axis). Each study has its own Share Space for attaching files and (optionally) a public read-only share-link.
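Certification mode's one run per DLC × wind speed × seed fans out multiplicatively, which is worth internalizing before launching a campaign. A minimal sketch of that fan-out (the DLC labels, speed bins, and seed count in the example are placeholders, not platform defaults):

```python
from itertools import product

def certification_runs(dlcs, wind_speeds, seeds):
    """Enumerate one run per DLC x wind-speed x seed combination,
    mirroring how certification mode fans a study out into jobs."""
    return [
        {"dlc": d, "wind_speed_ms": v, "seed": s}
        for d, v, s in product(dlcs, wind_speeds, seeds)
    ]

# Placeholder campaign: 3 DLCs x 3 speed bins x 2 seeds = 18 runs.
runs = certification_runs(["1.1", "1.3", "6.1"], [8.0, 12.0, 24.0], [1, 2])
```

A full IEC campaign with fine speed bins and six seeds per bin grows into hundreds of runs the same way, so it pays to check the product size before hitting Run.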

What unverified accounts can do

Until you verify your email address, the account is in demo mode: you can run one single simulation against any of the five demo turbines. Custom turbine creation, ZIP upload, certification reports, parametric sweeps, and the Usage page are all locked behind verification (the relevant endpoints return 403 and the UI hides those tiles with a lock badge). Verifying your email unlocks everything.
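A script that calls a locked endpoint while in demo mode will get that 403 back; a small client-side helper can turn it into an actionable error. Only the 403-means-unverified mapping comes from the docs — the error class and message below are this sketch's own invention:

```python
class VerificationRequired(Exception):
    """A demo-mode account hit a verification-locked endpoint."""

def check_response(status_code, body=None):
    """Map a response status to a result or a client-side error.

    Per the docs, locked endpoints return 403 for unverified accounts;
    the exception type and wording here are illustrative assumptions.
    """
    if status_code == 403:
        raise VerificationRequired("verify your email to unlock this feature")
    if status_code >= 400:
        raise RuntimeError(f"request failed with HTTP {status_code}")
    return body
```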

A typical session

  1. Dashboard is the engineer's bench — recent studies, the job queue, and a few KPI cards. Click New Study to create one; you'll pick the mode and give it a name (required).
  2. Open the study and walk the horizontal pill tabs left to right: Turbine → Inflow → Environment → DLCs (or Sweep) → Advanced → Run → Results → Share Space. Tabs adapt to the study mode (single sim is shorter; certification shows all eight).
  3. Once submitted, the Run tab shows live progress (sim-time per job, wall-clock elapsed, queue position if you're waiting). The same job stream is visible in the global Job Queue page on the sidebar.
  4. Results surfaces the OpenFAST output with a Bladed-style analysis pane — channels grouped by type, four tabs (time series, statistics, X-Y plots, power spectrum), one Export button for .outb / PNG / API snippets.
  5. Share Space attaches arbitrary files to the study (test plans, photos, post-processing notebooks). A toggle turns on a public read-only URL that anyone with the link can open — handy for sharing a study with someone who doesn't have a TurbineX account.
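The live progress shown on the Run tab can also be watched from a script by polling the job until it settles. The state names and the polling shape below are assumptions about the job API, not documented behavior:

```python
import time

def wait_for_completion(get_status, poll_s=2.0, max_polls=1000):
    """Poll a job until it leaves the queue.

    `get_status` is any zero-argument callable returning one of the
    assumed states "queued", "running", "done", or "failed"; in a real
    script it would wrap a GET against the study's job endpoint.
    """
    for _ in range(max_polls):
        state = get_status()
        if state in ("done", "failed"):
            return state
        time.sleep(poll_s)
    raise TimeoutError("job did not finish within max_polls polls")
```

Separating the transport (`get_status`) from the loop keeps the sketch testable and lets the same loop drive a single run or a whole certification batch.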

Things worth knowing up front

  • Studies are persisted server-side. Each study is a row in the database, scoped to your org. That means you'll see the same studies on any device, and anyone in your org can open them. Studies hold the assembled config (turbine, inflow, DLCs, sweep) so results remain accessible after you log out.
  • Notes are org-scoped and can be shared across the team. Use the floating Notes button to keep notes alongside any page. See Taking notes.
  • Billing is plan-based (Free / Pro / Enterprise) plus optional monthly caps. Org owners can set a USD spend cap and a token cap on the Billing tab of their profile; alerts fire by email at 50 / 80 / 100%. See Plans & limits.
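The 50 / 80 / 100% alert behavior reduces to a threshold-crossing check. The function below illustrates that logic under the assumption that each alert fires once, when spend first crosses its threshold:

```python
def crossed_thresholds(prev_spend, new_spend, cap, thresholds=(0.5, 0.8, 1.0)):
    """Return the alert fractions newly crossed when spend moves from
    prev_spend to new_spend against a cap (50 / 80 / 100% by default).

    The fire-once-on-crossing behavior is an assumption for illustration.
    """
    if cap <= 0:
        return []  # no cap configured: nothing to alert on
    return [t for t in thresholds if prev_spend < t * cap <= new_spend]
```

With a $100 cap, moving from $60 to $100 of spend would cross the 80% and 100% thresholds in a single step.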

Next steps

The fastest way to get a feel for the platform is to run a single simulation: Your first simulation. It walks you from a fresh login to a time-series chart in about five minutes.