Local Reflow vs Cloud Reflow
Reflow can be interacted with in three ways:
- Local Reflow: The Reflow CLI is installed locally; the dashboard is started by executing `reflowio dashboard` and accessed by navigating to `localhost:3100` (see the example after this list).
- Hosted Reflow (Cloud Reflow): The Reflow dashboard is accessed via app.reflow.io.
- Self Hosted Reflow: The Reflow dashboard is deployed into your own account and accessed via `reflow.${your-domain}`.
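For example, starting Local Reflow from a terminal:

```sh
# Start the local dashboard, then open http://localhost:3100 in a browser
reflowio dashboard
```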
In all cases, interaction requires login, after which a realtime, collaborative dashboard is available: any changes made in Local Reflow are reflected in Cloud Reflow, and vice versa.
In general, it is recommended to use Local Reflow for any recording activities, because it is much faster and more reliable.
Recording localhost
When using Local Reflow, record a test with a starting URL pointing at locally running software, e.g. `http://localhost:8080`.
Similarly, when adding navigation actions, set the URL in the recorder pane to the desired local URL.
Replaying localhost
When accessing the dashboard via Local Reflow, all tests executed are tagged to run only on the local device.
The test results are uploaded to the cloud automatically as the run executes.
The exception to this is CLI runs: when a run is executed via `reflowio test --dry-run`, the tests are executed but no results are uploaded; the results are only available as CLI output.
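For example:

```sh
# Tests execute on this machine; results are printed to the terminal
# and are not uploaded to the cloud dashboard
reflowio test --dry-run
```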
Advantages of Local Reflow
- All images are stored in a read-through cache: images are only downloaded from the cloud if they are not already cached locally. This makes the web UI faster.
- The recording instance is pre-warmed, eliminating the need to wait when starting a new recorder.
- The recording instance is available locally, which makes all interactions with it far faster. The cloud recording instance is, in contrast, hosted in `eu-west-2` (UK/London), the distance to which contributes to latency on every interaction with it.
Limitations of Local Reflow
- Only one test can be executed concurrently when running locally; should more than one test be scheduled locally, they will be executed sequentially.
- Recording and Replaying tests can be computationally expensive (due to image comparison algorithms), which can slow down your local machine.
- Replaying tests can fail if the Node.js process runs out of memory. If this happens, re-execute the dashboard process with `NODE_OPTIONS=--max_old_space_size=4096` to increase the memory limit (see the example after this list).
- Triggered pipeline executions only execute in cloud instances.
- If internet connectivity is temporarily disrupted, the local dashboard process may need to be manually restarted before tests can be scheduled for local execution.
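For example, to restart the dashboard with a larger memory limit:

```sh
# Re-execute the local dashboard with a 4 GB heap for the Node.js process
NODE_OPTIONS=--max_old_space_size=4096 reflowio dashboard
```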
Executing local tests in Cloud Reflow
If a URL is hard-coded as `http://localhost` but the test is being recorded or executed in Cloud Reflow, the navigation will fail and the test will halt.
However, all Navigation actions can be parameterized to use the `http://localhost` URL when running locally, and a cloud URL (e.g. `https://staging.example.com`) when running in the cloud. This enables a workflow where a test can be developed/recorded against both staging and local URLs, shortening the testing and development life cycle.
To configure this, record the test first with hard-coded URLs, then edit the navigation actions to use a template string dependent on a variable.
For example, using the template string `${eval(IS_LOCAL) ? 'http://localhost:3000' : 'https://staging.example.com'}` in a navigation will automatically require all test executions to pass an `IS_LOCAL` variable. When `IS_LOCAL` is passed in as `true` (or any other truthy value, like `1`), the URL will evaluate to `http://localhost:3000`. When `IS_LOCAL` is passed in as `false` (or any other falsy value, like `0`), the URL will evaluate to `https://staging.example.com`.
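As an illustration, local and cloud runs might then be triggered as below. Note that the `--var` flag here is a hypothetical placeholder, not a confirmed CLI option; consult the CLI help for the actual mechanism for passing variables.

```sh
# Hypothetical flag for illustration only: check `reflowio test --help`
# for the real way to supply variables
reflowio test --var IS_LOCAL=true   # navigates to http://localhost:3000
reflowio test --var IS_LOCAL=false  # navigates to https://staging.example.com
```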
Enterprise: Self Hosting Reflow
Reflow can be self-hosted in your AWS Account. This ensures that your data never leaves your AWS Account.
This is only available with an Enterprise license. Once you have purchased a license, you will need to temporarily grant us rights to assume a role in your account with `AdministratorAccess` permissions.
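A minimal sketch of granting that access using the AWS CLI; the role name and the `REFLOW_ACCOUNT_ID` placeholder are illustrative, and the real values will be agreed during onboarding:

```sh
# Trust policy allowing Reflow's AWS account (placeholder ID) to assume the role
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "AWS": "arn:aws:iam::REFLOW_ACCOUNT_ID:root" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Create the role and attach AdministratorAccess (revoke once setup is complete)
aws iam create-role --role-name ReflowDeploy \
  --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name ReflowDeploy \
  --policy-arn arn:aws:iam::aws:policy/AdministratorAccess
```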
We will then configure (with CDK):
* A CloudFront distribution to host the web application
* A Cognito User Pool to store users
* A set of DynamoDB Tables to hold application data
* A set of ECS Clusters in regions of your choice to host the recording component
* A set of S3 Buckets to store test run snapshots and static files
* A set of AWS Lambda Functions to orchestrate everything
We will then import your data from your existing Reflow team into your new environment.