Deploying a Secure Tableau Server on AWS
This walkthrough is divided into four parts:
- Install Prereqs (20 minutes)
- Deploy (5 minutes)
- Connect and Test (10 minutes)
- Shut It Down (1 minute)
Time Requirement: <45 minutes (total)
Install Prereqs
Time Requirement: 20 minutes
Overview: This section walks through installing the required software and storing your AWS-related credentials correctly. These steps only need to be done once. Complete this section before continuing to the next one.
- Install Dev Tools
  - Install the minimal toolset from Command Prompt as administrator (via Chocolatey):
    @"%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe" -NoProfile -InputFormat None -ExecutionPolicy Bypass -Command " [System.Net.ServicePointManager]::SecurityProtocol = 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))" && SET "PATH=%PATH%;%ALLUSERSPROFILE%\chocolatey\bin"
    choco install -y awscli terraform vscode
  - Or install the full set of recommended tools using the Windows Development QuickStart.
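To confirm the tools landed on your PATH (an optional quick check; you may need to open a new Command Prompt first so the updated PATH is picked up):

    REM Each command should print a version rather than "is not recognized".
    choco --version
    aws --version
    terraform -version
    code --version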
- Clone the git repo: dataops-tools
  - The following commands will clone the dataops-tools repo into c:\Files\Source\dataops-tools:
    cd \ && mkdir Files & cd Files & mkdir Source & cd Source
    git clone https://github.com/slalom-ggp/dataops-tools.git
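As an optional sanity check, you can confirm the clone succeeded and see which commit you are on:

    REM Prints the most recent commit of the freshly cloned repo.
    git -C c:\Files\Source\dataops-tools log -1 --oneline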
- Customize deployment settings
  - In the newly cloned repo, edit infra/config.yml to match your desired setup:
    cd C:\Files\Source\dataops-tools\infra
    code config.yml
  - Edit or confirm the following settings:
    - tableau_windows_servers: 1
    - tableau_linux_servers: 1
    - region: us-east-2 (match your selected region if not using the default)
    - project_shortname: preferably something short and unique, like AlexTableau or TableauTest (special characters not allowed)
    - prefer_fargate: true (otherwise this will cause charges related to an always-on ECS cluster)
- Configure AWS on your local machine
  - Follow these instructions from Amazon on how to install and configure the AWS CLI on your local development machine.
  - As part of this install, you should create one text file called credentials and one called config in an .aws subfolder within your local user profile folder.
    - The ~/.aws/credentials file should look like:
      [default]
      aws_access_key_id = asdf1234******
      aws_secret_access_key = asdf1234*****
    - The ~/.aws/config file should look like:
      [default]
      region = us-west-2
    - To find the profile folder on Windows: Win+R > %USERPROFILE% > <ENTER>
  - If you do not yet have a user credential pair, you will need to create a new AWS user before you can continue to the next step.
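If you prefer not to create the two files by hand, the AWS CLI can generate them for you; a minimal sketch of that route:

    REM Prompts for access key, secret key, default region, and output format,
    REM then writes the credentials and config files under ~/.aws for the default profile.
    aws configure

    REM Optional sanity check: confirms the stored credentials are valid.
    aws sts get-caller-identity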
Deploy
Time Requirement: 5 minutes
Overview: This section covers the configuration and deployment of the Terraform "Infrastructure as Code" (IaC).
- Confirm the settings configured in the previous step.
  cd C:\Files\Source\dataops-tools\infra & code config.yml
- Deploy the infrastructure using Terraform.
  cd C:\Files\Source\dataops-tools\infra
  terraform init
  terraform plan
  terraform apply -auto-approve
- Watch and wait...
  - The terraform apply statement should take 2-3 minutes, and afterwards will print login information for how to connect to the AWS instances. If you need to see these again, simply run terraform output, which will print the outputs again without changing the deployment (see the example below).
  - After the instance is started, Tableau takes another 15-20 minutes to fully configure and install. Go get a coffee, and/or watch the logs by jumping ahead to the next step and following the banner instructions to monitor (aka "tail") the setup logs.
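For example, to reprint the connection details without touching the deployment (output names per the Connect and Test section below):

    cd C:\Files\Source\dataops-tools\infra
    REM Print all outputs again.
    terraform output
    REM Or print a single output by name (add -raw on Terraform 0.14+ to drop the quotes).
    terraform output tableau_server_windows_rdp_command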
Connect and Test
Expected Time: 10 minutes
Overview: After terraform apply has executed successfully (2-3 minutes) and the Tableau install scripts have completed (another 15-20 minutes), you should be able to log into the new EC2 instance and complete server setup.
- Connect to the remote EC2 instance.
  - After running either terraform apply or terraform output, copy the tableau_server_windows_rdp_command (Windows) or tableau_server_linux_ssh_command (Linux) value.
  - Paste the command into a terminal on your local machine and run it. This will connect you to the new Tableau server via an RDP session (Windows) or an SSH terminal session (Linux).
  - Once logged in, follow the instructions in the banner message to tail the install log. If everything runs successfully, you will see two marker files in the scripts folder: _INIT_STARTED_ and _INIT_COMPLETE_. (You can check the timestamps on these files to calculate how long the scripts ran, generally around 16 minutes.)
- Connect and test from your local workstation
  - After the automated install completes, you should be able to connect to Tableau Server via the provided public IP address. By default, TSM uses https://[my_ip]:8850 and Tableau Server itself uses http://[my_ip].
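If you just want to confirm that the endpoints are reachable before opening a browser, a rough check from your local terminal could look like this (assumes curl is available, as it is on recent Windows builds; replace [my_ip] with the public IP from terraform output):

    REM -k skips certificate validation, since TSM typically ships with a self-signed certificate.
    curl -k -I https://[my_ip]:8850
    REM Expect an HTTP response (possibly a redirect to the login page) once setup has finished.
    curl -I http://[my_ip]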
Shut It Down
Expected Time: 1 minute
Overview: Once your testing is complete, don't forget to shut down the resources to avoid a large AWS bill. Luckily, it only takes a single command to shut everything off.
- Run terraform destroy OR reduce instance counts (see the sketch below).
  cd c:\Files\Source\dataops-tools\infra
  terraform destroy -auto-approve
- You're done! (There is no step 2.) 😎
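If you would rather scale down than destroy everything, a sketch of the alternative (assuming the instance-count settings shown earlier drive the server resources):

    cd c:\Files\Source\dataops-tools\infra
    REM In config.yml, set tableau_windows_servers and tableau_linux_servers to 0.
    code config.yml
    REM Apply the reduced counts; Terraform should remove just the Tableau server instances.
    terraform apply -auto-approve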
- Repeat from the Deploy step to start again.
- The dataops-tools repo and this one (dataops-quickstart) are both public, and GitHub provides two mechanisms to help us continually improve:
  - Pull Requests - for code contributions
  - Issues - for submitting bugs and proposed enhancements
Additional Information
This section contains additional information which might be helpful, but is not required for all users or use cases.
Understanding how this works
- To get a better understanding of how this deploys a full environment in Terraform, explore the code files in the infra folder of the dataops-tools repo, starting with the files infra/main.tf and infra/components/aws-tableau/main.tf.
- For information specifically on the Tableau Server config, see infra/components/aws-tableau/bootstrap.bat (Windows) and infra/components/aws-tableau/bootstrap.sh (Linux).
Testing different versions of the setup scripts
Whenever you modify the setup scripts in infra/components/aws-tableau and then run terraform apply, Terraform will detect the change to the script and will automatically rebuild the environment using the updated scripts. This means you can test different script options and configurations and rebuild everything with a single command.
Proceed with Caution: While fully rebuilding the environment from scratch is extremely powerful, it also means you will be starting over each time. You will lose all settings, Tableau workbooks, and data files which you may have deployed to the server(s).
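A typical iteration loop might look like the following sketch (file paths follow the repo layout described above; terraform plan is optional but lets you preview which resources will be replaced before the rebuild):

    cd c:\Files\Source\dataops-tools\infra
    REM Edit the Windows setup script (or bootstrap.sh for the Linux servers).
    code components\aws-tableau\bootstrap.bat
    REM Optional: preview which resources Terraform will replace.
    terraform plan
    REM Rebuild the environment with the updated script.
    terraform apply -auto-approve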
Customizing the Server Config
- Log in to the instance using SSH or RDP, as described above.
- Follow the instructions in the welcome banner to locate the correct files for configuration.
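As one illustrative example (not part of the automated install), Tableau Server settings are typically adjusted with the tsm CLI once you are logged in to the instance; the key and value below are placeholders:

    REM Run on the Tableau Server instance itself.
    tsm configuration set -k <key> -v <value>
    REM Apply the pending change (this may restart some services).
    tsm pending-changes apply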
Securely storing secrets within AWS (Optional Step)
NOTE: No secrets are currently needed for the install process itself.
- Log in to AWS and navigate to the "Secrets Manager" service.
- Important: ensure you are logging into the AWS region you selected in the step above. (Secrets are regionalized by AWS, which means they are only available in the region in which they are created.)
- Select the option to "Store a new secret".
- Select "Other type of secrets" and enter the following secrets:
  AWS_ACCESS_KEY_ID: 123456***
  AWS_SECRET_ACCESS_KEY: adfc1!***
- Use the default encryption key unless you have another encryption key you prefer.
- Click "Next" and name the secret collection TableauServer/demo or similar.
- The remaining settings should use their provided defaults. Click "Next" until you have completed the wizard.
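Equivalently, if you prefer the CLI over the console wizard, something along these lines should create the same secret (the region, name, and masked values are placeholders to adjust):

    REM Secrets are regional, so target the same region you selected above.
    aws secretsmanager create-secret --region us-west-2 --name TableauServer/demo --secret-string "{\"AWS_ACCESS_KEY_ID\":\"123456***\",\"AWS_SECRET_ACCESS_KEY\":\"adfc1!***\"}"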
Selecting an AWS region for your project
Due to better pricing and availability, us-west-2 (Oregon) is generally recommended for most use cases. The next best option is usually us-east-2 (Ohio) for clients and offices running on the East Coast.
- Note: While it's good practice to put the server on the same coast as the targeted end-users, the more important latency to optimize for is the distance between your BI server and your relational database or data warehouse. For instance, if you are pulling large amounts of data from Redshift, first find out which region that instance resides in and try to match that region if at all possible. This traffic will represent the largest factor in network performance.