I'm working on creating an Elasticsearch domain with log groups, and I have noticed that when using the AWS console, a CloudWatch resource policy is created by default.
When creating the domain through Terraform, however, there is no parameter to either use an existing policy or create a new one.
So I just defined my own resource policy:
```hcl
data "aws_iam_policy_document" "elasticsearch-log-publishing-policy" {
  statement {
    actions = [
      "logs:CreateLogStream",
      "logs:PutLogEvents",
      "logs:PutLogEventsBatch",
    ]

    resources = ["arn:aws:logs:*"]

    principals {
      identifiers = ["es.amazonaws.com"]
      type        = "Service"
    }
  }
}

resource "aws_cloudwatch_log_resource_policy" "elasticsearch-log-publishing-policy" {
  policy_document = "${data.aws_iam_policy_document.elasticsearch-log-publishing-policy.json}"
  policy_name     = "elasticsearch-log-publishing-policy"
}
```
But I am getting the error below:
```
11:58:07 * aws_cloudwatch_log_resource_policy.elasticsearch-log-publishing-policy: Writing CloudWatch log resource policy failed: LimitExceededException: Resource limit exceeded.
11:58:07 * aws_elasticsearch_domain.es2: 1 error(s) occurred:
```
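For context on the error: CloudWatch Logs limits an account to 10 resource policies per region, and `LimitExceededException` is what `PutResourcePolicy` returns once that quota is full. A minimal sketch of that check, assuming the policy names would in practice come from boto3's `logs.describe_resource_policies()` (the helper name `can_create_policy` is hypothetical, for illustration only):

```python
# CloudWatch Logs quota: at most 10 resource policies per region per
# account. Exceeding it causes PutResourcePolicy to fail with
# LimitExceededException, which is what Terraform surfaces here.
MAX_RESOURCE_POLICIES = 10

def can_create_policy(existing_policy_names):
    """Return True if another resource policy can still be created
    in this region without hitting the quota."""
    return len(existing_policy_names) < MAX_RESOURCE_POLICIES

# A region that already has 10 policies (e.g. ones the AWS console
# created automatically) is at the quota, so creating one more fails.
full_region = [f"policy-{i}" for i in range(10)]
print(can_create_policy(full_region))   # False: quota reached

# With fewer existing policies the Terraform apply would succeed.
print(can_create_policy(["AWSLogDeliveryWrite"]))  # True
```

If the quota is full, deleting unused policies (or reusing one broad policy for all domains instead of one per domain) frees up room.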
Can someone please help me figure out how to proceed?