
[16.0][MIG] storage_backend_s3: Migration to 16.0 #270

Closed · wants to merge 49 commits
Commits (49)
6a3ee4c
[REF] Several refactor for splitting amazon storage
sebastienbeau Apr 10, 2018
2276f2e
[REF] split sftp backend in a separated module
sebastienbeau Apr 11, 2018
03757bc
[REF] rename method store and retrieve by more explicit method add/ge…
sebastienbeau Apr 11, 2018
55a429d
[REF] refactor test in order to use the same test between the differe…
sebastienbeau Apr 11, 2018
90236c0
[IMP] add method for listing directory and deleting file on storage.b…
sebastienbeau Apr 13, 2018
758fc60
[REF] set all module to the category storage
sebastienbeau Apr 17, 2018
34c86eb
[FIX] clean with pre-commit and pep 8
bguillot Apr 10, 2019
d8e9649
[REF] refactor test for checking access right and refactor S3 testing
sebastienbeau Apr 11, 2019
c398ba4
[IMP] add tests and support pilbox for thumbnail
bguillot Apr 12, 2019
e095e47
[12.0] storage*: Make installable False
rousseldenis Jun 7, 2019
e8ceacb
[FIX] __manifest__: Uses github repo url as website and add OCA into …
lmignon Sep 24, 2019
05992e1
pre-commit, black, isort
sbidoul Oct 1, 2019
78f8866
[MIG] storage_backend_s3: Migration to 12.0
simahawk Oct 14, 2019
2585803
storage_backend_s3: improvements
simahawk Nov 2, 2019
c7d668d
[UPD] Update storage_backend_s3.pot
oca-travis Nov 4, 2019
29634e7
[ADD] icon.png
OCA-git-bot Nov 4, 2019
3ca3e71
S3: add file ACL control
simahawk Nov 22, 2019
2e5f901
Fix runbot warning on clashing labels
simahawk Nov 22, 2019
0624629
Add server_env support
simahawk Nov 22, 2019
c22db6c
[UPD] Update storage_backend_s3.pot
oca-travis Nov 25, 2019
ba9f1bd
storage_backend_s3 12.0.2.0.0
OCA-git-bot Nov 25, 2019
348ff3d
[IMP] storage_backend_s3: black, isort, prettier
JasminSForgeFlow Jul 21, 2022
f502397
[MIG] storage_backend: Migration to 13.0
Oct 22, 2019
812a87b
[NEW] Make s3 compatible with services that need region_name eg: scal…
mileo Nov 11, 2019
c91dfd4
[NEW] Test fake s3
mileo Nov 12, 2019
0a78a29
storage_backend_s3: fix other region name handling
simahawk Jan 16, 2020
1b7536e
S3 aws_other_region: support env override
simahawk Jan 16, 2020
4fa6599
[UPD] Update storage_backend_s3.pot
oca-travis Jan 16, 2020
f86cc5f
pre-commit update
OCA-git-bot Mar 14, 2020
eb0aa94
storage_backend: run permission tests explicitely
simahawk Oct 29, 2020
635005b
storage_backend_s3 bump 13.0.1.1.0
simahawk Nov 23, 2020
5728d2b
[ADD] add new V14 config
sebastienbeau Dec 6, 2020
db722ae
[IMP] all: black, isort, prettier
sebastienbeau Dec 6, 2020
e9cb4df
[MIG] batch migration of modules
sebastienbeau Dec 6, 2020
e753210
storage_backend_s3 14.0.1.0.1
OCA-git-bot Mar 1, 2021
cc281b3
[UPD] Update storage_backend_s3.pot
oca-travis Jun 9, 2021
8a48766
[CHG] storage: Use more permissive licence: AGPL-> LGPL
etobella Mar 10, 2021
7e0546b
storage_backend_s3 14.0.2.0.0
OCA-git-bot Aug 2, 2021
bd1f443
storage_s3: fix aws regions lookup to load once
simahawk Feb 3, 2021
8eec98e
storage_backend_s3 14.0.2.0.1
OCA-git-bot Nov 30, 2021
1212e25
[UPD] Reflect boto3 version issue in readme
Mat-moran Dec 1, 2021
3ffc0d0
[MIG] storage_backend_s3: Migration to 15.0
JasminSForgeFlow Jul 21, 2022
9fc0528
[UPD] Update storage_backend_s3.pot
Oct 18, 2022
c33e2c4
[UPD] README.rst
OCA-git-bot Oct 18, 2022
9ae145b
[IMP] boto3 version bump
MiquelRForgeFlow Oct 19, 2022
8e9d999
[UPD] README.rst
OCA-git-bot Oct 20, 2022
b7ffff1
storage_backend_s3 15.0.1.0.1
OCA-git-bot Oct 20, 2022
64708c0
[IMP] storage_backend_s3: pre-commit stuff
Aug 8, 2023
842bcf6
[MIG] storage_backend_s3: Migration to 16.0
Aug 9, 2023
3 changes: 3 additions & 0 deletions requirements.txt
@@ -0,0 +1,3 @@
# generated from manifests external_dependencies
boto3<=1.15.18
vcrpy-unittest
1 change: 1 addition & 0 deletions setup/storage_backend_s3/odoo/addons/storage_backend_s3
6 changes: 6 additions & 0 deletions setup/storage_backend_s3/setup.py
@@ -0,0 +1,6 @@
import setuptools

setuptools.setup(
setup_requires=['setuptools-odoo'],
odoo_addon=True,
)
80 changes: 80 additions & 0 deletions storage_backend_s3/README.rst
@@ -0,0 +1,80 @@
==================
Storage Backend S3
==================

.. !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/licence-LGPL--3-blue.png
:target: http://www.gnu.org/licenses/lgpl-3.0-standalone.html
:alt: License: LGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fstorage-lightgray.png?logo=github
:target: https://github.com/OCA/storage/tree/15.0/storage_backend_s3
:alt: OCA/storage
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/storage-15-0/storage-15-0-storage_backend_s3
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runbot-Try%20me-875A7B.png
:target: https://runbot.odoo-community.org/runbot/275/15.0
:alt: Try me on Runbot

|badge1| |badge2| |badge3| |badge4| |badge5|

Adds the ability to store and retrieve data from Amazon S3 in your storage backend.

**Table of contents**

.. contents::
:local:

Known issues / Roadmap
======================

There is a compatibility issue between the latest versions of `boto3` and `urllib3`:
`boto3` must be pinned to `boto3<=1.15.18`. See https://github.com/OCA/storage/issues/67.
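The `<=` pin above can be verified programmatically. A minimal sketch, assuming dotted numeric version strings (the helper names are illustrative and not part of the module):

```python
def version_tuple(version):
    """Parse a dotted version string like "1.15.18" into a comparable int tuple."""
    return tuple(int(part) for part in version.split("."))


def boto3_pin_satisfied(installed_version, pin="1.15.18"):
    """Return True if the installed boto3 version respects the <= pin."""
    return version_tuple(installed_version) <= version_tuple(pin)
```

Tuple comparison handles multi-digit components correctly (e.g. ``1.9.0`` vs ``1.15.18``), which naive string comparison would not.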

Bug Tracker
===========

Bugs are tracked on `GitHub Issues <https://github.com/OCA/storage/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us smash it by providing detailed and welcome
`feedback <https://github.com/OCA/storage/issues/new?body=module:%20storage_backend_s3%0Aversion:%2015.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.

Do not contact contributors directly about support or help with technical issues.

Credits
=======

Authors
~~~~~~~

* Akretion

Contributors
~~~~~~~~~~~~

* Sebastien Beau <[email protected]>
* Raphaël Reverdy <[email protected]>

Maintainers
~~~~~~~~~~~

This module is maintained by the OCA.

.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org

OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.

This module is part of the `OCA/storage <https://github.com/OCA/storage/tree/15.0/storage_backend_s3>`_ project on GitHub.

You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
2 changes: 2 additions & 0 deletions storage_backend_s3/__init__.py
@@ -0,0 +1,2 @@
from . import models
from . import components
17 changes: 17 additions & 0 deletions storage_backend_s3/__manifest__.py
@@ -0,0 +1,17 @@
# Copyright 2017 Akretion (http://www.akretion.com).
# @author Sébastien BEAU <[email protected]>
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl).

{
"name": "Storage Backend S3",
"summary": "Implement Amazon S3 Storage",
"version": "16.0.1.0.0",
"category": "Storage",
"website": "https://github.com/OCA/storage",
"author": "Akretion, Odoo Community Association (OCA)",
"license": "LGPL-3",
"installable": True,
"external_dependencies": {"python": ["boto3<=1.15.18", "vcrpy-unittest"]},
"depends": ["storage_backend"],
"data": ["views/backend_storage_view.xml"],
}
1 change: 1 addition & 0 deletions storage_backend_s3/components/__init__.py
@@ -0,0 +1 @@
from . import s3_adapter
121 changes: 121 additions & 0 deletions storage_backend_s3/components/s3_adapter.py
@@ -0,0 +1,121 @@
# Copyright 2017 Akretion (http://www.akretion.com).
# @author Sébastien BEAU <[email protected]>
# Copyright 2019 Camptocamp SA (http://www.camptocamp.com).
# @author Simone Orsi <[email protected]>
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl).

import io
import logging

from odoo import _, exceptions

from odoo.addons.component.core import Component

_logger = logging.getLogger(__name__)

try:
import boto3
from botocore.exceptions import ClientError, EndpointConnectionError

except ImportError as err: # pragma: no cover
_logger.debug(err)


class S3StorageAdapter(Component):
_name = "s3.adapter"
_inherit = "base.storage.adapter"
_usage = "amazon_s3"

def _aws_bucket_params(self):
params = {
"aws_access_key_id": self.collection.aws_access_key_id,
"aws_secret_access_key": self.collection.aws_secret_access_key,
}
if self.collection.aws_host:
params["endpoint_url"] = self.collection.aws_host

if self.collection.aws_region:
if self.collection.aws_region != "other":
params["region_name"] = self.collection.aws_region
elif self.collection.aws_other_region:
params["region_name"] = self.collection.aws_other_region
return params

def _get_bucket(self):
params = self._aws_bucket_params()
s3 = boto3.resource("s3", **params)
bucket_name = self.collection.aws_bucket
bucket = s3.Bucket(bucket_name)
exists = True
try:
s3.meta.client.head_bucket(Bucket=bucket_name)
except ClientError as e:
# If a client error is thrown, then check that it was a 404 error.
# If it was a 404 error, then the bucket does not exist.
error_code = e.response["Error"]["Code"]
if error_code == "404":
exists = False
except EndpointConnectionError as error:
# log verbose error from s3, return short message for user
_logger.exception("Error during connection on S3")
raise exceptions.UserError(str(error)) from error
region_name = params.get("region_name")
if not exists:
if not region_name:
bucket = s3.create_bucket(Bucket=bucket_name)
else:
bucket = s3.create_bucket(
Bucket=bucket_name,
CreateBucketConfiguration={"LocationConstraint": region_name},
)
return bucket

def _get_object(self, relative_path=None):
bucket = self._get_bucket()
path = None
if relative_path:
path = self._fullpath(relative_path)
return bucket.Object(key=path)

def add(self, relative_path, bin_data, mimetype=None, **kwargs):
s3object = self._get_object(relative_path)
file_params = self._aws_upload_fileobj_params(mimetype=mimetype, **kwargs)
with io.BytesIO() as fileobj:
fileobj.write(bin_data)
fileobj.seek(0)
try:
s3object.upload_fileobj(fileobj, **file_params)
except ClientError as error:
# log verbose error from s3, return short message for user
_logger.exception("Error during storage of the file %s", relative_path)
raise exceptions.UserError(
_("The file could not be stored: %s") % str(error)
) from error

def _aws_upload_fileobj_params(self, mimetype=None, **kw):
extra_args = {}
if mimetype:
extra_args["ContentType"] = mimetype
if self.collection.aws_cache_control:
extra_args["CacheControl"] = self.collection.aws_cache_control
if self.collection.aws_file_acl:
extra_args["ACL"] = self.collection.aws_file_acl
if extra_args:
return {"ExtraArgs": extra_args}
return {}

def get(self, relative_path):
s3object = self._get_object(relative_path)
return s3object.get()["Body"].read()

def list(self, relative_path):
bucket = self._get_bucket()
dir_path = self.collection.directory_path or ""
return [
o.key.replace(dir_path, "").lstrip("/")
for o in bucket.objects.filter(Prefix=dir_path)
]

def delete(self, relative_path):
s3object = self._get_object(relative_path)
s3object.delete()
116 changes: 116 additions & 0 deletions storage_backend_s3/i18n/storage_backend_s3.pot
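The `_aws_upload_fileobj_params` helper above builds the `ExtraArgs` dict that boto3's `upload_fileobj` accepts, emitting it only when at least one option is configured. The same logic as a standalone pure function (a sketch for illustration; the function name and keyword arguments are not part of the module's API):

```python
def build_upload_params(mimetype=None, cache_control=None, file_acl=None):
    """Mirror of the adapter's ExtraArgs construction: set only the keys
    that were configured, and omit "ExtraArgs" entirely when none are."""
    extra_args = {}
    if mimetype:
        extra_args["ContentType"] = mimetype
    if cache_control:
        extra_args["CacheControl"] = cache_control
    if file_acl:
        extra_args["ACL"] = file_acl
    return {"ExtraArgs": extra_args} if extra_args else {}
```

Returning `{}` rather than `{"ExtraArgs": {}}` matters: passing an empty `ExtraArgs` would still be accepted by boto3, but omitting it keeps the `upload_fileobj(fileobj, **file_params)` call identical to a plain upload when no options are set.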
@@ -0,0 +1,116 @@
# Translation of Odoo Server.
# This file contains the translation of the following modules:
# * storage_backend_s3
#
msgid ""
msgstr ""
"Project-Id-Version: Odoo Server 15.0\n"
"Report-Msgid-Bugs-To: \n"
"Last-Translator: \n"
"Language-Team: \n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: \n"
"Plural-Forms: \n"

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_host
msgid "AWS Host"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_access_key_id
msgid "Access Key ID"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__backend_type__amazon_s3
#: model_terms:ir.ui.view,arch_db:storage_backend_s3.storage_backend_view_form
msgid "Amazon S3"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_cache_control
msgid "Aws Cache Control"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_file_acl
msgid "Aws File Acl"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__backend_type
msgid "Backend Type"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_bucket
msgid "Bucket"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,help:storage_backend_s3.field_storage_backend__aws_host
msgid "If you are using a different host than standard AWS ones, eg: Exoscale"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_other_region
msgid "Other region"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_region
msgid "Region"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_secret_access_key
msgid "Secret Access Key"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model,name:storage_backend_s3.model_storage_backend
msgid "Storage Backend"
msgstr ""

#. module: storage_backend_s3
#: code:addons/storage_backend_s3/components/s3_adapter.py:0
#, python-format
msgid "The file could not be stored: %s"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__authenticated-read
msgid "authenticated-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__aws-exec-read
msgid "aws-exec-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__bucket-owner-full-control
msgid "bucket-owner-full-control"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__bucket-owner-read
msgid "bucket-owner-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__private
msgid "private"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__public-read
msgid "public-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__public-read-write
msgid "public-read-write"
msgstr ""
1 change: 1 addition & 0 deletions storage_backend_s3/models/__init__.py
@@ -0,0 +1 @@
from . import storage_backend