chore: fixed merge conflicts
andverb -
r5546:50cf7822 merge v5.2.0 stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,161 b''
1 .. _config-saml-azure-ref:
2
3
4 SAML 2.0 with Azure Entra ID
5 ----------------------------
6
7 **This plugin is available only in EE Edition.**
8
9 |RCE| supports SAML 2.0 authentication with the Azure Entra ID provider. This allows
10 users to log in to RhodeCode via the SSO mechanism of an external identity provider
11 such as Azure AD. The login can be triggered either by the external IdP, or internally
12 by clicking the corresponding authentication button on the login page.
13
14
15 Configuration steps
16 ^^^^^^^^^^^^^^^^^^^
17
18 To configure Azure Entra ID SAML authentication, use the following steps:
19
20 1. From the |RCE| interface, select
21 :menuselection:`Admin --> Authentication`
22 2. Activate the `Azure Entra ID` plugin and select :guilabel:`Save`
23 3. Go to the newly available menu option called `Azure Entra ID` on the left side.
24 4. Check the `enabled` checkbox in the plugin configuration section,
25    fill in the required SAML information, and select :guilabel:`Save`. For more details,
26    see :ref:`config-saml-azure`
27
28
29 .. _config-saml-azure:
30
31
32 Example SAML Azure Entra ID configuration
33 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
34
35 Example configuration for SAML 2.0 with the Azure Entra ID provider:
36
37
38 Enabled
39 `True`:
40
41 .. note::
42 Enable or disable this authentication plugin.
43
44
45 Auth Cache TTL
46 `30`:
47
48 .. note::
49 Number of seconds to cache the authentication and permissions check response call for this plugin.
50 Useful for expensive calls like LDAP to improve the performance of the system (0 means disabled).
51
52 Debug
53 `True`:
54
55 .. note::
56 Enable or disable debug mode that shows SAML errors in the RhodeCode logs.
57
58
59 Auth button name
60 `Azure Entra ID`:
61
62 .. note::
63 Alternative authentication display name, e.g. AzureAuth, CorporateID, etc.
64
65
66 Entity ID
67 `https://sts.windows.net/APP_ID/`:
68
69 .. note::
70 Identity Provider entity/metadata URI, known as the "Microsoft Entra Identifier".
71 E.g. https://sts.windows.net/abcd-c655-dcee-aab7-abcd/
72
73 SSO URL
74 `https://login.microsoftonline.com/APP_ID/saml2`:
75
76 .. note::
77 SSO (Single Sign-On) endpoint URL of the IdP. This can be used to initialize the login. Also known as the Login URL.
78 E.g. https://login.microsoftonline.com/abcd-c655-dcee-aab7-abcd/saml2
79
80 SLO URL
81 `https://login.microsoftonline.com/APP_ID/saml2`:
82
83 .. note::
84 SLO (Single Logout) endpoint URL of the IdP. Also known as the Logout URL.
85 E.g. https://login.microsoftonline.com/abcd-c655-dcee-aab7-abcd/saml2
86
87 x509cert
88 `<CERTIFICATE_STRING>`:
89
90 .. note::
91 Identity provider public x509 certificate. It will be converted to single-line format without headers.
92 Download the raw base64 encoded certificate from the Identity provider and paste it here.
93
94 SAML Signature
95 `sha-256`:
96
97 .. note::
98 Type of algorithm to use for verification of the SAML signature on the Identity Provider side.
99
100 SAML Digest
101 `sha-256`:
102
103 .. note::
104 Type of algorithm to use for verification of the SAML digest on the Identity Provider side.
105
106 Service Provider Cert Dir
107 `/etc/rhodecode/conf/saml_ssl/`:
108
109 .. note::
110 Optional directory to store service provider certificate and private keys.
111 Expected certs for the SP should be stored in this folder as:
112
113 * sp.key Private Key
114 * sp.crt Public cert
115 * sp_new.crt Future Public cert
116
117 You can also use a separate cert to sign the metadata of the SP using:
118
119 * metadata.key
120 * metadata.crt
121
122 Expected NameID Format
123 `nameid-format:emailAddress`:
124
125 .. note::
126 The format that specifies how the NameID is sent to the service provider.
127
128 User ID Attribute
129 `user.email`:
130
131 .. note::
132 User ID Attribute name. This defines which attribute in the SAML response will be used to link accounts via a unique id.
133 Ensure this attribute is returned by Azure Entra ID, for example via the user.email claim.
134
135 Username Attribute
136 `user.username`:
137
138 .. note::
139 Username Attribute name. This defines which attribute in the SAML response will map to a username.
140
141 Email Attribute
142 `user.email`:
143
144 .. note::
145 Email Attribute name. This defines which attribute in the SAML response will map to an email address.
146
147
148
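For illustration, a minimal, hypothetical sketch of how the three attribute mappings above resolve against a parsed SAML response; the claim values below are invented examples and must match the attribute mapping actually configured in Azure Entra ID:

.. code-block:: python

    # Hypothetical attribute payload parsed from an Azure Entra ID SAML response;
    # the claim names must match the IdP attribute mapping shown below.
    saml_attributes = {
        'user.username': ['jdoe'],
        'user.email': ['jdoe@example.com'],
    }

    # With the example configuration above:
    #   User ID Attribute  = user.email    -> unique id used to link the account
    #   Username Attribute = user.username -> RhodeCode username
    #   Email Attribute    = user.email    -> RhodeCode email address
    unique_id = saml_attributes['user.email'][0]
    username = saml_attributes['user.username'][0]
    email = saml_attributes['user.email'][0]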
149 Below is an example setup from the Azure administration page that can be used with the above config.
150
151 .. image:: ../images/saml-azure-service-provider-example.png
152 :alt: Azure SAML setup example
153 :scale: 50 %
154
155
156 Below is an example attribute mapping set for the IdP provider, required by the above config.
157
158
159 .. image:: ../images/saml-azure-attributes-example.png
160 :alt: Azure SAML setup example
161 :scale: 50 % No newline at end of file
NO CONTENT: new file 100644, binary diff hidden
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,40 b''
1 |RCE| 5.1.1 |RNS|
2 -----------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2024-07-23
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13
14
15 General
16 ^^^^^^^
17
18
19
20 Security
21 ^^^^^^^^
22
23
24
25 Performance
26 ^^^^^^^^^^^
27
28
29
30
31 Fixes
32 ^^^^^
33
34 - Fixed problems with JS static files build
35
36
37 Upgrade notes
38 ^^^^^^^^^^^^^
39
40 - RhodeCode 5.1.1 is an unscheduled bugfix release to address some build issues with the 5.1 images
@@ -0,0 +1,41 b''
1 |RCE| 5.1.2 |RNS|
2 -----------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2024-09-12
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13
14
15 General
16 ^^^^^^^
17
18
19
20 Security
21 ^^^^^^^^
22
23
24
25 Performance
26 ^^^^^^^^^^^
27
28
29
30
31 Fixes
32 ^^^^^
33
34 - Fixed problems with Mercurial authentication after enabling httppostargs.
35   Currently this protocol is disabled until a proper fix is in place
36
37
38 Upgrade notes
39 ^^^^^^^^^^^^^
40
41 - RhodeCode 5.1.2 is an unscheduled bugfix release to address some build issues with the 5.1 images
@@ -0,0 +1,55 b''
1 |RCE| 5.2.0 |RNS|
2 -----------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2024-10-09
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - New artifact storage engines allowing S3-based uploads
14 - Enterprise version only: Added a security tab to the admin interface and the possibility to whitelist specific VCS client versions. Some older client versions have known security vulnerabilities; now you can disallow them.
15 - Enterprise version only: Implemented support for Azure SAML authentication
16
17
18 General
19 ^^^^^^^
20 - Bumped versions of packaging, gunicorn, orjson, zope.interface and some other requirements
21 - A few tweaks and changes to SAML plugins to allow easier setup
22 - Configs: allow JSON log format for gunicorn
23 - Configs: deprecated the old SSH wrapper command and made v2 the default one
24 - Make sure commit-caches propagate to parent repo groups
25 - Configs: moved the Git LFS path and the Mercurial largefiles path to the ini file
26
27 Security
28 ^^^^^^^^
29
30
31
32 Performance
33 ^^^^^^^^^^^
34
35 - Description escaper for better performance
36
37 Fixes
38 ^^^^^
39
40 - Email notifications not working properly
41 - Removed waitress as a default runner
42 - Fixed issue with branch permissions
43 - Ldap: fixed nested groups extraction logic
44 - Fixed possible db corruption in case of filesystem problems
45 - Cleanup and improvements to documentation
46 - Added Kubernetes deployment section to the documentation
47 - Added default value to celery result and broker
48 - Fixed broken backends function after python3 migration
49 - Explicitly disable the Mercurial web_push ssl flag to prevent errors about SSL being required
50 - VCS: fixed problems with locked repos and with branch permissions reporting
51
52 Upgrade notes
53 ^^^^^^^^^^^^^
54
55 - RhodeCode 5.2.0 is a planned major release featuring Azure SAML, a whitelist for client versions, an S3 artifacts backend, and more!
@@ -0,0 +1,46 b''
1 # Copyright (C) 2010-2024 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19 import logging
20
21 from rhodecode.apps._base import BaseAppView
22 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
23
24 log = logging.getLogger(__name__)
25
26
27 class AdminSecurityView(BaseAppView):
28
29 def load_default_context(self):
30 c = self._get_local_tmpl_context()
31 return c
32
33 @LoginRequired()
34 @HasPermissionAllDecorator('hg.admin')
35 def security(self):
36 c = self.load_default_context()
37 c.active = 'security'
38 return self._get_template_context(c)
39
40
41 @LoginRequired()
42 @HasPermissionAllDecorator('hg.admin')
43 def admin_security_modify_allowed_vcs_client_versions(self):
44 c = self.load_default_context()
45 c.active = 'security'
46 return self._get_template_context(c)
@@ -0,0 +1,269 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19 import os
20 import fsspec # noqa
21 import logging
22
23 from rhodecode.lib.ext_json import json
24
25 from rhodecode.apps.file_store.utils import sha256_safe, ShardFileReader, get_uid_filename
26 from rhodecode.apps.file_store.extensions import resolve_extensions
27 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException # noqa: F401
28
29 log = logging.getLogger(__name__)
30
31
32 class BaseShard:
33
34 metadata_suffix: str = '.metadata'
35 storage_type: str = ''
36 fs = None
37
38 @property
39 def storage_medium(self):
40 if not self.storage_type:
41 raise ValueError('No storage type set for this shard storage_type=""')
42 return getattr(self, self.storage_type)
43
44 def __contains__(self, key):
45 full_path = self.store_path(key)
46 return self.fs.exists(full_path)
47
48 def metadata_convert(self, uid_filename, metadata):
49 return metadata
50
51 def get_metadata_filename(self, uid_filename) -> tuple[str, str]:
52 metadata_file: str = f'{uid_filename}{self.metadata_suffix}'
53 return metadata_file, self.store_path(metadata_file)
54
55 def get_metadata(self, uid_filename, ignore_missing=False) -> dict:
56 _metadata_file, metadata_file_path = self.get_metadata_filename(uid_filename)
57 if ignore_missing and not self.fs.exists(metadata_file_path):
58 return {}
59
60 with self.fs.open(metadata_file_path, 'rb') as f:
61 metadata = json.loads(f.read())
62
63 metadata = self.metadata_convert(uid_filename, metadata)
64 return metadata
65
66 def _store(self, key: str, uid_key: str, value_reader, max_filesize: int | None = None, metadata: dict | None = None, **kwargs):
67 raise NotImplementedError
68
69 def store(self, key: str, uid_key: str, value_reader, max_filesize: int | None = None, metadata: dict | None = None, **kwargs):
70 return self._store(key, uid_key, value_reader, max_filesize, metadata, **kwargs)
71
72 def _fetch(self, key, presigned_url_expires: int = 0):
73 raise NotImplementedError
74
75 def fetch(self, key, **kwargs) -> tuple[ShardFileReader, dict]:
76 return self._fetch(key)
77
78 def _delete(self, key):
79 if key not in self:
80 log.exception(f'requested key={key} not found in {self}')
81 raise KeyError(key)
82
83 metadata = self.get_metadata(key)
84 _metadata_file, metadata_file_path = self.get_metadata_filename(key)
85 artifact_file_path = metadata['filename_uid_path']
86 self.fs.rm(artifact_file_path)
87 self.fs.rm(metadata_file_path)
88
89 return 1
90
91 def delete(self, key):
92 raise NotImplementedError
93
94 def store_path(self, uid_filename):
95 raise NotImplementedError
96
97
98 class BaseFileStoreBackend:
99 _shards = tuple()
100 _shard_cls = BaseShard
101 _config: dict | None = None
102 _storage_path: str = ''
103
104 def __init__(self, settings, extension_groups=None):
105 self._config = settings
106 extension_groups = extension_groups or ['any']
107 self.extensions = resolve_extensions([], groups=extension_groups)
108
109 def __contains__(self, key):
110 return self.filename_exists(key)
111
112 def __repr__(self):
113 return f'<{self.__class__.__name__}(storage={self.storage_path})>'
114
115 @property
116 def storage_path(self):
117 return self._storage_path
118
119 @classmethod
120 def get_shard_index(cls, filename: str, num_shards) -> int:
121 # Generate a hash value from the filename
122 hash_value = sha256_safe(filename)
123
124 # Convert the hash value to an integer
125 hash_int = int(hash_value, 16)
126
127 # Map the hash integer to a shard number between 1 and num_shards
128 shard_number = (hash_int % num_shards)
129
130 return shard_number
131
132 @classmethod
133 def apply_counter(cls, counter: int, filename: str) -> str:
134 """
135 Apply a counter to the filename.
136
137 :param counter: The counter value to apply.
138 :param filename: The original filename.
139 :return: The modified filename with the counter.
140 """
141 name_counted = f'{counter:d}-{filename}'
142 return name_counted
143
144 def _get_shard(self, key) -> _shard_cls:
145 index = self.get_shard_index(key, len(self._shards))
146 shard = self._shards[index]
147 return shard
148
149 def get_conf(self, key, pop=False):
150 if key not in self._config:
151 raise ValueError(
152 f"No configuration key '{key}', please make sure it exists in filestore config")
153 val = self._config[key]
154 if pop:
155 del self._config[key]
156 return val
157
158 def filename_allowed(self, filename, extensions=None):
159 """Checks if a filename has an allowed extension
160
161 :param filename: base name of file
162 :param extensions: iterable of extensions (or self.extensions)
163 """
164 _, ext = os.path.splitext(filename)
165 return self.extension_allowed(ext, extensions)
166
167 def extension_allowed(self, ext, extensions=None):
168 """
169 Checks if an extension is permitted. Both e.g. ".jpg" and
170 "jpg" can be passed in. Extension lookup is case-insensitive.
171
172 :param ext: extension to check
173 :param extensions: iterable of extensions to validate against (or self.extensions)
174 """
175 def normalize_ext(_ext):
176 if _ext.startswith('.'):
177 _ext = _ext[1:]
178 return _ext.lower()
179
180 extensions = extensions or self.extensions
181 if not extensions:
182 return True
183
184 ext = normalize_ext(ext)
185
186 return ext in [normalize_ext(x) for x in extensions]
187
188 def filename_exists(self, uid_filename):
189 shard = self._get_shard(uid_filename)
190 return uid_filename in shard
191
192 def store_path(self, uid_filename):
193 """
194 Returns absolute file path of the uid_filename
195 """
196 shard = self._get_shard(uid_filename)
197 return shard.store_path(uid_filename)
198
199 def store_metadata(self, uid_filename):
200 shard = self._get_shard(uid_filename)
201 return shard.get_metadata_filename(uid_filename)
202
203 def store(self, filename, value_reader, extensions=None, metadata=None, max_filesize=None, randomized_name=True, **kwargs):
204 extensions = extensions or self.extensions
205
206 if not self.filename_allowed(filename, extensions):
207 msg = f'filename {filename} does not allow extensions {extensions}'
208 raise FileNotAllowedException(msg)
209
210 # # TODO: check why we need this setting ? it looks stupid...
211 # no_body_seek is used in stream mode importer somehow
212 # no_body_seek = kwargs.pop('no_body_seek', False)
213 # if no_body_seek:
214 # pass
215 # else:
216 # value_reader.seek(0)
217
218 uid_filename = kwargs.pop('uid_filename', None)
219 if uid_filename is None:
220 uid_filename = get_uid_filename(filename, randomized=randomized_name)
221
222 shard = self._get_shard(uid_filename)
223
224 return shard.store(filename, uid_filename, value_reader, max_filesize, metadata, **kwargs)
225
226 def import_to_store(self, value_reader, org_filename, uid_filename, metadata, **kwargs):
227 shard = self._get_shard(uid_filename)
228 max_filesize = None
229 return shard.store(org_filename, uid_filename, value_reader, max_filesize, metadata, import_mode=True)
230
231 def delete(self, uid_filename):
232 shard = self._get_shard(uid_filename)
233 return shard.delete(uid_filename)
234
235 def fetch(self, uid_filename) -> tuple[ShardFileReader, dict]:
236 shard = self._get_shard(uid_filename)
237 return shard.fetch(uid_filename)
238
239 def get_metadata(self, uid_filename, ignore_missing=False) -> dict:
240 shard = self._get_shard(uid_filename)
241 return shard.get_metadata(uid_filename, ignore_missing=ignore_missing)
242
243 def iter_keys(self):
244 for shard in self._shards:
245 if shard.fs.exists(shard.storage_medium):
246 for path, _dirs, _files in shard.fs.walk(shard.storage_medium):
247 for key_file_path in _files:
248 if key_file_path.endswith(shard.metadata_suffix):
249 yield shard, key_file_path
250
251 def iter_artifacts(self):
252 for shard, key_file in self.iter_keys():
253 json_key = f"{shard.storage_medium}/{key_file}"
254 with shard.fs.open(json_key, 'rb') as f:
255 yield shard, json.loads(f.read())['filename_uid']
256
257 def get_statistics(self):
258 total_files = 0
259 total_size = 0
260 meta = {}
261
262 for shard, key_file in self.iter_keys():
263 json_key = f"{shard.storage_medium}/{key_file}"
264 with shard.fs.open(json_key, 'rb') as f:
265 total_files += 1
266 metadata = json.loads(f.read())
267 total_size += metadata['size']
268
269 return total_files, total_size, meta
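
The shard routing in ``BaseFileStoreBackend.get_shard_index`` above is a plain hash-modulo scheme. A minimal standalone sketch of the same idea, assuming ``sha256_safe`` (imported from ``file_store.utils`` and not shown in this diff) is equivalent to a hex SHA-256 digest of the UTF-8 encoded filename:

import hashlib


def shard_index(filename: str, num_shards: int) -> int:
    # Assumption: sha256_safe(filename) behaves like a hex SHA-256 of the UTF-8 filename.
    hash_value = hashlib.sha256(filename.encode('utf8')).hexdigest()
    # Convert the hex digest to an integer and map it onto [0, num_shards).
    return int(hash_value, 16) % num_shards


# The same filename always resolves to the same shard, so reads and writes agree.
assert shard_index('my-artifact.tar.gz', 8) == shard_index('my-artifact.tar.gz', 8)
assert 0 <= shard_index('my-artifact.tar.gz', 8) < 8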
@@ -0,0 +1,183 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19 import os
20 import hashlib
21 import functools
22 import time
23 import logging
24
25 from .. import config_keys
26 from ..exceptions import FileOverSizeException
27 from ..backends.base import BaseFileStoreBackend, fsspec, BaseShard, ShardFileReader
28
29 from ....lib.ext_json import json
30
31 log = logging.getLogger(__name__)
32
33
34 class FileSystemShard(BaseShard):
35 METADATA_VER = 'v2'
36 BACKEND_TYPE = config_keys.backend_filesystem
37 storage_type: str = 'directory'
38
39 def __init__(self, index, directory, directory_folder, fs, **settings):
40 self._index: int = index
41 self._directory: str = directory
42 self._directory_folder: str = directory_folder
43 self.fs = fs
44
45 @property
46 def directory(self) -> str:
47 """Cache directory final path."""
48 return os.path.join(self._directory, self._directory_folder)
49
50 def _write_file(self, full_path, iterator, max_filesize, mode='wb'):
51
52 # ensure dir exists
53 destination, _ = os.path.split(full_path)
54 if not self.fs.exists(destination):
55 self.fs.makedirs(destination)
56
57 writer = self.fs.open(full_path, mode)
58
59 digest = hashlib.sha256()
60 oversize_cleanup = False
61 with writer:
62 size = 0
63 for chunk in iterator:
64 size += len(chunk)
65 digest.update(chunk)
66 writer.write(chunk)
67
68 if max_filesize and size > max_filesize:
69 oversize_cleanup = True
70 # free up the copied file, and raise exc
71 break
72
73 writer.flush()
74 # Get the file descriptor
75 fd = writer.fileno()
76
77 # Sync the file descriptor to disk, helps with NFS cases...
78 os.fsync(fd)
79
80 if oversize_cleanup:
81 self.fs.rm(full_path)
82 raise FileOverSizeException(f'given file is over size limit ({max_filesize}): {full_path}')
83
84 sha256 = digest.hexdigest()
85 log.debug('written new artifact under %s, sha256: %s', full_path, sha256)
86 return size, sha256
87
88 def _store(self, key: str, uid_key, value_reader, max_filesize: int | None = None, metadata: dict | None = None, **kwargs):
89
90 filename = key
91 uid_filename = uid_key
92 full_path = self.store_path(uid_filename)
93
94 # STORE METADATA
95 _metadata = {
96 "version": self.METADATA_VER,
97 "store_type": self.BACKEND_TYPE,
98
99 "filename": filename,
100 "filename_uid_path": full_path,
101 "filename_uid": uid_filename,
102 "sha256": "", # NOTE: filled in by reader iteration
103
104 "store_time": time.time(),
105
106 "size": 0
107 }
108
109 if metadata:
110 if kwargs.pop('import_mode', False):
111 # in import mode, we don't need to compute metadata, we just take the old version
112 _metadata["import_mode"] = True
113 else:
114 _metadata.update(metadata)
115
116 read_iterator = iter(functools.partial(value_reader.read, 2**22), b'')
117 size, sha256 = self._write_file(full_path, read_iterator, max_filesize)
118 _metadata['size'] = size
119 _metadata['sha256'] = sha256
120
121 # after storing the artifacts, we write the metadata present
122 _metadata_file, metadata_file_path = self.get_metadata_filename(uid_key)
123
124 with self.fs.open(metadata_file_path, 'wb') as f:
125 f.write(json.dumps(_metadata))
126
127 return uid_filename, _metadata
128
129 def store_path(self, uid_filename):
130 """
131 Returns absolute file path of the uid_filename
132 """
133 return os.path.join(self._directory, self._directory_folder, uid_filename)
134
135 def _fetch(self, key, presigned_url_expires: int = 0):
136 if key not in self:
137 log.exception(f'requested key={key} not found in {self}')
138 raise KeyError(key)
139
140 metadata = self.get_metadata(key)
141
142 file_path = metadata['filename_uid_path']
143 if presigned_url_expires and presigned_url_expires > 0:
144 metadata['url'] = self.fs.url(file_path, expires=presigned_url_expires)
145
146 return ShardFileReader(self.fs.open(file_path, 'rb')), metadata
147
148 def delete(self, key):
149 return self._delete(key)
150
151
152 class FileSystemBackend(BaseFileStoreBackend):
153 shard_name: str = 'shard_{:03d}'
154 _shard_cls = FileSystemShard
155
156 def __init__(self, settings):
157 super().__init__(settings)
158
159 store_dir = self.get_conf(config_keys.filesystem_storage_path)
160 directory = os.path.expanduser(store_dir)
161
162 self._directory = directory
163 self._storage_path = directory # common path for all from BaseCache
164 self._shard_count = int(self.get_conf(config_keys.filesystem_shards, pop=True))
165 if self._shard_count < 1:
166 raise ValueError(f'{config_keys.filesystem_shards} must be 1 or more')
167
168 log.debug('Initializing %s file_store instance', self)
169 fs = fsspec.filesystem('file')
170
171 if not fs.exists(self._directory):
172 fs.makedirs(self._directory, exist_ok=True)
173
174 self._shards = tuple(
175 self._shard_cls(
176 index=num,
177 directory=directory,
178 directory_folder=self.shard_name.format(num),
179 fs=fs,
180 **settings,
181 )
182 for num in range(self._shard_count)
183 )
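
For orientation, a minimal, hypothetical sketch of constructing this backend directly. The configuration keys are the ones read in ``__init__`` above, while the storage path and shard count values are assumptions; the test modules further down obtain the equivalent wiring via ``store_utils.get_filestore_backend(config=...)`` instead:

from rhodecode.apps.file_store import config_keys
from rhodecode.apps.file_store.backends.filesystem import FileSystemBackend

# Hypothetical settings; the storage path value is an example only.
settings = {
    config_keys.filesystem_storage_path: '/var/opt/rhodecode_data/file_store',
    config_keys.filesystem_shards: 8,  # number of shard_NNN sub-directories
}

# Creates the base storage directory if missing; shard sub-directories are
# created lazily on first write.
backend = FileSystemBackend(settings)
print(backend.storage_path)                        # base directory for all shards
print(backend.get_shard_index('my-file.bin', 8))   # deterministic shard number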
@@ -0,0 +1,278 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import errno
19 import os
20 import hashlib
21 import functools
22 import time
23 import logging
24
25 from .. import config_keys
26 from ..exceptions import FileOverSizeException
27 from ..backends.base import BaseFileStoreBackend, fsspec, BaseShard, ShardFileReader
28
29 from ....lib.ext_json import json
30
31 log = logging.getLogger(__name__)
32
33
34 class LegacyFileSystemShard(BaseShard):
35 # legacy ver
36 METADATA_VER = 'v2'
37 BACKEND_TYPE = config_keys.backend_legacy_filesystem
38 storage_type: str = 'dir_struct'
39
40 # legacy suffix
41 metadata_suffix: str = '.meta'
42
43 @classmethod
44 def _sub_store_from_filename(cls, filename):
45 return filename[:2]
46
47 @classmethod
48 def apply_counter(cls, counter, filename):
49 name_counted = '%d-%s' % (counter, filename)
50 return name_counted
51
52 @classmethod
53 def safe_make_dirs(cls, dir_path):
54 if not os.path.exists(dir_path):
55 try:
56 os.makedirs(dir_path)
57 except OSError as e:
58 if e.errno != errno.EEXIST:
59 raise
60 return
61
62 @classmethod
63 def resolve_name(cls, name, directory):
64 """
65 Resolves a unique name and the correct path. If a filename
66 for that path already exists then a numeric prefix with values > 0 will be
67 added, for example test.jpg -> 1-test.jpg etc. initially file would have 0 prefix.
68
69 :param name: base name of file
70 :param directory: absolute directory path
71 """
72
73 counter = 0
74 while True:
75 name_counted = cls.apply_counter(counter, name)
76
77 # sub_store prefix to optimize disk usage, e.g some_path/ab/final_file
78 sub_store: str = cls._sub_store_from_filename(name_counted)
79 sub_store_path: str = os.path.join(directory, sub_store)
80 cls.safe_make_dirs(sub_store_path)
81
82 path = os.path.join(sub_store_path, name_counted)
83 if not os.path.exists(path):
84 return name_counted, path
85 counter += 1
86
87 def __init__(self, index, directory, directory_folder, fs, **settings):
88 self._index: int = index
89 self._directory: str = directory
90 self._directory_folder: str = directory_folder
91 self.fs = fs
92
93 @property
94 def dir_struct(self) -> str:
95 """Cache directory final path."""
96 return os.path.join(self._directory, '0-')
97
98 def _write_file(self, full_path, iterator, max_filesize, mode='wb'):
99
100 # ensure dir exists
101 destination, _ = os.path.split(full_path)
102 if not self.fs.exists(destination):
103 self.fs.makedirs(destination)
104
105 writer = self.fs.open(full_path, mode)
106
107 digest = hashlib.sha256()
108 oversize_cleanup = False
109 with writer:
110 size = 0
111 for chunk in iterator:
112 size += len(chunk)
113 digest.update(chunk)
114 writer.write(chunk)
115
116 if max_filesize and size > max_filesize:
117 # free up the copied file, and raise exc
118 oversize_cleanup = True
119 break
120
121 writer.flush()
122 # Get the file descriptor
123 fd = writer.fileno()
124
125 # Sync the file descriptor to disk, helps with NFS cases...
126 os.fsync(fd)
127
128 if oversize_cleanup:
129 self.fs.rm(full_path)
130 raise FileOverSizeException(f'given file is over size limit ({max_filesize}): {full_path}')
131
132 sha256 = digest.hexdigest()
133 log.debug('written new artifact under %s, sha256: %s', full_path, sha256)
134 return size, sha256
135
136 def _store(self, key: str, uid_key, value_reader, max_filesize: int | None = None, metadata: dict | None = None, **kwargs):
137
138 filename = key
139 uid_filename = uid_key
140
141 # NOTE:, also apply N- Counter...
142 uid_filename, full_path = self.resolve_name(uid_filename, self._directory)
143
144 # STORE METADATA
145 # TODO: make it compatible, and backward proof
146 _metadata = {
147 "version": self.METADATA_VER,
148
149 "filename": filename,
150 "filename_uid_path": full_path,
151 "filename_uid": uid_filename,
152 "sha256": "", # NOTE: filled in by reader iteration
153
154 "store_time": time.time(),
155
156 "size": 0
157 }
158 if metadata:
159 _metadata.update(metadata)
160
161 read_iterator = iter(functools.partial(value_reader.read, 2**22), b'')
162 size, sha256 = self._write_file(full_path, read_iterator, max_filesize)
163 _metadata['size'] = size
164 _metadata['sha256'] = sha256
165
166 # after storing the artifacts, we write the metadata present
167 _metadata_file, metadata_file_path = self.get_metadata_filename(uid_filename)
168
169 with self.fs.open(metadata_file_path, 'wb') as f:
170 f.write(json.dumps(_metadata))
171
172 return uid_filename, _metadata
173
174 def store_path(self, uid_filename):
175 """
176 Returns absolute file path of the uid_filename
177 """
178 prefix_dir = ''
179 if '/' in uid_filename:
180 prefix_dir, filename = uid_filename.split('/')
181 sub_store = self._sub_store_from_filename(filename)
182 else:
183 sub_store = self._sub_store_from_filename(uid_filename)
184
185 return os.path.join(self._directory, prefix_dir, sub_store, uid_filename)
186
187 def metadata_convert(self, uid_filename, metadata):
188 # NOTE: backward compat mode here... this is for file created PRE 5.2 system
189 if 'meta_ver' in metadata:
190 full_path = self.store_path(uid_filename)
191 metadata = {
192 "_converted": True,
193 "_org": metadata,
194 "version": self.METADATA_VER,
195 "store_type": self.BACKEND_TYPE,
196
197 "filename": metadata['filename'],
198 "filename_uid_path": full_path,
199 "filename_uid": uid_filename,
200 "sha256": metadata['sha256'],
201
202 "store_time": metadata['time'],
203
204 "size": metadata['size']
205 }
206 return metadata
207
208 def _fetch(self, key, presigned_url_expires: int = 0):
209 if key not in self:
210 log.exception(f'requested key={key} not found in {self}')
211 raise KeyError(key)
212
213 metadata = self.get_metadata(key)
214
215 file_path = metadata['filename_uid_path']
216 if presigned_url_expires and presigned_url_expires > 0:
217 metadata['url'] = self.fs.url(file_path, expires=presigned_url_expires)
218
219 return ShardFileReader(self.fs.open(file_path, 'rb')), metadata
220
221 def delete(self, key):
222 return self._delete(key)
223
224 def _delete(self, key):
225 if key not in self:
226 log.exception(f'requested key={key} not found in {self}')
227 raise KeyError(key)
228
229 metadata = self.get_metadata(key)
230 metadata_file, metadata_file_path = self.get_metadata_filename(key)
231 artifact_file_path = metadata['filename_uid_path']
232 self.fs.rm(artifact_file_path)
233 self.fs.rm(metadata_file_path)
234
235 def get_metadata_filename(self, uid_filename) -> tuple[str, str]:
236
237 metadata_file: str = f'{uid_filename}{self.metadata_suffix}'
238 uid_path_in_store = self.store_path(uid_filename)
239
240 metadata_file_path = f'{uid_path_in_store}{self.metadata_suffix}'
241 return metadata_file, metadata_file_path
242
243
244 class LegacyFileSystemBackend(BaseFileStoreBackend):
245 _shard_cls = LegacyFileSystemShard
246
247 def __init__(self, settings):
248 super().__init__(settings)
249
250 store_dir = self.get_conf(config_keys.legacy_filesystem_storage_path)
251 directory = os.path.expanduser(store_dir)
252
253 self._directory = directory
254 self._storage_path = directory # common path for all from BaseCache
255
256 log.debug('Initializing %s file_store instance', self)
257 fs = fsspec.filesystem('file')
258
259 if not fs.exists(self._directory):
260 fs.makedirs(self._directory, exist_ok=True)
261
262 # legacy system uses single shard
263 self._shards = tuple(
264 [
265 self._shard_cls(
266 index=0,
267 directory=directory,
268 directory_folder='',
269 fs=fs,
270 **settings,
271 )
272 ]
273 )
274
275 @classmethod
276 def get_shard_index(cls, filename: str, num_shards) -> int:
277 # legacy filesystem doesn't use shards, and always uses single shard
278 return 0
@@ -0,0 +1,184 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19 import os
20 import hashlib
21 import functools
22 import time
23 import logging
24
25 from .. import config_keys
26 from ..exceptions import FileOverSizeException
27 from ..backends.base import BaseFileStoreBackend, fsspec, BaseShard, ShardFileReader
28
29 from ....lib.ext_json import json
30
31 log = logging.getLogger(__name__)
32
33
34 class S3Shard(BaseShard):
35 METADATA_VER = 'v2'
36 BACKEND_TYPE = config_keys.backend_objectstore
37 storage_type: str = 'bucket'
38
39 def __init__(self, index, bucket, bucket_folder, fs, **settings):
40 self._index: int = index
41 self._bucket_main: str = bucket
42 self._bucket_folder: str = bucket_folder
43
44 self.fs = fs
45
46 @property
47 def bucket(self) -> str:
48 """Cache bucket final path."""
49 return os.path.join(self._bucket_main, self._bucket_folder)
50
51 def _write_file(self, full_path, iterator, max_filesize, mode='wb'):
52
53 # ensure dir exists
54 destination, _ = os.path.split(full_path)
55 if not self.fs.exists(destination):
56 self.fs.makedirs(destination)
57
58 writer = self.fs.open(full_path, mode)
59
60 digest = hashlib.sha256()
61 oversize_cleanup = False
62 with writer:
63 size = 0
64 for chunk in iterator:
65 size += len(chunk)
66 digest.update(chunk)
67 writer.write(chunk)
68
69 if max_filesize and size > max_filesize:
70 oversize_cleanup = True
71 # free up the copied file, and raise exc
72 break
73
74 if oversize_cleanup:
75 self.fs.rm(full_path)
76 raise FileOverSizeException(f'given file is over size limit ({max_filesize}): {full_path}')
77
78 sha256 = digest.hexdigest()
79 log.debug('written new artifact under %s, sha256: %s', full_path, sha256)
80 return size, sha256
81
82 def _store(self, key: str, uid_key, value_reader, max_filesize: int | None = None, metadata: dict | None = None, **kwargs):
83
84 filename = key
85 uid_filename = uid_key
86 full_path = self.store_path(uid_filename)
87
88 # STORE METADATA
89 _metadata = {
90 "version": self.METADATA_VER,
91 "store_type": self.BACKEND_TYPE,
92
93 "filename": filename,
94 "filename_uid_path": full_path,
95 "filename_uid": uid_filename,
96 "sha256": "", # NOTE: filled in by reader iteration
97
98 "store_time": time.time(),
99
100 "size": 0
101 }
102
103 if metadata:
104 if kwargs.pop('import_mode', False):
105 # in import mode, we don't need to compute metadata, we just take the old version
106 _metadata["import_mode"] = True
107 else:
108 _metadata.update(metadata)
109
110 read_iterator = iter(functools.partial(value_reader.read, 2**22), b'')
111 size, sha256 = self._write_file(full_path, read_iterator, max_filesize)
112 _metadata['size'] = size
113 _metadata['sha256'] = sha256
114
115 # after storing the artifacts, we write the metadata present
116 metadata_file, metadata_file_path = self.get_metadata_filename(uid_key)
117
118 with self.fs.open(metadata_file_path, 'wb') as f:
119 f.write(json.dumps(_metadata))
120
121 return uid_filename, _metadata
122
123 def store_path(self, uid_filename):
124 """
125 Returns absolute file path of the uid_filename
126 """
127 return os.path.join(self._bucket_main, self._bucket_folder, uid_filename)
128
129 def _fetch(self, key, presigned_url_expires: int = 0):
130 if key not in self:
131 log.exception(f'requested key={key} not found in {self}')
132 raise KeyError(key)
133
134 metadata_file, metadata_file_path = self.get_metadata_filename(key)
135 with self.fs.open(metadata_file_path, 'rb') as f:
136 metadata = json.loads(f.read())
137
138 file_path = metadata['filename_uid_path']
139 if presigned_url_expires and presigned_url_expires > 0:
140 metadata['url'] = self.fs.url(file_path, expires=presigned_url_expires)
141
142 return ShardFileReader(self.fs.open(file_path, 'rb')), metadata
143
144 def delete(self, key):
145 return self._delete(key)
146
147
148 class ObjectStoreBackend(BaseFileStoreBackend):
149 shard_name: str = 'shard-{:03d}'
150 _shard_cls = S3Shard
151
152 def __init__(self, settings):
153 super().__init__(settings)
154
155 self._shard_count = int(self.get_conf(config_keys.objectstore_bucket_shards, pop=True))
156 if self._shard_count < 1:
157 raise ValueError('cache_shards must be 1 or more')
158
159 self._bucket = settings.pop(config_keys.objectstore_bucket)
160 if not self._bucket:
161 raise ValueError(f'{config_keys.objectstore_bucket} needs to have a value')
162
163 objectstore_url = self.get_conf(config_keys.objectstore_url)
164 key = settings.pop(config_keys.objectstore_key)
165 secret = settings.pop(config_keys.objectstore_secret)
166
167 self._storage_path = objectstore_url # common path for all from BaseCache
168 log.debug('Initializing %s file_store instance', self)
169 fs = fsspec.filesystem('s3', anon=False, endpoint_url=objectstore_url, key=key, secret=secret)
170
171 # init main bucket
172 if not fs.exists(self._bucket):
173 fs.mkdir(self._bucket)
174
175 self._shards = tuple(
176 self._shard_cls(
177 index=num,
178 bucket=self._bucket,
179 bucket_folder=self.shard_name.format(num),
180 fs=fs,
181 **settings,
182 )
183 for num in range(self._shard_count)
184 )
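
Similarly, a hypothetical settings sketch for the object-store backend. The keys match the ones popped in ``__init__`` above, while the endpoint, bucket name, and credentials are placeholders (a local S3-compatible endpoint such as MinIO is assumed) rather than values taken from this change:

from rhodecode.apps.file_store import config_keys
from rhodecode.apps.file_store.backends.objectstore import ObjectStoreBackend

# Hypothetical settings; endpoint and credentials are placeholders and must be
# replaced with real values for a reachable S3-compatible store.
settings = {
    config_keys.objectstore_url: 'http://127.0.0.1:9000',
    config_keys.objectstore_bucket: 'rhodecode-artifacts',
    config_keys.objectstore_bucket_shards: 6,
    config_keys.objectstore_key: 's3admin',
    config_keys.objectstore_secret: 's3secret',
}

# Connecting creates the main bucket if missing; per-shard folders are created
# lazily on first write.
backend = ObjectStoreBackend(settings)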
@@ -0,0 +1,128 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import pytest
19
20 from rhodecode.apps import file_store
21 from rhodecode.apps.file_store import config_keys
22 from rhodecode.apps.file_store.backends.filesystem_legacy import LegacyFileSystemBackend
23 from rhodecode.apps.file_store.backends.filesystem import FileSystemBackend
24 from rhodecode.apps.file_store.backends.objectstore import ObjectStoreBackend
25 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
26
27 from rhodecode.apps.file_store import utils as store_utils
28 from rhodecode.apps.file_store.tests import random_binary_file, file_store_instance
29
30
31 class TestFileStoreBackends:
32
33 @pytest.mark.parametrize('backend_type, expected_instance', [
34 (config_keys.backend_legacy_filesystem, LegacyFileSystemBackend),
35 (config_keys.backend_filesystem, FileSystemBackend),
36 (config_keys.backend_objectstore, ObjectStoreBackend),
37 ])
38 def test_get_backend(self, backend_type, expected_instance, ini_settings):
39 config = ini_settings
40 config[config_keys.backend_type] = backend_type
41 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
42 assert isinstance(f_store, expected_instance)
43
44 @pytest.mark.parametrize('backend_type, expected_instance', [
45 (config_keys.backend_legacy_filesystem, LegacyFileSystemBackend),
46 (config_keys.backend_filesystem, FileSystemBackend),
47 (config_keys.backend_objectstore, ObjectStoreBackend),
48 ])
49 def test_store_and_read(self, backend_type, expected_instance, ini_settings, random_binary_file):
50 filename, temp_file = random_binary_file
51 config = ini_settings
52 config[config_keys.backend_type] = backend_type
53 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
54 metadata = {
55 'user_uploaded': {
56 'username': 'user1',
57 'user_id': 10,
58 'ip': '10.20.30.40'
59 }
60 }
61 store_fid, metadata = f_store.store(filename, temp_file, extra_metadata=metadata)
62 assert store_fid
63 assert metadata
64
65 # read-after write
66 reader, metadata2 = f_store.fetch(store_fid)
67 assert reader
68 assert metadata2['filename'] == filename
69
70 @pytest.mark.parametrize('backend_type, expected_instance', [
71 (config_keys.backend_legacy_filesystem, LegacyFileSystemBackend),
72 (config_keys.backend_filesystem, FileSystemBackend),
73 (config_keys.backend_objectstore, ObjectStoreBackend),
74 ])
75 def test_store_file_not_allowed(self, backend_type, expected_instance, ini_settings, random_binary_file):
76 filename, temp_file = random_binary_file
77 config = ini_settings
78 config[config_keys.backend_type] = backend_type
79 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
80 with pytest.raises(FileNotAllowedException):
81 f_store.store('notallowed.exe', temp_file, extensions=['.txt'])
82
83 @pytest.mark.parametrize('backend_type, expected_instance', [
84 (config_keys.backend_legacy_filesystem, LegacyFileSystemBackend),
85 (config_keys.backend_filesystem, FileSystemBackend),
86 (config_keys.backend_objectstore, ObjectStoreBackend),
87 ])
88 def test_store_file_over_size(self, backend_type, expected_instance, ini_settings, random_binary_file):
89 filename, temp_file = random_binary_file
90 config = ini_settings
91 config[config_keys.backend_type] = backend_type
92 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
93 with pytest.raises(FileOverSizeException):
94 f_store.store('toobig.exe', temp_file, extensions=['.exe'], max_filesize=124)
95
96 @pytest.mark.parametrize('backend_type, expected_instance, extra_conf', [
97 (config_keys.backend_legacy_filesystem, LegacyFileSystemBackend, {}),
98 (config_keys.backend_filesystem, FileSystemBackend, {config_keys.filesystem_storage_path: '/tmp/test-fs-store'}),
99 (config_keys.backend_objectstore, ObjectStoreBackend, {config_keys.objectstore_bucket: 'test-bucket'}),
100 ])
101 def test_store_stats_and_keys(self, backend_type, expected_instance, extra_conf, ini_settings, random_binary_file):
102 config = ini_settings
103 config[config_keys.backend_type] = backend_type
104 config.update(extra_conf)
105
106 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
107
108 # purge storage before running
109 for shard, k in f_store.iter_artifacts():
110 f_store.delete(k)
111
112 for i in range(10):
113 filename, temp_file = random_binary_file
114
115 metadata = {
116 'user_uploaded': {
117 'username': 'user1',
118 'user_id': 10,
119 'ip': '10.20.30.40'
120 }
121 }
122 store_fid, metadata = f_store.store(filename, temp_file, extra_metadata=metadata)
123 assert store_fid
124 assert metadata
125
126 cnt, size, meta = f_store.get_statistics()
127 assert cnt == 10
128 assert 10 == len(list(f_store.iter_keys()))
@@ -0,0 +1,52 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import pytest
19
20 from rhodecode.apps.file_store import utils as store_utils
21 from rhodecode.apps.file_store import config_keys
22 from rhodecode.apps.file_store.tests import generate_random_filename
23
24
25 @pytest.fixture()
26 def file_store_filesystem_instance(ini_settings):
27 config = ini_settings
28 config[config_keys.backend_type] = config_keys.backend_filesystem
29 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
30 return f_store
31
32
33 class TestFileStoreFileSystemBackend:
34
35 @pytest.mark.parametrize('filename', [generate_random_filename() for _ in range(10)])
36 def test_get_shard_number(self, filename, file_store_filesystem_instance):
37 shard_number = file_store_filesystem_instance.get_shard_index(filename, len(file_store_filesystem_instance._shards))
38 # Check that the shard number is between 0 and max-shards
39 assert 0 <= shard_number <= len(file_store_filesystem_instance._shards)
40
41 @pytest.mark.parametrize('filename, expected_shard_num', [
42 ('my-name-1', 3),
43 ('my-name-2', 2),
44 ('my-name-3', 4),
45 ('my-name-4', 1),
46
47 ('rhodecode-enterprise-ce', 5),
48 ('rhodecode-enterprise-ee', 6),
49 ])
50 def test_get_shard_number_consistency(self, filename, expected_shard_num, file_store_filesystem_instance):
51 shard_number = file_store_filesystem_instance.get_shard_index(filename, len(file_store_filesystem_instance._shards))
52 assert expected_shard_num == shard_number
@@ -0,0 +1,17 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/ No newline at end of file
@@ -0,0 +1,52 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import pytest
19
20 from rhodecode.apps.file_store import utils as store_utils
21 from rhodecode.apps.file_store import config_keys
22 from rhodecode.apps.file_store.tests import generate_random_filename
23
24
25 @pytest.fixture()
26 def file_store_legacy_instance(ini_settings):
27 config = ini_settings
28 config[config_keys.backend_type] = config_keys.backend_legacy_filesystem
29 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
30 return f_store
31
32
33 class TestFileStoreLegacyBackend:
34
35 @pytest.mark.parametrize('filename', [generate_random_filename() for _ in range(10)])
36 def test_get_shard_number(self, filename, file_store_legacy_instance):
37 shard_number = file_store_legacy_instance.get_shard_index(filename, len(file_store_legacy_instance._shards))
38 # Check that the shard number is 0 for legacy filesystem store we don't use shards
39 assert shard_number == 0
40
41 @pytest.mark.parametrize('filename, expected_shard_num', [
42 ('my-name-1', 0),
43 ('my-name-2', 0),
44 ('my-name-3', 0),
45 ('my-name-4', 0),
46
47 ('rhodecode-enterprise-ce', 0),
48 ('rhodecode-enterprise-ee', 0),
49 ])
50 def test_get_shard_number_consistency(self, filename, expected_shard_num, file_store_legacy_instance):
51 shard_number = file_store_legacy_instance.get_shard_index(filename, len(file_store_legacy_instance._shards))
52 assert expected_shard_num == shard_number
@@ -0,0 +1,52 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import pytest
19
20 from rhodecode.apps.file_store import utils as store_utils
21 from rhodecode.apps.file_store import config_keys
22 from rhodecode.apps.file_store.tests import generate_random_filename
23
24
25 @pytest.fixture()
26 def file_store_objectstore_instance(ini_settings):
27 config = ini_settings
28 config[config_keys.backend_type] = config_keys.backend_objectstore
29 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
30 return f_store
31
32
33 class TestFileStoreObjectStoreBackend:
34
35 @pytest.mark.parametrize('filename', [generate_random_filename() for _ in range(10)])
36 def test_get_shard_number(self, filename, file_store_objectstore_instance):
37 shard_number = file_store_objectstore_instance.get_shard_index(filename, len(file_store_objectstore_instance._shards))
38 # Check that the shard number is between 0 and shards
39 assert 0 <= shard_number <= len(file_store_objectstore_instance._shards)
40
41 @pytest.mark.parametrize('filename, expected_shard_num', [
42 ('my-name-1', 3),
43 ('my-name-2', 2),
44 ('my-name-3', 4),
45 ('my-name-4', 1),
46
47 ('rhodecode-enterprise-ce', 5),
48 ('rhodecode-enterprise-ee', 6),
49 ])
50 def test_get_shard_number_consistency(self, filename, expected_shard_num, file_store_objectstore_instance):
51 shard_number = file_store_objectstore_instance.get_shard_index(filename, len(file_store_objectstore_instance._shards))
52 assert expected_shard_num == shard_number
NO CONTENT: new file 100644
The requested commit or file is too big and content was truncated.
NO CONTENT: new file 100644
The requested commit or file is too big and content was truncated.
NO CONTENT: new file 100644
The requested commit or file is too big and content was truncated.
@@ -1,5 +1,5 b''
1 [bumpversion]
2 current_version = 5.1.2
2 current_version = 5.2.0
3 message = release: Bump version {current_version} to {new_version}
4
5 [bumpversion:file:rhodecode/VERSION]
@@ -1,71 +1,71 b''
1 syntax: glob
2
3 *.egg
4 *.egg-info
5 *.idea
6 *.orig
7 *.pyc
8 *.sqlite-journal
9 *.swp
10 *.tox
11 *.DS_Store*
12 rhodecode/public/js/src/components/**/*.css
13
14 syntax: regexp
15
16 #.filename
17 ^\.settings$
18 ^\.project$
19 ^\.pydevproject$
20 ^\.coverage$
21 ^\.cache.*$
22 ^\.ruff_cache.*$
23 ^\.rhodecode$
24
25 ^rcextensions
26 ^.dev
27 ^._dev
28 ^build/
29 ^coverage\.xml$
30 ^data$
31 ^\.eggs/
32 ^configs/data$
33 ^dev.ini$
34 ^acceptance_tests/dev.*\.ini$
35 ^dist/
36 ^fabfile.py
37 ^htmlcov
38 ^junit\.xml$
39 ^node_modules/
40 ^node_binaries/
41 ^pylint.log$
42 ^rcextensions/
43 ^result$
44 ^rhodecode/public/css/style.css$
45 ^rhodecode/public/css/style-polymer.css$
46 ^rhodecode/public/css/style-ipython.css$
47 ^rhodecode/public/js/rhodecode-components.html$
48 ^rhodecode/public/js/rhodecode-components.js$
49 ^rhodecode/public/js/scripts.js$
50 ^rhodecode/public/js/scripts.min.js$
51 ^rhodecode/public/js/src/components/root-styles.gen.html$
52 ^rhodecode/public/js/vendors/webcomponentsjs/
53 ^rhodecode\.db$
54 ^rhodecode\.log$
55 ^rhodecode_dev\.log$
56 ^test\.db$
57 ^venv/
58
59 # ac-tests
60 ^acceptance_tests/\.cache.*$
61 ^acceptance_tests/externals
62 ^acceptance_tests/ghostdriver.log$
63 ^acceptance_tests/local(_.+)?\.ini$
64
65 # docs
66 ^docs/_build$
67 ^docs/result$
68 ^docs-internal/_build$
69
70 # Cythonized things
71 ^rhodecode/.*\.(c|so)$
@@ -1,172 +1,158 b''
+.DEFAULT_GOAL := help
+
+# Pretty print values cf. https://misc.flogisoft.com/bash/tip_colors_and_formatting
+RESET := \033[0m # Reset all formatting
+GREEN := \033[0;32m # Resets before setting 16b colour (32 -- green)
+YELLOW := \033[0;33m
+ORANGE := \033[0;38;5;208m # Reset then set 256b colour (208 -- orange)
+PEACH := \033[0;38;5;216m
+
+
+## ---------------------------------------------------------------------------------- ##
+## ------------------------- Help usage builder ------------------------------------- ##
+## ---------------------------------------------------------------------------------- ##
+# use '# >>> Build commands' to create section
+# use '# target: target description' to create help for target
+.PHONY: help
+help:
+@echo "Usage:"
+@cat $(MAKEFILE_LIST) | grep -E '^# >>>|^# [A-Za-z0-9_.-]+:' | sed -E 's/^# //' | awk ' \
+BEGIN { \
+green="\033[32m"; \
+yellow="\033[33m"; \
+reset="\033[0m"; \
+section=""; \
+} \
+/^>>>/ { \
+section=substr($$0, 5); \
+printf "\n" green ">>> %s" reset "\n", section; \
+next; \
+} \
+/^([A-Za-z0-9_.-]+):/ { \
+target=$$1; \
+gsub(/:$$/, "", target); \
+description=substr($$0, index($$0, ":") + 2); \
+if (description == "") { description="-"; } \
+printf " - " yellow "%-35s" reset " %s\n", target, description; \
+} \
+'
+
 # required for pushd to work..
 SHELL = /bin/bash

-
-# set by: PATH_TO_OUTDATED_PACKAGES=/some/path/outdated_packages.py
-OUTDATED_PACKAGES = ${PATH_TO_OUTDATED_PACKAGES}
+# >>> Tests commands

 .PHONY: clean
-## Cleanup compiled and cache py files
+# clean: Cleanup compiled and cache py files
 clean:
 make test-clean
 find . -type f \( -iname '*.c' -o -iname '*.pyc' -o -iname '*.so' -o -iname '*.orig' \) -exec rm '{}' ';'
 find . -type d -name "build" -prune -exec rm -rf '{}' ';'


 .PHONY: test
-## run test-clean and tests
+# test: run test-clean and tests
 test:
 make test-clean
-make test-only
+unset RC_SQLALCHEMY_DB1_URL && unset RC_DB_URL && make test-only


 .PHONY: test-clean
-## run test-clean and tests
+# test-clean: run test-clean and tests
 test-clean:
 rm -rf coverage.xml htmlcov junit.xml pylint.log result
 find . -type d -name "__pycache__" -prune -exec rm -rf '{}' ';'
 find . -type f \( -iname '.coverage.*' \) -exec rm '{}' ';'


 .PHONY: test-only
-## Run tests only without cleanup
+# test-only: Run tests only without cleanup
 test-only:
 PYTHONHASHSEED=random \
 py.test -x -vv -r xw -p no:sugar \
 --cov-report=term-missing --cov-report=html \
 --cov=rhodecode rhodecode

+# >>> Docs commands

 .PHONY: docs
-## build docs
+# docs: build docs
 docs:
 (cd docs; docker run --rm -v $(PWD):/project --workdir=/project/docs sphinx-doc-build-rc make clean html SPHINXOPTS="-W")


 .PHONY: docs-clean
-## Cleanup docs
+# docs-clean: Cleanup docs
 docs-clean:
 (cd docs; docker run --rm -v $(PWD):/project --workdir=/project/docs sphinx-doc-build-rc make clean)


 .PHONY: docs-cleanup
-## Cleanup docs
+# docs-cleanup: Cleanup docs
 docs-cleanup:
 (cd docs; docker run --rm -v $(PWD):/project --workdir=/project/docs sphinx-doc-build-rc make cleanup)

+# >>> Dev commands

 .PHONY: web-build
-## Build JS packages static/js
+# web-build: Build JS packages static/js
 web-build:
 rm -rf node_modules
 docker run -it --rm -v $(PWD):/project --workdir=/project rhodecode/static-files-build:16 -c "npm install && /project/node_modules/.bin/grunt"
 # run static file check
 ./rhodecode/tests/scripts/static-file-check.sh rhodecode/public/
 rm -rf node_modules

-.PHONY: ruff-check
-## run a ruff analysis
-ruff-check:
-ruff check --ignore F401 --ignore I001 --ignore E402 --ignore E501 --ignore F841 --exclude rhodecode/lib/dbmigrate --exclude .eggs --exclude .dev .
-
-.PHONY: pip-packages
-## Show outdated packages
-pip-packages:
-python ${OUTDATED_PACKAGES}
-
-
-.PHONY: build
-## Build sdist/egg
-build:
-python -m build
-
-
+
 .PHONY: dev-sh
-## make dev-sh
+# dev-sh: make dev-sh
 dev-sh:
 sudo echo "deb [trusted=yes] https://apt.fury.io/rsteube/ /" | sudo tee -a "/etc/apt/sources.list.d/fury.list"
 sudo apt-get update
 sudo apt-get install -y zsh carapace-bin
 rm -rf /home/rhodecode/.oh-my-zsh
 curl https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh | sh
 @echo "source <(carapace _carapace)" > /home/rhodecode/.zsrc
 @echo "${RC_DEV_CMD_HELP}"
 @PROMPT='%(?.%F{green}√.%F{red}?%?)%f %B%F{240}%1~%f%b %# ' zsh


 .PHONY: dev-cleanup
-## Cleanup: pip freeze | grep -v "^-e" | grep -v "@" | xargs pip uninstall -y
+# dev-cleanup: Cleanup: pip freeze | grep -v "^-e" | grep -v "@" | xargs pip uninstall -y
 dev-cleanup:
 pip freeze | grep -v "^-e" | grep -v "@" | xargs pip uninstall -y
 rm -rf /tmp/*


 .PHONY: dev-env
-## make dev-env based on the requirements files and install develop of packages
+# dev-env: make dev-env based on the requirements files and install develop of packages
 ## Cleanup: pip freeze | grep -v "^-e" | grep -v "@" | xargs pip uninstall -y
 dev-env:
 sudo -u root chown rhodecode:rhodecode /home/rhodecode/.cache/pip/
 pip install build virtualenv
 pushd ../rhodecode-vcsserver/ && make dev-env && popd
 pip wheel --wheel-dir=/home/rhodecode/.cache/pip/wheels -r requirements.txt -r requirements_rc_tools.txt -r requirements_test.txt -r requirements_debug.txt
 pip install --no-index --find-links=/home/rhodecode/.cache/pip/wheels -r requirements.txt -r requirements_rc_tools.txt -r requirements_test.txt -r requirements_debug.txt
 pip install -e .


 .PHONY: sh
-## shortcut for make dev-sh dev-env
+# sh: shortcut for make dev-sh dev-env
 sh:
 make dev-env
 make dev-sh


 ## Allows changes of workers e.g make dev-srv-g workers=2
 workers?=1

 .PHONY: dev-srv
-## run gunicorn web server with reloader, use workers=N to set multiworker mode
+# dev-srv: run gunicorn web server with reloader, use workers=N to set multiworker mode, workers=N allows changes of workers
 dev-srv:
 gunicorn --paste=.dev/dev.ini --bind=0.0.0.0:10020 --config=.dev/gunicorn_config.py --timeout=120 --reload --workers=$(workers)

-
-# Default command on calling make
-.DEFAULT_GOAL := show-help
+.PHONY: ruff-check
+# ruff-check: run a ruff analysis
+ruff-check:
+ruff check --ignore F401 --ignore I001 --ignore E402 --ignore E501 --ignore F841 --exclude rhodecode/lib/dbmigrate --exclude .eggs --exclude .dev .

-.PHONY: show-help
-show-help:
-@echo "$$(tput bold)Available rules:$$(tput sgr0)"
-@echo
-@sed -n -e "/^## / { \
-h; \
-s/.*//; \
-:doc" \
--e "H; \
-n; \
-s/^## //; \
-t doc" \
--e "s/:.*//; \
-G; \
-s/\\n## /---/; \
-s/\\n/ /g; \
-p; \
-}" ${MAKEFILE_LIST} \
-| LC_ALL='C' sort --ignore-case \
-| awk -F '---' \
--v ncol=$$(tput cols) \
--v indent=19 \
--v col_on="$$(tput setaf 6)" \
--v col_off="$$(tput sgr0)" \
-'{ \
-printf "%s%*s%s ", col_on, -indent, $$1, col_off; \
-n = split($$2, words, " "); \
-line_length = ncol - indent; \
-for (i = 1; i <= n; i++) { \
-line_length -= length(words[i]) + 1; \
-if (line_length <= 0) { \
-line_length = ncol - indent - length(words[i]) - 1; \
-printf "\n%*s ", -indent, " "; \
-} \
-printf "%s ", words[i]; \
-} \
-printf "\n"; \
-}'
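The Makefile rewrite above removes the old sed/awk show-help machinery along with the pip-packages and build targets, moves ruff-check to the end, and builds the make help output from two comment conventions: a '# >>> Section name' comment starts a section, and a '# target: description' comment documents a target. A rough Python equivalent of that parsing, shown only as an illustrative sketch (the committed Makefile does this with grep, sed and awk as in the hunk above):

    # Illustrative sketch only -- a rough Python equivalent of the awk-based
    # help builder added in the Makefile above. It assumes the same comment
    # conventions: '# >>> Section name' and '# target: description'.
    import re

    sample_lines = [
        "# >>> Tests commands",
        "# clean: Cleanup compiled and cache py files",
        "# test: run test-clean and tests",
        "# >>> Docs commands",
        "# docs: build docs",
    ]

    for raw in sample_lines:
        if raw.startswith("# >>>"):
            # section header, e.g. '# >>> Tests commands'
            print("\n>>> " + raw[len("# >>>"):].strip())
            continue
        match = re.match(r"^# ([A-Za-z0-9_.-]+): (.*)$", raw)
        if match:
            target, description = match.groups()
            print(f"  - {target:<35} {description or '-'}")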
@@ -1,856 +1,912 b''
1
1
2 ; #########################################
2 ; #########################################
3 ; RHODECODE COMMUNITY EDITION CONFIGURATION
3 ; RHODECODE COMMUNITY EDITION CONFIGURATION
4 ; #########################################
4 ; #########################################
5
5
6 [DEFAULT]
6 [DEFAULT]
7 ; Debug flag sets all loggers to debug, and enables request tracking
7 ; Debug flag sets all loggers to debug, and enables request tracking
8 debug = true
8 debug = true
9
9
10 ; ########################################################################
10 ; ########################################################################
11 ; EMAIL CONFIGURATION
11 ; EMAIL CONFIGURATION
12 ; These settings will be used by the RhodeCode mailing system
12 ; These settings will be used by the RhodeCode mailing system
13 ; ########################################################################
13 ; ########################################################################
14
14
15 ; prefix all emails subjects with given prefix, helps filtering out emails
15 ; prefix all emails subjects with given prefix, helps filtering out emails
16 #email_prefix = [RhodeCode]
16 #email_prefix = [RhodeCode]
17
17
18 ; email FROM address all mails will be sent
18 ; email FROM address all mails will be sent
19 #app_email_from = rhodecode-noreply@localhost
19 #app_email_from = rhodecode-noreply@localhost
20
20
21 #smtp_server = mail.server.com
21 #smtp_server = mail.server.com
22 #smtp_username =
22 #smtp_username =
23 #smtp_password =
23 #smtp_password =
24 #smtp_port =
24 #smtp_port =
25 #smtp_use_tls = false
25 #smtp_use_tls = false
26 #smtp_use_ssl = true
26 #smtp_use_ssl = true
27
27
28 [server:main]
28 [server:main]
29 ; COMMON HOST/IP CONFIG, This applies mostly to develop setup,
29 ; COMMON HOST/IP CONFIG, This applies mostly to develop setup,
30 ; Host port for gunicorn are controlled by gunicorn_conf.py
30 ; Host port for gunicorn are controlled by gunicorn_conf.py
31 host = 127.0.0.1
31 host = 127.0.0.1
32 port = 10020
32 port = 10020
33
33
34
34
35 ; ###########################
35 ; ###########################
36 ; GUNICORN APPLICATION SERVER
36 ; GUNICORN APPLICATION SERVER
37 ; ###########################
37 ; ###########################
38
38
39 ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini
39 ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini
40
40
41 ; Module to use, this setting shouldn't be changed
41 ; Module to use, this setting shouldn't be changed
42 use = egg:gunicorn#main
42 use = egg:gunicorn#main
43
43
44 ; Prefix middleware for RhodeCode.
44 ; Prefix middleware for RhodeCode.
45 ; recommended when using proxy setup.
45 ; recommended when using proxy setup.
46 ; allows to set RhodeCode under a prefix in server.
46 ; allows to set RhodeCode under a prefix in server.
47 ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
47 ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
48 ; And set your prefix like: `prefix = /custom_prefix`
48 ; And set your prefix like: `prefix = /custom_prefix`
49 ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need
49 ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need
50 ; to make your cookies only work on prefix url
50 ; to make your cookies only work on prefix url
51 [filter:proxy-prefix]
51 [filter:proxy-prefix]
52 use = egg:PasteDeploy#prefix
52 use = egg:PasteDeploy#prefix
53 prefix = /
53 prefix = /
54
54
55 [app:main]
55 [app:main]
56 ; The %(here)s variable will be replaced with the absolute path of parent directory
56 ; The %(here)s variable will be replaced with the absolute path of parent directory
57 ; of this file
57 ; of this file
58 ; Each option in the app:main can be override by an environmental variable
58 ; Each option in the app:main can be override by an environmental variable
59 ;
59 ;
60 ;To override an option:
60 ;To override an option:
61 ;
61 ;
62 ;RC_<KeyName>
62 ;RC_<KeyName>
63 ;Everything should be uppercase, . and - should be replaced by _.
63 ;Everything should be uppercase, . and - should be replaced by _.
64 ;For example, if you have these configuration settings:
64 ;For example, if you have these configuration settings:
65 ;rc_cache.repo_object.backend = foo
65 ;rc_cache.repo_object.backend = foo
66 ;can be overridden by
66 ;can be overridden by
67 ;export RC_CACHE_REPO_OBJECT_BACKEND=foo
67 ;export RC_CACHE_REPO_OBJECT_BACKEND=foo
68
68
69 use = egg:rhodecode-enterprise-ce
69 use = egg:rhodecode-enterprise-ce
70
70
71 ; enable proxy prefix middleware, defined above
71 ; enable proxy prefix middleware, defined above
72 #filter-with = proxy-prefix
72 #filter-with = proxy-prefix
73
73
74 ; #############
74 ; #############
75 ; DEBUG OPTIONS
75 ; DEBUG OPTIONS
76 ; #############
76 ; #############
77
77
78 pyramid.reload_templates = true
78 pyramid.reload_templates = true
79
79
80 # During development the we want to have the debug toolbar enabled
80 # During development the we want to have the debug toolbar enabled
81 pyramid.includes =
81 pyramid.includes =
82 pyramid_debugtoolbar
82 pyramid_debugtoolbar
83
83
84 debugtoolbar.hosts = 0.0.0.0/0
84 debugtoolbar.hosts = 0.0.0.0/0
85 debugtoolbar.exclude_prefixes =
85 debugtoolbar.exclude_prefixes =
86 /css
86 /css
87 /fonts
87 /fonts
88 /images
88 /images
89 /js
89 /js
90
90
91 ## RHODECODE PLUGINS ##
91 ## RHODECODE PLUGINS ##
92 rhodecode.includes =
92 rhodecode.includes =
93 rhodecode.api
93 rhodecode.api
94
94
95
95
96 # api prefix url
96 # api prefix url
97 rhodecode.api.url = /_admin/api
97 rhodecode.api.url = /_admin/api
98
98
99 ; enable debug style page
99 ; enable debug style page
100 debug_style = true
100 debug_style = true
101
101
102 ; #################
102 ; #################
103 ; END DEBUG OPTIONS
103 ; END DEBUG OPTIONS
104 ; #################
104 ; #################
105
105
106 ; encryption key used to encrypt social plugin tokens,
106 ; encryption key used to encrypt social plugin tokens,
107 ; remote_urls with credentials etc, if not set it defaults to
107 ; remote_urls with credentials etc, if not set it defaults to
108 ; `beaker.session.secret`
108 ; `beaker.session.secret`
109 #rhodecode.encrypted_values.secret =
109 #rhodecode.encrypted_values.secret =
110
110
111 ; decryption strict mode (enabled by default). It controls if decryption raises
111 ; decryption strict mode (enabled by default). It controls if decryption raises
112 ; `SignatureVerificationError` in case of wrong key, or damaged encryption data.
112 ; `SignatureVerificationError` in case of wrong key, or damaged encryption data.
113 #rhodecode.encrypted_values.strict = false
113 #rhodecode.encrypted_values.strict = false
114
114
115 ; Pick algorithm for encryption. Either fernet (more secure) or aes (default)
115 ; Pick algorithm for encryption. Either fernet (more secure) or aes (default)
116 ; fernet is safer, and we strongly recommend switching to it.
116 ; fernet is safer, and we strongly recommend switching to it.
117 ; Due to backward compatibility aes is used as default.
117 ; Due to backward compatibility aes is used as default.
118 #rhodecode.encrypted_values.algorithm = fernet
118 #rhodecode.encrypted_values.algorithm = fernet
119
119
120 ; Return gzipped responses from RhodeCode (static files/application)
120 ; Return gzipped responses from RhodeCode (static files/application)
121 gzip_responses = false
121 gzip_responses = false
122
122
123 ; Auto-generate javascript routes file on startup
123 ; Auto-generate javascript routes file on startup
124 generate_js_files = false
124 generate_js_files = false
125
125
126 ; System global default language.
126 ; System global default language.
127 ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh
127 ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh
128 lang = en
128 lang = en
129
129
130 ; Perform a full repository scan and import on each server start.
130 ; Perform a full repository scan and import on each server start.
131 ; Settings this to true could lead to very long startup time.
131 ; Settings this to true could lead to very long startup time.
132 startup.import_repos = false
132 startup.import_repos = false
133
133
134 ; URL at which the application is running. This is used for Bootstrapping
134 ; URL at which the application is running. This is used for Bootstrapping
135 ; requests in context when no web request is available. Used in ishell, or
135 ; requests in context when no web request is available. Used in ishell, or
136 ; SSH calls. Set this for events to receive proper url for SSH calls.
136 ; SSH calls. Set this for events to receive proper url for SSH calls.
137 app.base_url = http://rhodecode.local
137 app.base_url = http://rhodecode.local
138
138
139 ; Host at which the Service API is running.
139 ; Host at which the Service API is running.
140 app.service_api.host = http://rhodecode.local:10020
140 app.service_api.host = http://rhodecode.local:10020
141
141
142 ; Secret for Service API authentication.
142 ; Secret for Service API authentication.
143 app.service_api.token =
143 app.service_api.token =
144
144
145 ; Unique application ID. Should be a random unique string for security.
145 ; Unique application ID. Should be a random unique string for security.
146 app_instance_uuid = rc-production
146 app_instance_uuid = rc-production
147
147
148 ; Cut off limit for large diffs (size in bytes). If overall diff size on
148 ; Cut off limit for large diffs (size in bytes). If overall diff size on
149 ; commit, or pull request exceeds this limit this diff will be displayed
149 ; commit, or pull request exceeds this limit this diff will be displayed
150 ; partially. E.g 512000 == 512Kb
150 ; partially. E.g 512000 == 512Kb
151 cut_off_limit_diff = 512000
151 cut_off_limit_diff = 512000
152
152
153 ; Cut off limit for large files inside diffs (size in bytes). Each individual
153 ; Cut off limit for large files inside diffs (size in bytes). Each individual
154 ; file inside diff which exceeds this limit will be displayed partially.
154 ; file inside diff which exceeds this limit will be displayed partially.
155 ; E.g 128000 == 128Kb
155 ; E.g 128000 == 128Kb
156 cut_off_limit_file = 128000
156 cut_off_limit_file = 128000
157
157
158 ; Use cached version of vcs repositories everywhere. Recommended to be `true`
158 ; Use cached version of vcs repositories everywhere. Recommended to be `true`
159 vcs_full_cache = true
159 vcs_full_cache = true
160
160
161 ; Force https in RhodeCode, fixes https redirects, assumes it's always https.
161 ; Force https in RhodeCode, fixes https redirects, assumes it's always https.
162 ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache
162 ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache
163 force_https = false
163 force_https = false
164
164
165 ; use Strict-Transport-Security headers
165 ; use Strict-Transport-Security headers
166 use_htsts = false
166 use_htsts = false
167
167
168 ; Set to true if your repos are exposed using the dumb protocol
168 ; Set to true if your repos are exposed using the dumb protocol
169 git_update_server_info = false
169 git_update_server_info = false
170
170
171 ; RSS/ATOM feed options
171 ; RSS/ATOM feed options
172 rss_cut_off_limit = 256000
172 rss_cut_off_limit = 256000
173 rss_items_per_page = 10
173 rss_items_per_page = 10
174 rss_include_diff = false
174 rss_include_diff = false
175
175
176 ; gist URL alias, used to create nicer urls for gist. This should be an
176 ; gist URL alias, used to create nicer urls for gist. This should be an
177 ; url that does rewrites to _admin/gists/{gistid}.
177 ; url that does rewrites to _admin/gists/{gistid}.
178 ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
178 ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
179 ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
179 ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
180 gist_alias_url =
180 gist_alias_url =
181
181
182 ; List of views (using glob pattern syntax) that AUTH TOKENS could be
182 ; List of views (using glob pattern syntax) that AUTH TOKENS could be
183 ; used for access.
183 ; used for access.
184 ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
184 ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
185 ; came from the the logged in user who own this authentication token.
185 ; came from the the logged in user who own this authentication token.
186 ; Additionally @TOKEN syntax can be used to bound the view to specific
186 ; Additionally @TOKEN syntax can be used to bound the view to specific
187 ; authentication token. Such view would be only accessible when used together
187 ; authentication token. Such view would be only accessible when used together
188 ; with this authentication token
188 ; with this authentication token
189 ; list of all views can be found under `/_admin/permissions/auth_token_access`
189 ; list of all views can be found under `/_admin/permissions/auth_token_access`
190 ; The list should be "," separated and on a single line.
190 ; The list should be "," separated and on a single line.
191 ; Most common views to enable:
191 ; Most common views to enable:
192
192
193 # RepoCommitsView:repo_commit_download
193 # RepoCommitsView:repo_commit_download
194 # RepoCommitsView:repo_commit_patch
194 # RepoCommitsView:repo_commit_patch
195 # RepoCommitsView:repo_commit_raw
195 # RepoCommitsView:repo_commit_raw
196 # RepoCommitsView:repo_commit_raw@TOKEN
196 # RepoCommitsView:repo_commit_raw@TOKEN
197 # RepoFilesView:repo_files_diff
197 # RepoFilesView:repo_files_diff
198 # RepoFilesView:repo_archivefile
198 # RepoFilesView:repo_archivefile
199 # RepoFilesView:repo_file_raw
199 # RepoFilesView:repo_file_raw
200 # GistView:*
200 # GistView:*
201 api_access_controllers_whitelist =
201 api_access_controllers_whitelist =
202
202
203 ; Default encoding used to convert from and to unicode
203 ; Default encoding used to convert from and to unicode
204 ; can be also a comma separated list of encoding in case of mixed encodings
204 ; can be also a comma separated list of encoding in case of mixed encodings
205 default_encoding = UTF-8
205 default_encoding = UTF-8
206
206
207 ; instance-id prefix
207 ; instance-id prefix
208 ; a prefix key for this instance used for cache invalidation when running
208 ; a prefix key for this instance used for cache invalidation when running
209 ; multiple instances of RhodeCode, make sure it's globally unique for
209 ; multiple instances of RhodeCode, make sure it's globally unique for
210 ; all running RhodeCode instances. Leave empty if you don't use it
210 ; all running RhodeCode instances. Leave empty if you don't use it
211 instance_id =
211 instance_id =
212
212
213 ; Fallback authentication plugin. Set this to a plugin ID to force the usage
213 ; Fallback authentication plugin. Set this to a plugin ID to force the usage
214 ; of an authentication plugin also if it is disabled by it's settings.
214 ; of an authentication plugin also if it is disabled by it's settings.
215 ; This could be useful if you are unable to log in to the system due to broken
215 ; This could be useful if you are unable to log in to the system due to broken
216 ; authentication settings. Then you can enable e.g. the internal RhodeCode auth
216 ; authentication settings. Then you can enable e.g. the internal RhodeCode auth
217 ; module to log in again and fix the settings.
217 ; module to log in again and fix the settings.
218 ; Available builtin plugin IDs (hash is part of the ID):
218 ; Available builtin plugin IDs (hash is part of the ID):
219 ; egg:rhodecode-enterprise-ce#rhodecode
219 ; egg:rhodecode-enterprise-ce#rhodecode
220 ; egg:rhodecode-enterprise-ce#pam
220 ; egg:rhodecode-enterprise-ce#pam
221 ; egg:rhodecode-enterprise-ce#ldap
221 ; egg:rhodecode-enterprise-ce#ldap
222 ; egg:rhodecode-enterprise-ce#jasig_cas
222 ; egg:rhodecode-enterprise-ce#jasig_cas
223 ; egg:rhodecode-enterprise-ce#headers
223 ; egg:rhodecode-enterprise-ce#headers
224 ; egg:rhodecode-enterprise-ce#crowd
224 ; egg:rhodecode-enterprise-ce#crowd
225
225
226 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
226 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
227
227
228 ; Flag to control loading of legacy plugins in py:/path format
228 ; Flag to control loading of legacy plugins in py:/path format
229 auth_plugin.import_legacy_plugins = true
229 auth_plugin.import_legacy_plugins = true
230
230
231 ; alternative return HTTP header for failed authentication. Default HTTP
231 ; alternative return HTTP header for failed authentication. Default HTTP
232 ; response is 401 HTTPUnauthorized. Currently HG clients have troubles with
232 ; response is 401 HTTPUnauthorized. Currently HG clients have troubles with
233 ; handling that causing a series of failed authentication calls.
233 ; handling that causing a series of failed authentication calls.
234 ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
234 ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
235 ; This will be served instead of default 401 on bad authentication
235 ; This will be served instead of default 401 on bad authentication
236 auth_ret_code =
236 auth_ret_code =
237
237
238 ; use special detection method when serving auth_ret_code, instead of serving
238 ; use special detection method when serving auth_ret_code, instead of serving
239 ; ret_code directly, use 401 initially (Which triggers credentials prompt)
239 ; ret_code directly, use 401 initially (Which triggers credentials prompt)
240 ; and then serve auth_ret_code to clients
240 ; and then serve auth_ret_code to clients
241 auth_ret_code_detection = false
241 auth_ret_code_detection = false
242
242
243 ; locking return code. When repository is locked return this HTTP code. 2XX
243 ; locking return code. When repository is locked return this HTTP code. 2XX
244 ; codes don't break the transactions while 4XX codes do
244 ; codes don't break the transactions while 4XX codes do
245 lock_ret_code = 423
245 lock_ret_code = 423
246
246
247 ; Filesystem location were repositories should be stored
247 ; Filesystem location were repositories should be stored
248 repo_store.path = /var/opt/rhodecode_repo_store
248 repo_store.path = /var/opt/rhodecode_repo_store
249
249
250 ; allows to setup custom hooks in settings page
250 ; allows to setup custom hooks in settings page
251 allow_custom_hooks_settings = true
251 allow_custom_hooks_settings = true
252
252
253 ; Generated license token required for EE edition license.
253 ; Generated license token required for EE edition license.
254 ; New generated token value can be found in Admin > settings > license page.
254 ; New generated token value can be found in Admin > settings > license page.
255 license_token =
255 license_token =
256
256
257 ; This flag hides sensitive information on the license page such as token, and license data
257 ; This flag hides sensitive information on the license page such as token, and license data
258 license.hide_license_info = false
258 license.hide_license_info = false
259
259
260 ; Import EE license from this license path
261 #license.import_path = %(here)s/rhodecode_enterprise.license
262
263 ; import license 'if-missing' or 'force' (always override)
264 ; if-missing means apply license if it doesn't exist. 'force' option always overrides it
265 license.import_path_mode = if-missing
266
260 ; supervisor connection uri, for managing supervisor and logs.
267 ; supervisor connection uri, for managing supervisor and logs.
261 supervisor.uri =
268 supervisor.uri =
262
269
263 ; supervisord group name/id we only want this RC instance to handle
270 ; supervisord group name/id we only want this RC instance to handle
264 supervisor.group_id = dev
271 supervisor.group_id = dev
265
272
266 ; Display extended labs settings
273 ; Display extended labs settings
267 labs_settings_active = true
274 labs_settings_active = true
268
275
269 ; Custom exception store path, defaults to TMPDIR
276 ; Custom exception store path, defaults to TMPDIR
270 ; This is used to store exception from RhodeCode in shared directory
277 ; This is used to store exception from RhodeCode in shared directory
271 #exception_tracker.store_path =
278 #exception_tracker.store_path =
272
279
273 ; Send email with exception details when it happens
280 ; Send email with exception details when it happens
274 #exception_tracker.send_email = false
281 #exception_tracker.send_email = false
275
282
276 ; Comma separated list of recipients for exception emails,
283 ; Comma separated list of recipients for exception emails,
277 ; e.g admin@rhodecode.com,devops@rhodecode.com
284 ; e.g admin@rhodecode.com,devops@rhodecode.com
278 ; Can be left empty, then emails will be sent to ALL super-admins
285 ; Can be left empty, then emails will be sent to ALL super-admins
279 #exception_tracker.send_email_recipients =
286 #exception_tracker.send_email_recipients =
280
287
281 ; optional prefix to Add to email Subject
288 ; optional prefix to Add to email Subject
282 #exception_tracker.email_prefix = [RHODECODE ERROR]
289 #exception_tracker.email_prefix = [RHODECODE ERROR]
283
290
284 ; File store configuration. This is used to store and serve uploaded files
291 ; NOTE: this setting IS DEPRECATED:
285 file_store.enabled = true
292 ; file_store backend is always enabled
293 #file_store.enabled = true
286
294
295 ; NOTE: this setting IS DEPRECATED:
296 ; file_store.backend = X -> use `file_store.backend.type = filesystem_v2` instead
287 ; Storage backend, available options are: local
297 ; Storage backend, available options are: local
288 file_store.backend = local
298 #file_store.backend = local
289
299
300 ; NOTE: this setting IS DEPRECATED:
301 ; file_store.storage_path = X -> use `file_store.filesystem_v2.storage_path = X` instead
290 ; path to store the uploaded binaries and artifacts
302 ; path to store the uploaded binaries and artifacts
291 file_store.storage_path = /var/opt/rhodecode_data/file_store
303 #file_store.storage_path = /var/opt/rhodecode_data/file_store
304
305 ; Artifacts file-store, is used to store comment attachments and artifacts uploads.
306 ; file_store backend type: filesystem_v1, filesystem_v2 or objectstore (s3-based) are available as options
307 ; filesystem_v1 is backwards compat with pre 5.1 storage changes
308 ; new installations should choose filesystem_v2 or objectstore (s3-based), pick filesystem when migrating from
309 ; previous installations to keep the artifacts without a need of migration
310 #file_store.backend.type = filesystem_v2
311
312 ; filesystem options...
313 #file_store.filesystem_v1.storage_path = /var/opt/rhodecode_data/artifacts_file_store
314
315 ; filesystem_v2 options...
316 #file_store.filesystem_v2.storage_path = /var/opt/rhodecode_data/artifacts_file_store
317 #file_store.filesystem_v2.shards = 8
292
318
319 ; objectstore options...
320 ; url for s3 compatible storage that allows to upload artifacts
321 ; e.g http://minio:9000
322 #file_store.backend.type = objectstore
323 #file_store.objectstore.url = http://s3-minio:9000
324
325 ; a top-level bucket to put all other shards in
326 ; objects will be stored in rhodecode-file-store/shard-N based on the bucket_shards number
327 #file_store.objectstore.bucket = rhodecode-file-store
328
329 ; number of sharded buckets to create to distribute archives across
330 ; default is 8 shards
331 #file_store.objectstore.bucket_shards = 8
332
333 ; key for s3 auth
334 #file_store.objectstore.key = s3admin
335
336 ; secret for s3 auth
337 #file_store.objectstore.secret = s3secret4
338
339 ;region for s3 storage
340 #file_store.objectstore.region = eu-central-1
293
341
294 ; Redis url to acquire/check generation of archives locks
342 ; Redis url to acquire/check generation of archives locks
295 archive_cache.locking.url = redis://redis:6379/1
343 archive_cache.locking.url = redis://redis:6379/1
296
344
297 ; Storage backend, only 'filesystem' and 'objectstore' are available now
345 ; Storage backend, only 'filesystem' and 'objectstore' are available now
298 archive_cache.backend.type = filesystem
346 archive_cache.backend.type = filesystem
299
347
300 ; url for s3 compatible storage that allows to upload artifacts
348 ; url for s3 compatible storage that allows to upload artifacts
301 ; e.g http://minio:9000
349 ; e.g http://minio:9000
302 archive_cache.objectstore.url = http://s3-minio:9000
350 archive_cache.objectstore.url = http://s3-minio:9000
303
351
304 ; key for s3 auth
352 ; key for s3 auth
305 archive_cache.objectstore.key = key
353 archive_cache.objectstore.key = key
306
354
307 ; secret for s3 auth
355 ; secret for s3 auth
308 archive_cache.objectstore.secret = secret
356 archive_cache.objectstore.secret = secret
309
357
310 ;region for s3 storage
358 ;region for s3 storage
311 archive_cache.objectstore.region = eu-central-1
359 archive_cache.objectstore.region = eu-central-1
312
360
313 ; number of sharded buckets to create to distribute archives across
361 ; number of sharded buckets to create to distribute archives across
314 ; default is 8 shards
362 ; default is 8 shards
315 archive_cache.objectstore.bucket_shards = 8
363 archive_cache.objectstore.bucket_shards = 8
316
364
317 ; a top-level bucket to put all other shards in
365 ; a top-level bucket to put all other shards in
318 ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number
366 ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number
319 archive_cache.objectstore.bucket = rhodecode-archive-cache
367 archive_cache.objectstore.bucket = rhodecode-archive-cache
320
368
321 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
369 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
322 archive_cache.objectstore.retry = false
370 archive_cache.objectstore.retry = false
323
371
324 ; number of seconds to wait for next try using retry
372 ; number of seconds to wait for next try using retry
325 archive_cache.objectstore.retry_backoff = 1
373 archive_cache.objectstore.retry_backoff = 1
326
374
327 ; how many tries do do a retry fetch from this backend
375 ; how many tries do do a retry fetch from this backend
328 archive_cache.objectstore.retry_attempts = 10
376 archive_cache.objectstore.retry_attempts = 10
329
377
330 ; Default is $cache_dir/archive_cache if not set
378 ; Default is $cache_dir/archive_cache if not set
331 ; Generated repo archives will be cached at this location
379 ; Generated repo archives will be cached at this location
332 ; and served from the cache during subsequent requests for the same archive of
380 ; and served from the cache during subsequent requests for the same archive of
333 ; the repository. This path is important to be shared across filesystems and with
381 ; the repository. This path is important to be shared across filesystems and with
334 ; RhodeCode and vcsserver
382 ; RhodeCode and vcsserver
335 archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache
383 archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache
336
384
337 ; The limit in GB sets how much data we cache before recycling last used, defaults to 10 gb
385 ; The limit in GB sets how much data we cache before recycling last used, defaults to 10 gb
338 archive_cache.filesystem.cache_size_gb = 1
386 archive_cache.filesystem.cache_size_gb = 1
339
387
340 ; Eviction policy used to clear out after cache_size_gb limit is reached
388 ; Eviction policy used to clear out after cache_size_gb limit is reached
341 archive_cache.filesystem.eviction_policy = least-recently-stored
389 archive_cache.filesystem.eviction_policy = least-recently-stored
342
390
343 ; By default cache uses sharding technique, this specifies how many shards are there
391 ; By default cache uses sharding technique, this specifies how many shards are there
344 ; default is 8 shards
392 ; default is 8 shards
345 archive_cache.filesystem.cache_shards = 8
393 archive_cache.filesystem.cache_shards = 8
346
394
347 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
395 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
348 archive_cache.filesystem.retry = false
396 archive_cache.filesystem.retry = false
349
397
350 ; number of seconds to wait for next try using retry
398 ; number of seconds to wait for next try using retry
351 archive_cache.filesystem.retry_backoff = 1
399 archive_cache.filesystem.retry_backoff = 1
352
400
353 ; how many tries do do a retry fetch from this backend
401 ; how many tries do do a retry fetch from this backend
354 archive_cache.filesystem.retry_attempts = 10
402 archive_cache.filesystem.retry_attempts = 10
355
403
356
404
357 ; #############
405 ; #############
358 ; CELERY CONFIG
406 ; CELERY CONFIG
359 ; #############
407 ; #############
360
408
361 ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini
409 ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini
362
410
363 use_celery = true
411 use_celery = true
364
412
365 ; path to store schedule database
413 ; path to store schedule database
366 #celerybeat-schedule.path =
414 #celerybeat-schedule.path =
367
415
368 ; connection url to the message broker (default redis)
416 ; connection url to the message broker (default redis)
369 celery.broker_url = redis://redis:6379/8
417 celery.broker_url = redis://redis:6379/8
370
418
371 ; results backend to get results for (default redis)
419 ; results backend to get results for (default redis)
372 celery.result_backend = redis://redis:6379/8
420 celery.result_backend = redis://redis:6379/8
373
421
374 ; rabbitmq example
422 ; rabbitmq example
375 #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost
423 #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost
376
424
377 ; maximum tasks to execute before worker restart
425 ; maximum tasks to execute before worker restart
378 celery.max_tasks_per_child = 20
426 celery.max_tasks_per_child = 20
379
427
380 ; tasks will never be sent to the queue, but executed locally instead.
428 ; tasks will never be sent to the queue, but executed locally instead.
381 celery.task_always_eager = false
429 celery.task_always_eager = false
382
430
383 ; #############
431 ; #############
384 ; DOGPILE CACHE
432 ; DOGPILE CACHE
385 ; #############
433 ; #############
386
434
387 ; Default cache dir for caches. Putting this into a ramdisk can boost performance.
435 ; Default cache dir for caches. Putting this into a ramdisk can boost performance.
388 ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space
436 ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space
389 cache_dir = /var/opt/rhodecode_data
437 cache_dir = /var/opt/rhodecode_data
390
438
391 ; *********************************************
439 ; *********************************************
392 ; `sql_cache_short` cache for heavy SQL queries
440 ; `sql_cache_short` cache for heavy SQL queries
393 ; Only supported backend is `memory_lru`
441 ; Only supported backend is `memory_lru`
394 ; *********************************************
442 ; *********************************************
395 rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
443 rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
396 rc_cache.sql_cache_short.expiration_time = 30
444 rc_cache.sql_cache_short.expiration_time = 30
397
445
398
446
399 ; *****************************************************
447 ; *****************************************************
400 ; `cache_repo_longterm` cache for repo object instances
448 ; `cache_repo_longterm` cache for repo object instances
401 ; Only supported backend is `memory_lru`
449 ; Only supported backend is `memory_lru`
402 ; *****************************************************
450 ; *****************************************************
403 rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
451 rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
404 ; by default we use 30 Days, cache is still invalidated on push
452 ; by default we use 30 Days, cache is still invalidated on push
405 rc_cache.cache_repo_longterm.expiration_time = 2592000
453 rc_cache.cache_repo_longterm.expiration_time = 2592000
406 ; max items in LRU cache, set to smaller number to save memory, and expire last used caches
454 ; max items in LRU cache, set to smaller number to save memory, and expire last used caches
407 rc_cache.cache_repo_longterm.max_size = 10000
455 rc_cache.cache_repo_longterm.max_size = 10000
408
456
409
457
410 ; *********************************************
458 ; *********************************************
411 ; `cache_general` cache for general purpose use
459 ; `cache_general` cache for general purpose use
412 ; for simplicity use rc.file_namespace backend,
460 ; for simplicity use rc.file_namespace backend,
413 ; for performance and scale use rc.redis
461 ; for performance and scale use rc.redis
414 ; *********************************************
462 ; *********************************************
415 rc_cache.cache_general.backend = dogpile.cache.rc.file_namespace
463 rc_cache.cache_general.backend = dogpile.cache.rc.file_namespace
416 rc_cache.cache_general.expiration_time = 43200
464 rc_cache.cache_general.expiration_time = 43200
417 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
465 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
418 #rc_cache.cache_general.arguments.filename = /tmp/cache_general_db
466 #rc_cache.cache_general.arguments.filename = /tmp/cache_general_db
419
467
420 ; alternative `cache_general` redis backend with distributed lock
468 ; alternative `cache_general` redis backend with distributed lock
421 #rc_cache.cache_general.backend = dogpile.cache.rc.redis
469 #rc_cache.cache_general.backend = dogpile.cache.rc.redis
422 #rc_cache.cache_general.expiration_time = 300
470 #rc_cache.cache_general.expiration_time = 300
423
471
424 ; redis_expiration_time needs to be greater then expiration_time
472 ; redis_expiration_time needs to be greater then expiration_time
425 #rc_cache.cache_general.arguments.redis_expiration_time = 7200
473 #rc_cache.cache_general.arguments.redis_expiration_time = 7200
426
474
427 #rc_cache.cache_general.arguments.host = localhost
475 #rc_cache.cache_general.arguments.host = localhost
428 #rc_cache.cache_general.arguments.port = 6379
476 #rc_cache.cache_general.arguments.port = 6379
429 #rc_cache.cache_general.arguments.db = 0
477 #rc_cache.cache_general.arguments.db = 0
430 #rc_cache.cache_general.arguments.socket_timeout = 30
478 #rc_cache.cache_general.arguments.socket_timeout = 30
431 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
479 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
432 #rc_cache.cache_general.arguments.distributed_lock = true
480 #rc_cache.cache_general.arguments.distributed_lock = true
433
481
434 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
482 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
435 #rc_cache.cache_general.arguments.lock_auto_renewal = true
483 #rc_cache.cache_general.arguments.lock_auto_renewal = true
436
484
437 ; *************************************************
485 ; *************************************************
438 ; `cache_perms` cache for permission tree, auth TTL
486 ; `cache_perms` cache for permission tree, auth TTL
439 ; for simplicity use rc.file_namespace backend,
487 ; for simplicity use rc.file_namespace backend,
440 ; for performance and scale use rc.redis
488 ; for performance and scale use rc.redis
441 ; *************************************************
489 ; *************************************************
442 rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
490 rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
443 rc_cache.cache_perms.expiration_time = 3600
491 rc_cache.cache_perms.expiration_time = 3600
444 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
492 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
445 #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms_db
493 #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms_db
446
494
447 ; alternative `cache_perms` redis backend with distributed lock
495 ; alternative `cache_perms` redis backend with distributed lock
448 #rc_cache.cache_perms.backend = dogpile.cache.rc.redis
496 #rc_cache.cache_perms.backend = dogpile.cache.rc.redis
449 #rc_cache.cache_perms.expiration_time = 300
497 #rc_cache.cache_perms.expiration_time = 300
450
498
451 ; redis_expiration_time needs to be greater then expiration_time
499 ; redis_expiration_time needs to be greater then expiration_time
452 #rc_cache.cache_perms.arguments.redis_expiration_time = 7200
500 #rc_cache.cache_perms.arguments.redis_expiration_time = 7200
453
501
454 #rc_cache.cache_perms.arguments.host = localhost
502 #rc_cache.cache_perms.arguments.host = localhost
455 #rc_cache.cache_perms.arguments.port = 6379
503 #rc_cache.cache_perms.arguments.port = 6379
456 #rc_cache.cache_perms.arguments.db = 0
504 #rc_cache.cache_perms.arguments.db = 0
457 #rc_cache.cache_perms.arguments.socket_timeout = 30
505 #rc_cache.cache_perms.arguments.socket_timeout = 30
458 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
506 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
459 #rc_cache.cache_perms.arguments.distributed_lock = true
507 #rc_cache.cache_perms.arguments.distributed_lock = true
460
508
461 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
509 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
462 #rc_cache.cache_perms.arguments.lock_auto_renewal = true
510 #rc_cache.cache_perms.arguments.lock_auto_renewal = true
463
511
464 ; ***************************************************
512 ; ***************************************************
465 ; `cache_repo` cache for file tree, Readme, RSS FEEDS
513 ; `cache_repo` cache for file tree, Readme, RSS FEEDS
466 ; for simplicity use rc.file_namespace backend,
514 ; for simplicity use rc.file_namespace backend,
467 ; for performance and scale use rc.redis
515 ; for performance and scale use rc.redis
468 ; ***************************************************
516 ; ***************************************************
469 rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
517 rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
470 rc_cache.cache_repo.expiration_time = 2592000
518 rc_cache.cache_repo.expiration_time = 2592000
471 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
519 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
472 #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo_db
520 #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo_db
473
521
474 ; alternative `cache_repo` redis backend with distributed lock
522 ; alternative `cache_repo` redis backend with distributed lock
475 #rc_cache.cache_repo.backend = dogpile.cache.rc.redis
523 #rc_cache.cache_repo.backend = dogpile.cache.rc.redis
476 #rc_cache.cache_repo.expiration_time = 2592000
524 #rc_cache.cache_repo.expiration_time = 2592000
477
525
478 ; redis_expiration_time needs to be greater then expiration_time
526 ; redis_expiration_time needs to be greater then expiration_time
479 #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400
527 #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400
480
528
481 #rc_cache.cache_repo.arguments.host = localhost
529 #rc_cache.cache_repo.arguments.host = localhost
482 #rc_cache.cache_repo.arguments.port = 6379
530 #rc_cache.cache_repo.arguments.port = 6379
483 #rc_cache.cache_repo.arguments.db = 1
531 #rc_cache.cache_repo.arguments.db = 1
484 #rc_cache.cache_repo.arguments.socket_timeout = 30
532 #rc_cache.cache_repo.arguments.socket_timeout = 30
485 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
533 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
486 #rc_cache.cache_repo.arguments.distributed_lock = true
534 #rc_cache.cache_repo.arguments.distributed_lock = true
487
535
488 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
536 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
489 #rc_cache.cache_repo.arguments.lock_auto_renewal = true
537 #rc_cache.cache_repo.arguments.lock_auto_renewal = true
490
538
491 ; ##############
539 ; ##############
492 ; BEAKER SESSION
540 ; BEAKER SESSION
493 ; ##############
541 ; ##############
494
542
495 ; beaker.session.type is type of storage options for the logged users sessions. Current allowed
543 ; beaker.session.type is type of storage options for the logged users sessions. Current allowed
496 ; types are file, ext:redis, ext:database, ext:memcached
544 ; types are file, ext:redis, ext:database, ext:memcached
497 ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session
545 ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session
498 #beaker.session.type = file
546 #beaker.session.type = file
499 #beaker.session.data_dir = %(here)s/data/sessions
547 #beaker.session.data_dir = %(here)s/data/sessions
500
548
501 ; Redis based sessions
549 ; Redis based sessions
502 beaker.session.type = ext:redis
550 beaker.session.type = ext:redis
503 beaker.session.url = redis://redis:6379/2
551 beaker.session.url = redis://redis:6379/2
504
552
505 ; DB based session, fast, and allows easy management over logged in users
553 ; DB based session, fast, and allows easy management over logged in users
506 #beaker.session.type = ext:database
554 #beaker.session.type = ext:database
507 #beaker.session.table_name = db_session
555 #beaker.session.table_name = db_session
508 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
556 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
509 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
557 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
510 #beaker.session.sa.pool_recycle = 3600
558 #beaker.session.sa.pool_recycle = 3600
511 #beaker.session.sa.echo = false
559 #beaker.session.sa.echo = false
512
560
513 beaker.session.key = rhodecode
561 beaker.session.key = rhodecode
514 beaker.session.secret = develop-rc-uytcxaz
562 beaker.session.secret = develop-rc-uytcxaz
515 beaker.session.lock_dir = /data_ramdisk/lock
563 beaker.session.lock_dir = /data_ramdisk/lock
516
564
517 ; Secure encrypted cookie. Requires AES and AES python libraries
565 ; Secure encrypted cookie. Requires AES and AES python libraries
518 ; you must disable beaker.session.secret to use this
566 ; you must disable beaker.session.secret to use this
519 #beaker.session.encrypt_key = key_for_encryption
567 #beaker.session.encrypt_key = key_for_encryption
520 #beaker.session.validate_key = validation_key
568 #beaker.session.validate_key = validation_key
521
569
522 ; Sets session as invalid (also logging out user) if it haven not been
570 ; Sets session as invalid (also logging out user) if it haven not been
523 ; accessed for given amount of time in seconds
571 ; accessed for given amount of time in seconds
524 beaker.session.timeout = 2592000
572 beaker.session.timeout = 2592000
525 beaker.session.httponly = true
573 beaker.session.httponly = true
526
574
527 ; Path to use for the cookie. Set to prefix if you use prefix middleware
575 ; Path to use for the cookie. Set to prefix if you use prefix middleware
528 #beaker.session.cookie_path = /custom_prefix
576 #beaker.session.cookie_path = /custom_prefix
529
577
530 ; Set https secure cookie
578 ; Set https secure cookie
531 beaker.session.secure = false
579 beaker.session.secure = false
532
580
533 ; default cookie expiration time in seconds, set to `true` to set expire
581 ; default cookie expiration time in seconds, set to `true` to set expire
534 ; at browser close
582 ; at browser close
535 #beaker.session.cookie_expires = 3600
583 #beaker.session.cookie_expires = 3600
536
584
537 ; #############################
585 ; #############################
538 ; SEARCH INDEXING CONFIGURATION
586 ; SEARCH INDEXING CONFIGURATION
539 ; #############################
587 ; #############################
540
588
541 ; Full text search indexer is available in rhodecode-tools under
589 ; Full text search indexer is available in rhodecode-tools under
542 ; `rhodecode-tools index` command
590 ; `rhodecode-tools index` command
543
591
544 ; WHOOSH Backend, doesn't require additional services to run
592 ; WHOOSH Backend, doesn't require additional services to run
545 ; it works good with few dozen repos
593 ; it works good with few dozen repos
546 search.module = rhodecode.lib.index.whoosh
594 search.module = rhodecode.lib.index.whoosh
547 search.location = %(here)s/data/index
595 search.location = %(here)s/data/index
548
596
549 ; ####################
597 ; ####################
550 ; CHANNELSTREAM CONFIG
598 ; CHANNELSTREAM CONFIG
551 ; ####################
599 ; ####################
552
600
553 ; channelstream enables persistent connections and live notifications
601 ; channelstream enables persistent connections and live notifications
554 ; in the system. It's also used by the chat system
602 ; in the system. It's also used by the chat system
555
603
556 channelstream.enabled = true
604 channelstream.enabled = true
557
605
558 ; server address for channelstream server on the backend
606 ; server address for channelstream server on the backend
559 channelstream.server = channelstream:9800
607 channelstream.server = channelstream:9800
560
608
561 ; location of the channelstream server as seen from the outside world
609 ; location of the channelstream server as seen from the outside world
562 ; use ws:// for http or wss:// for https. This address needs to be handled
610 ; use ws:// for http or wss:// for https. This address needs to be handled
563 ; by an external HTTP server such as Nginx or Apache
611 ; by an external HTTP server such as Nginx or Apache
564 ; see Nginx/Apache configuration examples in our docs
612 ; see Nginx/Apache configuration examples in our docs
565 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
613 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
566 channelstream.secret = ENV_GENERATED
614 channelstream.secret = ENV_GENERATED
567 channelstream.history.location = /var/opt/rhodecode_data/channelstream_history
615 channelstream.history.location = /var/opt/rhodecode_data/channelstream_history
568
616
569 ; Internal application path that JavaScript uses to connect to.
617 ; Internal application path that JavaScript uses to connect to.
570 ; If you use proxy-prefix the prefix should be added before /_channelstream
618 ; If you use proxy-prefix the prefix should be added before /_channelstream
571 channelstream.proxy_path = /_channelstream
619 channelstream.proxy_path = /_channelstream
572
620
573
621
574 ; ##############################
622 ; ##############################
575 ; MAIN RHODECODE DATABASE CONFIG
623 ; MAIN RHODECODE DATABASE CONFIG
576 ; ##############################
624 ; ##############################
577
625
578 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
626 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
579 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
627 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
580 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
628 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
581 ; pymysql is an alternative driver for MySQL, use in case of problems with default one
629 ; pymysql is an alternative driver for MySQL, use in case of problems with default one
582 #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode
630 #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode
583
631
584 sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
632 sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
585
633
586 ; see sqlalchemy docs for other advanced settings
634 ; see sqlalchemy docs for other advanced settings
587 ; print the sql statements to output
635 ; print the sql statements to output
588 sqlalchemy.db1.echo = false
636 sqlalchemy.db1.echo = false
589
637
590 ; recycle the connections after this amount of seconds
638 ; recycle the connections after this amount of seconds
591 sqlalchemy.db1.pool_recycle = 3600
639 sqlalchemy.db1.pool_recycle = 3600
592
640
593 ; the number of connections to keep open inside the connection pool.
641 ; the number of connections to keep open inside the connection pool.
594 ; 0 indicates no limit
642 ; 0 indicates no limit
595 ; the general calculation with gevent is:
643 ; the general calculation with gevent is:
596 ; if your system allows 500 concurrent greenlets (max_connections) that all do database access,
644 ; if your system allows 500 concurrent greenlets (max_connections) that all do database access,
597 ; then increase pool size + max overflow so that they add up to 500.
645 ; then increase pool size + max overflow so that they add up to 500.
598 #sqlalchemy.db1.pool_size = 5
646 #sqlalchemy.db1.pool_size = 5
599
647
600 ; The number of connections to allow in connection pool "overflow", that is
648 ; The number of connections to allow in connection pool "overflow", that is
601 ; connections that can be opened above and beyond the pool_size setting,
649 ; connections that can be opened above and beyond the pool_size setting,
602 ; which defaults to five.
650 ; which defaults to five.
603 #sqlalchemy.db1.max_overflow = 10
651 #sqlalchemy.db1.max_overflow = 10
604
652
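A rough sketch (illustrative only) of the gevent sizing rule described above:
pool_size plus max_overflow should add up to the expected number of concurrent
greenlets doing database work. The 80/20 split below is an assumption for the
example, not a RhodeCode default:

    def pool_sizing(target_concurrency, overflow_ratio=0.2):
        # split the target between the steady pool and the overflow allowance
        max_overflow = int(target_concurrency * overflow_ratio)
        return target_concurrency - max_overflow, max_overflow

    assert pool_sizing(500) == (400, 100)   # e.g. 500 concurrent greenlets
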
605 ; Connection check ping, used to detect broken database connections
653 ; Connection check ping, used to detect broken database connections
606 ; can be enabled to better handle "MySQL has gone away" errors
654 ; can be enabled to better handle "MySQL has gone away" errors
607 #sqlalchemy.db1.ping_connection = true
655 #sqlalchemy.db1.ping_connection = true
608
656
609 ; ##########
657 ; ##########
610 ; VCS CONFIG
658 ; VCS CONFIG
611 ; ##########
659 ; ##########
612 vcs.server.enable = true
660 vcs.server.enable = true
613 vcs.server = vcsserver:10010
661 vcs.server = vcsserver:10010
614
662
615 ; Web server connectivity protocol, responsible for web based VCS operations
663 ; Web server connectivity protocol, responsible for web based VCS operations
616 ; Available protocols are:
664 ; Available protocols are:
617 ; `http` - use http-rpc backend (default)
665 ; `http` - use http-rpc backend (default)
618 vcs.server.protocol = http
666 vcs.server.protocol = http
619
667
620 ; Push/Pull operations protocol, available options are:
668 ; Push/Pull operations protocol, available options are:
621 ; `http` - use http-rpc backend (default)
669 ; `http` - use http-rpc backend (default)
622 vcs.scm_app_implementation = http
670 vcs.scm_app_implementation = http
623
671
624 ; Push/Pull operations hooks protocol, available options are:
672 ; Push/Pull operations hooks protocol, available options are:
625 ; `http` - use http-rpc backend (default)
673 ; `http` - use http-rpc backend (default)
626 ; `celery` - use celery based hooks
674 ; `celery` - use celery based hooks
627 vcs.hooks.protocol = http
675 #DEPRECATED:vcs.hooks.protocol = http
676 vcs.hooks.protocol.v2 = celery
628
677
629 ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be
678 ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be
630 ; accessible via network.
679 ; accessible via network.
631 ; Use vcs.hooks.host = "*" to bind to current hostname (for Docker)
680 ; Use vcs.hooks.host = "*" to bind to current hostname (for Docker)
632 vcs.hooks.host = *
681 vcs.hooks.host = *
633
682
634 ; Start VCSServer with this instance as a subprocess, useful for development
683 ; Start VCSServer with this instance as a subprocess, useful for development
635 vcs.start_server = false
684 vcs.start_server = false
636
685
637 ; List of enabled VCS backends, available options are:
686 ; List of enabled VCS backends, available options are:
638 ; `hg` - mercurial
687 ; `hg` - mercurial
639 ; `git` - git
688 ; `git` - git
640 ; `svn` - subversion
689 ; `svn` - subversion
641 vcs.backends = hg, git, svn
690 vcs.backends = hg, git, svn
642
691
643 ; Wait this number of seconds before killing connection to the vcsserver
692 ; Wait this number of seconds before killing connection to the vcsserver
644 vcs.connection_timeout = 3600
693 vcs.connection_timeout = 3600
645
694
646 ; Cache flag to cache vcsserver remote calls locally
695 ; Cache flag to cache vcsserver remote calls locally
647 ; It uses cache_region `cache_repo`
696 ; It uses cache_region `cache_repo`
648 vcs.methods.cache = true
697 vcs.methods.cache = true
649
698
699 ; Filesystem location where Git lfs objects should be stored
700 vcs.git.lfs.storage_location = /var/opt/rhodecode_repo_store/.cache/git_lfs_store
701
702 ; Filesystem location where Mercurial largefile objects should be stored
703 vcs.hg.largefiles.storage_location = /var/opt/rhodecode_repo_store/.cache/hg_largefiles_store
704
650 ; ####################################################
705 ; ####################################################
651 ; Subversion proxy support (mod_dav_svn)
706 ; Subversion proxy support (mod_dav_svn)
652 ; Maps RhodeCode repo groups into SVN paths for Apache
707 ; Maps RhodeCode repo groups into SVN paths for Apache
653 ; ####################################################
708 ; ####################################################
654
709
655 ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
710 ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
656 ; Set a numeric version for your current SVN, e.g. 1.8 or 1.12
711 ; Set a numeric version for your current SVN, e.g. 1.8 or 1.12
657 ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
712 ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
658 #vcs.svn.compatible_version = 1.8
713 #vcs.svn.compatible_version = 1.8
659
714
660 ; Redis connection settings for svn integrations logic
715 ; Redis connection settings for svn integrations logic
661 ; This connection string needs to be the same on ce and vcsserver
661 ; This connection string needs to be the same on CE and VCSServer
716 ; This connection string needs to be the same on CE and VCSServer
717 vcs.svn.redis_conn = redis://redis:6379/0
663
718
664 ; Enable SVN proxy of requests over HTTP
719 ; Enable SVN proxy of requests over HTTP
665 vcs.svn.proxy.enabled = true
720 vcs.svn.proxy.enabled = true
666
721
667 ; host to connect to running SVN subsystem
722 ; host to connect to running SVN subsystem
668 vcs.svn.proxy.host = http://svn:8090
723 vcs.svn.proxy.host = http://svn:8090
669
724
670 ; Enable or disable the config file generation.
725 ; Enable or disable the config file generation.
671 svn.proxy.generate_config = true
726 svn.proxy.generate_config = true
672
727
673 ; Generate config file with `SVNListParentPath` set to `On`.
728 ; Generate config file with `SVNListParentPath` set to `On`.
674 svn.proxy.list_parent_path = true
729 svn.proxy.list_parent_path = true
675
730
676 ; Set location and file name of generated config file.
731 ; Set location and file name of generated config file.
677 svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf
732 svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf
678
733
679 ; alternative mod_dav config template. This needs to be a valid mako template
734 ; alternative mod_dav config template. This needs to be a valid mako template
680 ; Example template can be found in the source code:
735 ; Example template can be found in the source code:
681 ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
736 ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
682 #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako
737 #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako
683
738
684 ; Used as a prefix to the `Location` block in the generated config file.
739 ; Used as a prefix to the `Location` block in the generated config file.
685 ; In most cases it should be set to `/`.
740 ; In most cases it should be set to `/`.
686 svn.proxy.location_root = /
741 svn.proxy.location_root = /
687
742
688 ; Command to reload the mod dav svn configuration on change.
743 ; Command to reload the mod dav svn configuration on change.
689 ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
744 ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
690 ; Make sure user who runs RhodeCode process is allowed to reload Apache
745 ; Make sure user who runs RhodeCode process is allowed to reload Apache
691 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
746 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
692
747
693 ; If the timeout expires before the reload command finishes, the command will
748 ; If the timeout expires before the reload command finishes, the command will
694 ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
749 ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
695 #svn.proxy.reload_timeout = 10
750 #svn.proxy.reload_timeout = 10
696
751
697 ; ####################
752 ; ####################
698 ; SSH Support Settings
753 ; SSH Support Settings
699 ; ####################
754 ; ####################
700
755
701 ; Defines if a custom authorized_keys file should be created and written on
756 ; Defines if a custom authorized_keys file should be created and written on
702 ; any change of user SSH keys. Setting this to false also disables the
757 ; any change of user SSH keys. Setting this to false also disables the
703 ; possibility of users adding SSH keys from the web interface. Super admins
758 ; possibility of users adding SSH keys from the web interface. Super admins
704 ; can still manage SSH keys.
759 ; can still manage SSH keys.
705 ssh.generate_authorized_keyfile = true
760 ssh.generate_authorized_keyfile = true
706
761
707 ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
762 ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
708 # ssh.authorized_keys_ssh_opts =
763 # ssh.authorized_keys_ssh_opts =
709
764
710 ; Path to the authorized_keys file where the generated entries are placed.
765 ; Path to the authorized_keys file where the generated entries are placed.
711 ; It is possible to have multiple key files specified in `sshd_config` e.g.
766 ; It is possible to have multiple key files specified in `sshd_config` e.g.
712 ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
767 ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
713 ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode
768 ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode
714
769
715 ; Command to execute the SSH wrapper. The binary is available in the
770 ; Command to execute the SSH wrapper. The binary is available in the
716 ; RhodeCode installation directory.
771 ; RhodeCode installation directory.
717 ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
772 ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
718 ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
773 ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
719 ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
774 #DEPRECATED: ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
775 ssh.wrapper_cmd.v2 = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
720
776
721 ; Allow shell when executing the ssh-wrapper command
777 ; Allow shell when executing the ssh-wrapper command
722 ssh.wrapper_cmd_allow_shell = false
778 ssh.wrapper_cmd_allow_shell = false
723
779
724 ; Enables logging and detailed output sent back to the client during SSH
780 ; Enables logging and detailed output sent back to the client during SSH
725 ; operations. Useful for debugging, shouldn't be used in production.
781 ; operations. Useful for debugging, shouldn't be used in production.
726 ssh.enable_debug_logging = true
782 ssh.enable_debug_logging = true
727
783
728 ; Paths to the binary executables; by default these are just the binary names,
784 ; Paths to the binary executables; by default these are just the binary names,
729 ; but they can be overridden to use custom ones
785 ; but they can be overridden to use custom ones
730 ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg
786 ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg
731 ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git
787 ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git
732 ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve
788 ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve
733
789
734 ; Enables SSH key generator web interface. Disabling this still allows users
790 ; Enables SSH key generator web interface. Disabling this still allows users
735 ; to add their own keys.
791 ; to add their own keys.
736 ssh.enable_ui_key_generator = true
792 ssh.enable_ui_key_generator = true
737
793
738 ; Statsd client config, this is used to send metrics to statsd
794 ; Statsd client config, this is used to send metrics to statsd
739 ; We recommend setting up statsd_exporter and scraping the metrics with Prometheus
795 ; We recommend setting up statsd_exporter and scraping the metrics with Prometheus
740 #statsd.enabled = false
796 #statsd.enabled = false
741 #statsd.statsd_host = 0.0.0.0
797 #statsd.statsd_host = 0.0.0.0
742 #statsd.statsd_port = 8125
798 #statsd.statsd_port = 8125
743 #statsd.statsd_prefix =
799 #statsd.statsd_prefix =
744 #statsd.statsd_ipv6 = false
800 #statsd.statsd_ipv6 = false
745
801
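For reference, the commented-out host/port/prefix settings above correspond to
a standard statsd client. A minimal sketch, assuming the `statsd` package from
PyPI (an illustration only, not something this config requires):

    from statsd import StatsClient

    # substitute your statsd.statsd_host / statsd_port / statsd_prefix values
    client = StatsClient(host='127.0.0.1', port=8125, prefix=None)
    client.incr('rhodecode.example_counter')   # fire-and-forget UDP metric
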
746 ; Configure logging automatically at server startup. Set to false
802 ; Configure logging automatically at server startup. Set to false
747 ; to use the custom logging config below.
803 ; to use the custom logging config below.
748 ; RC_LOGGING_FORMATTER
804 ; RC_LOGGING_FORMATTER
749 ; RC_LOGGING_LEVEL
805 ; RC_LOGGING_LEVEL
750 ; these env variables can control the logging settings when autoconfigure is used
806 ; these env variables can control the logging settings when autoconfigure is used
751
807
752 #logging.autoconfigure = true
808 #logging.autoconfigure = true
753
809
754 ; specify your own custom logging config file to configure logging
810 ; specify your own custom logging config file to configure logging
755 #logging.logging_conf_file = /path/to/custom_logging.ini
811 #logging.logging_conf_file = /path/to/custom_logging.ini
756
812
757 ; Dummy marker to add new entries after.
813 ; Dummy marker to add new entries after.
758 ; Add any custom entries below. Please don't remove this marker.
814 ; Add any custom entries below. Please don't remove this marker.
759 custom.conf = 1
815 custom.conf = 1
760
816
761
817
762 ; #####################
818 ; #####################
763 ; LOGGING CONFIGURATION
819 ; LOGGING CONFIGURATION
764 ; #####################
820 ; #####################
765
821
766 [loggers]
822 [loggers]
767 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
823 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
768
824
769 [handlers]
825 [handlers]
770 keys = console, console_sql
826 keys = console, console_sql
771
827
772 [formatters]
828 [formatters]
773 keys = generic, json, color_formatter, color_formatter_sql
829 keys = generic, json, color_formatter, color_formatter_sql
774
830
775 ; #######
831 ; #######
776 ; LOGGERS
832 ; LOGGERS
777 ; #######
833 ; #######
778 [logger_root]
834 [logger_root]
779 level = NOTSET
835 level = NOTSET
780 handlers = console
836 handlers = console
781
837
782 [logger_sqlalchemy]
838 [logger_sqlalchemy]
783 level = INFO
839 level = INFO
784 handlers = console_sql
840 handlers = console_sql
785 qualname = sqlalchemy.engine
841 qualname = sqlalchemy.engine
786 propagate = 0
842 propagate = 0
787
843
788 [logger_beaker]
844 [logger_beaker]
789 level = DEBUG
845 level = DEBUG
790 handlers =
846 handlers =
791 qualname = beaker.container
847 qualname = beaker.container
792 propagate = 1
848 propagate = 1
793
849
794 [logger_rhodecode]
850 [logger_rhodecode]
795 level = DEBUG
851 level = DEBUG
796 handlers =
852 handlers =
797 qualname = rhodecode
853 qualname = rhodecode
798 propagate = 1
854 propagate = 1
799
855
800 [logger_ssh_wrapper]
856 [logger_ssh_wrapper]
801 level = DEBUG
857 level = DEBUG
802 handlers =
858 handlers =
803 qualname = ssh_wrapper
859 qualname = ssh_wrapper
804 propagate = 1
860 propagate = 1
805
861
806 [logger_celery]
862 [logger_celery]
807 level = DEBUG
863 level = DEBUG
808 handlers =
864 handlers =
809 qualname = celery
865 qualname = celery
810
866
811
867
812 ; ########
868 ; ########
813 ; HANDLERS
869 ; HANDLERS
814 ; ########
870 ; ########
815
871
816 [handler_console]
872 [handler_console]
817 class = StreamHandler
873 class = StreamHandler
818 args = (sys.stderr, )
874 args = (sys.stderr, )
819 level = DEBUG
875 level = DEBUG
820 ; To enable JSON formatted logs replace 'generic/color_formatter' with 'json'
876 ; To enable JSON formatted logs replace 'generic/color_formatter' with 'json'
821 ; This allows sending properly formatted logs to grafana loki or elasticsearch
877 ; This allows sending properly formatted logs to grafana loki or elasticsearch
822 formatter = color_formatter
878 formatter = color_formatter
823
879
824 [handler_console_sql]
880 [handler_console_sql]
825 ; "level = DEBUG" logs SQL queries and results.
881 ; "level = DEBUG" logs SQL queries and results.
826 ; "level = INFO" logs SQL queries.
882 ; "level = INFO" logs SQL queries.
827 ; "level = WARN" logs neither. (Recommended for production systems.)
883 ; "level = WARN" logs neither. (Recommended for production systems.)
828 class = StreamHandler
884 class = StreamHandler
829 args = (sys.stderr, )
885 args = (sys.stderr, )
830 level = WARN
886 level = WARN
831 ; To enable JSON formatted logs replace 'generic/color_formatter_sql' with 'json'
887 ; To enable JSON formatted logs replace 'generic/color_formatter_sql' with 'json'
832 ; This allows sending properly formatted logs to grafana loki or elasticsearch
888 ; This allows sending properly formatted logs to grafana loki or elasticsearch
833 formatter = color_formatter_sql
889 formatter = color_formatter_sql
834
890
835 ; ##########
891 ; ##########
836 ; FORMATTERS
892 ; FORMATTERS
837 ; ##########
893 ; ##########
838
894
839 [formatter_generic]
895 [formatter_generic]
840 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
896 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
841 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
897 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
842 datefmt = %Y-%m-%d %H:%M:%S
898 datefmt = %Y-%m-%d %H:%M:%S
843
899
844 [formatter_color_formatter]
900 [formatter_color_formatter]
845 class = rhodecode.lib.logging_formatter.ColorFormatter
901 class = rhodecode.lib.logging_formatter.ColorFormatter
846 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
902 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
847 datefmt = %Y-%m-%d %H:%M:%S
903 datefmt = %Y-%m-%d %H:%M:%S
848
904
849 [formatter_color_formatter_sql]
905 [formatter_color_formatter_sql]
850 class = rhodecode.lib.logging_formatter.ColorFormatterSql
906 class = rhodecode.lib.logging_formatter.ColorFormatterSql
851 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
907 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
852 datefmt = %Y-%m-%d %H:%M:%S
908 datefmt = %Y-%m-%d %H:%M:%S
853
909
854 [formatter_json]
910 [formatter_json]
855 format = %(timestamp)s %(levelname)s %(name)s %(message)s %(req_id)s
911 format = %(timestamp)s %(levelname)s %(name)s %(message)s %(req_id)s
856 class = rhodecode.lib._vendor.jsonlogger.JsonFormatter
912 class = rhodecode.lib._vendor.jsonlogger.JsonFormatter
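
Because the [loggers], [handlers] and [formatters] sections above follow the
stdlib fileConfig layout, the same .ini file can be loaded directly when
troubleshooting logging. A rough sketch (the path is hypothetical, and the
custom formatter classes require the rhodecode package to be importable):

    import logging.config

    logging.config.fileConfig('/etc/rhodecode/conf/rhodecode.ini',
                              disable_existing_loggers=False)
    logging.getLogger('rhodecode').debug('logging configured from the ini file')
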
@@ -1,520 +1,545 b''
1 """
1 """
2 Gunicorn config extension and hooks. This config file adds some extra settings and memory management.
2 Gunicorn config extension and hooks. This config file adds some extra settings and memory management.
3 Gunicorn configuration should be managed by .ini files entries of RhodeCode or VCSServer
3 Gunicorn configuration should be managed by .ini files entries of RhodeCode or VCSServer
4 """
4 """
5
5
6 import gc
6 import gc
7 import os
7 import os
8 import sys
8 import sys
9 import math
9 import math
10 import time
10 import time
11 import threading
11 import threading
12 import traceback
12 import traceback
13 import random
13 import random
14 import socket
14 import socket
15 import dataclasses
15 import dataclasses
16 import json
16 from gunicorn.glogging import Logger
17 from gunicorn.glogging import Logger
17
18
18
19
19 def get_workers():
20 def get_workers():
20 import multiprocessing
21 import multiprocessing
21 return multiprocessing.cpu_count() * 2 + 1
22 return multiprocessing.cpu_count() * 2 + 1
22
23
23
24
24 bind = "127.0.0.1:10020"
25 bind = "127.0.0.1:10020"
25
26
26
27
27 # Error logging output for gunicorn (-) is stdout
28 # Error logging output for gunicorn (-) is stdout
28 errorlog = '-'
29 errorlog = '-'
29
30
30 # Access logging output for gunicorn (-) is stdout
31 # Access logging output for gunicorn (-) is stdout
31 accesslog = '-'
32 accesslog = '-'
32
33
33
34
34 # SERVER MECHANICS
35 # SERVER MECHANICS
35 # None == system temp dir
36 # None == system temp dir
36 # worker_tmp_dir is recommended to be set to some tmpfs
37 # worker_tmp_dir is recommended to be set to some tmpfs
37 worker_tmp_dir = None
38 worker_tmp_dir = None
38 tmp_upload_dir = None
39 tmp_upload_dir = None
39
40
40 # use re-use port logic
41 # use re-use port logic to let linux internals load-balance the requests better.
41 #reuse_port = True
42 reuse_port = True
42
43
43 # Custom log format
44 # Custom log format
44 #access_log_format = (
45 #access_log_format = (
45 # '%(t)s %(p)s INFO [GNCRN] %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"')
46 # '%(t)s %(p)s INFO [GNCRN] %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"')
46
47
47 # loki format for easier parsing in grafana
48 # loki format for easier parsing in grafana
48 access_log_format = (
49 loki_access_log_format = (
49 'time="%(t)s" pid=%(p)s level="INFO" type="[GNCRN]" ip="%(h)-15s" rqt="%(L)s" response_code="%(s)s" response_bytes="%(b)-6s" uri="%(m)s:%(U)s %(q)s" user=":%(u)s" user_agent="%(a)s"')
50 'time="%(t)s" pid=%(p)s level="INFO" type="[GNCRN]" ip="%(h)-15s" rqt="%(L)s" response_code="%(s)s" response_bytes="%(b)-6s" uri="%(m)s:%(U)s %(q)s" user=":%(u)s" user_agent="%(a)s"')
50
51
52 # JSON format
53 json_access_log_format = json.dumps({
54 'time': r'%(t)s',
55 'pid': r'%(p)s',
56 'level': 'INFO',
57 'ip': r'%(h)s',
58 'request_time': r'%(L)s',
59 'remote_address': r'%(h)s',
60 'user_name': r'%(u)s',
61 'status': r'%(s)s',
62 'method': r'%(m)s',
63 'url_path': r'%(U)s',
64 'query_string': r'%(q)s',
65 'protocol': r'%(H)s',
66 'response_length': r'%(B)s',
67 'referer': r'%(f)s',
68 'user_agent': r'%(a)s',
69
70 })
71
72 access_log_format = loki_access_log_format
73 if os.environ.get('RC_LOGGING_FORMATTER') == 'json':
74 access_log_format = json_access_log_format
75
51 # self-adjust workers based on CPU count, to use the CPUs fully without over-allocating resources
76 # self-adjust workers based on CPU count, to use the CPUs fully without over-allocating resources
52 # workers = get_workers()
77 # workers = get_workers()
53
78
54 # Gunicorn access log level
79 # Gunicorn access log level
55 loglevel = 'info'
80 loglevel = 'info'
56
81
57 # Process name visible in a process list
82 # Process name visible in a process list
58 proc_name = 'rhodecode_enterprise'
83 proc_name = 'rhodecode_enterprise'
59
84
60 # Type of worker class, one of `sync`, `gevent` or `gthread`
85 # Type of worker class, one of `sync`, `gevent` or `gthread`
61 # currently `sync` is the only option allowed for vcsserver; for rhodecode all 3 are allowed
86 # currently `sync` is the only option allowed for vcsserver; for rhodecode all 3 are allowed
62 # gevent:
87 # gevent:
63 # In this case, the maximum number of concurrent requests is (N workers * X worker_connections)
88 # In this case, the maximum number of concurrent requests is (N workers * X worker_connections)
64 # e.g. workers =3 worker_connections=10 = 3*10, 30 concurrent requests can be handled
89 # e.g. workers =3 worker_connections=10 = 3*10, 30 concurrent requests can be handled
65 # gthread:
90 # gthread:
66 # In this case, the maximum number of concurrent requests is (N workers * X threads)
91 # In this case, the maximum number of concurrent requests is (N workers * X threads)
67 # e.g. workers = 3 threads=3 = 3*3, 9 concurrent requests can be handled
92 # e.g. workers = 3 threads=3 = 3*3, 9 concurrent requests can be handled
68 worker_class = 'gthread'
93 worker_class = 'gthread'
69
94
70 # Sets the number of process workers. More workers means more concurrent connections
95 # Sets the number of process workers. More workers means more concurrent connections
71 # RhodeCode can handle at the same time. Each additional worker also increases
96 # RhodeCode can handle at the same time. Each additional worker also increases
72 # memory usage as each has its own set of caches.
97 # memory usage as each has its own set of caches.
73 # The recommended value is (2 * NUMBER_OF_CPUS + 1), e.g. 2 CPUs = 5 workers, but no more
98 # The recommended value is (2 * NUMBER_OF_CPUS + 1), e.g. 2 CPUs = 5 workers, but no more
74 # than 8-10 except for huge deployments, e.g. 700-1000 users.
99 # than 8-10 except for huge deployments, e.g. 700-1000 users.
75 # `instance_id = *` must be set in the [app:main] section below (which is the default)
100 # `instance_id = *` must be set in the [app:main] section below (which is the default)
76 # when using more than 1 worker.
101 # when using more than 1 worker.
77 workers = 2
102 workers = 2
78
103
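# Illustrative sketch only, not part of the original config: how the
# "2 * CPUs + 1, but no more than 8-10" recommendation above could be applied
# (similar to get_workers() at the top of this file, just capped).
def _recommended_workers(cap=10):
    import multiprocessing
    return min(multiprocessing.cpu_count() * 2 + 1, cap)
# e.g. on a 2-CPU machine: min(5, 10) == 5 workers
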
79 # Threads numbers for worker class gthread
104 # Threads numbers for worker class gthread
80 threads = 1
105 threads = 1
81
106
82 # The maximum number of simultaneous clients. Valid only for gevent
107 # The maximum number of simultaneous clients. Valid only for gevent
83 # In this case, the maximum number of concurrent requests is (N workers * X worker_connections)
108 # In this case, the maximum number of concurrent requests is (N workers * X worker_connections)
84 # e.g workers =3 worker_connections=10 = 3*10, 30 concurrent requests can be handled
109 # e.g workers =3 worker_connections=10 = 3*10, 30 concurrent requests can be handled
85 worker_connections = 10
110 worker_connections = 10
86
111
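# Illustrative only (values taken from the settings above): effective request
# concurrency per worker class.
def _effective_concurrency():
    return {
        'gthread': workers * threads,              # 2 * 1  = 2 concurrent requests
        'gevent': workers * worker_connections,    # 2 * 10 = 20 concurrent requests
    }
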
87 # Max number of requests that a worker will handle before being gracefully restarted.
112 # Max number of requests that a worker will handle before being gracefully restarted.
88 # Prevents memory leaks, jitter adds variability so not all workers are restarted at once.
113 # Prevents memory leaks, jitter adds variability so not all workers are restarted at once.
89 max_requests = 2000
114 max_requests = 2000
90 max_requests_jitter = int(max_requests * 0.2) # 20% of max_requests
115 max_requests_jitter = int(max_requests * 0.2) # 20% of max_requests
91
116
92 # The maximum number of pending connections.
117 # The maximum number of pending connections.
93 # Exceeding this number results in the client getting an error when attempting to connect.
118 # Exceeding this number results in the client getting an error when attempting to connect.
94 backlog = 64
119 backlog = 64
95
120
96 # The amount of time a worker can spend handling a request before it
121 # The amount of time a worker can spend handling a request before it
97 # gets killed and restarted. By default, set to 21600 (6hrs)
122 # gets killed and restarted. By default, set to 21600 (6hrs)
98 # Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h)
123 # Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h)
99 timeout = 21600
124 timeout = 21600
100
125
101 # The maximum size of HTTP request line in bytes.
126 # The maximum size of HTTP request line in bytes.
102 # 0 for unlimited
127 # 0 for unlimited
103 limit_request_line = 0
128 limit_request_line = 0
104
129
105 # Limit the number of HTTP headers fields in a request.
130 # Limit the number of HTTP headers fields in a request.
106 # By default this value is 100 and can't be larger than 32768.
131 # By default this value is 100 and can't be larger than 32768.
107 limit_request_fields = 32768
132 limit_request_fields = 32768
108
133
109 # Limit the allowed size of an HTTP request header field.
134 # Limit the allowed size of an HTTP request header field.
110 # Value is a positive number or 0.
135 # Value is a positive number or 0.
111 # Setting it to 0 will allow unlimited header field sizes.
136 # Setting it to 0 will allow unlimited header field sizes.
112 limit_request_field_size = 0
137 limit_request_field_size = 0
113
138
114 # Timeout for graceful workers restart.
139 # Timeout for graceful workers restart.
115 # After receiving a restart signal, workers have this much time to finish
140 # After receiving a restart signal, workers have this much time to finish
116 # serving requests. Workers still alive after the timeout (starting from the
141 # serving requests. Workers still alive after the timeout (starting from the
117 # receipt of the restart signal) are force killed.
142 # receipt of the restart signal) are force killed.
118 # Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h)
143 # Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h)
119 graceful_timeout = 21600
144 graceful_timeout = 21600
120
145
121 # The number of seconds to wait for requests on a Keep-Alive connection.
146 # The number of seconds to wait for requests on a Keep-Alive connection.
122 # Generally set in the 1-5 seconds range.
147 # Generally set in the 1-5 seconds range.
123 keepalive = 2
148 keepalive = 2
124
149
125 # Maximum memory usage that each worker can use before it will receive a
150 # Maximum memory usage that each worker can use before it will receive a
126 # graceful restart signal. 0 = memory monitoring is disabled
151 # graceful restart signal. 0 = memory monitoring is disabled
127 # Examples: 268435456 (256MB), 536870912 (512MB)
152 # Examples: 268435456 (256MB), 536870912 (512MB)
128 # 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
153 # 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
129 # Dynamic formula 1024 * 1024 * 256 == 256MBs
154 # Dynamic formula 1024 * 1024 * 256 == 256MBs
130 memory_max_usage = 0
155 memory_max_usage = 0
131
156
132 # How often in seconds to check for memory usage for each gunicorn worker
157 # How often in seconds to check for memory usage for each gunicorn worker
133 memory_usage_check_interval = 60
158 memory_usage_check_interval = 60
134
159
135 # Threshold below which we don't recycle a worker if garbage collection
160 # Threshold below which we don't recycle a worker if garbage collection
136 # frees up enough resources. Before each restart we try to run GC on the worker;
161 # frees up enough resources. Before each restart we try to run GC on the worker;
137 # if enough memory is freed after that, the restart will not happen.
162 # if enough memory is freed after that, the restart will not happen.
138 memory_usage_recovery_threshold = 0.8
163 memory_usage_recovery_threshold = 0.8
139
164
140
165
141 @dataclasses.dataclass
166 @dataclasses.dataclass
142 class MemoryCheckConfig:
167 class MemoryCheckConfig:
143 max_usage: int
168 max_usage: int
144 check_interval: int
169 check_interval: int
145 recovery_threshold: float
170 recovery_threshold: float
146
171
147
172
148 def _get_process_rss(pid=None):
173 def _get_process_rss(pid=None):
149 try:
174 try:
150 import psutil
175 import psutil
151 if pid:
176 if pid:
152 proc = psutil.Process(pid)
177 proc = psutil.Process(pid)
153 else:
178 else:
154 proc = psutil.Process()
179 proc = psutil.Process()
155 return proc.memory_info().rss
180 return proc.memory_info().rss
156 except Exception:
181 except Exception:
157 return None
182 return None
158
183
159
184
160 def _get_config(ini_path):
185 def _get_config(ini_path):
161 import configparser
186 import configparser
162
187
163 try:
188 try:
164 config = configparser.RawConfigParser()
189 config = configparser.RawConfigParser()
165 config.read(ini_path)
190 config.read(ini_path)
166 return config
191 return config
167 except Exception:
192 except Exception:
168 return None
193 return None
169
194
170
195
171 def get_memory_usage_params(config=None):
196 def get_memory_usage_params(config=None):
172 # memory spec defaults
197 # memory spec defaults
173 _memory_max_usage = memory_max_usage
198 _memory_max_usage = memory_max_usage
174 _memory_usage_check_interval = memory_usage_check_interval
199 _memory_usage_check_interval = memory_usage_check_interval
175 _memory_usage_recovery_threshold = memory_usage_recovery_threshold
200 _memory_usage_recovery_threshold = memory_usage_recovery_threshold
176
201
177 if config:
202 if config:
178 ini_path = os.path.abspath(config)
203 ini_path = os.path.abspath(config)
179 conf = _get_config(ini_path)
204 conf = _get_config(ini_path)
180
205
181 section = 'server:main'
206 section = 'server:main'
182 if conf and conf.has_section(section):
207 if conf and conf.has_section(section):
183
208
184 if conf.has_option(section, 'memory_max_usage'):
209 if conf.has_option(section, 'memory_max_usage'):
185 _memory_max_usage = conf.getint(section, 'memory_max_usage')
210 _memory_max_usage = conf.getint(section, 'memory_max_usage')
186
211
187 if conf.has_option(section, 'memory_usage_check_interval'):
212 if conf.has_option(section, 'memory_usage_check_interval'):
188 _memory_usage_check_interval = conf.getint(section, 'memory_usage_check_interval')
213 _memory_usage_check_interval = conf.getint(section, 'memory_usage_check_interval')
189
214
190 if conf.has_option(section, 'memory_usage_recovery_threshold'):
215 if conf.has_option(section, 'memory_usage_recovery_threshold'):
191 _memory_usage_recovery_threshold = conf.getfloat(section, 'memory_usage_recovery_threshold')
216 _memory_usage_recovery_threshold = conf.getfloat(section, 'memory_usage_recovery_threshold')
192
217
193 _memory_max_usage = int(os.environ.get('RC_GUNICORN_MEMORY_MAX_USAGE', '')
218 _memory_max_usage = int(os.environ.get('RC_GUNICORN_MEMORY_MAX_USAGE', '')
194 or _memory_max_usage)
219 or _memory_max_usage)
195 _memory_usage_check_interval = int(os.environ.get('RC_GUNICORN_MEMORY_USAGE_CHECK_INTERVAL', '')
220 _memory_usage_check_interval = int(os.environ.get('RC_GUNICORN_MEMORY_USAGE_CHECK_INTERVAL', '')
196 or _memory_usage_check_interval)
221 or _memory_usage_check_interval)
197 _memory_usage_recovery_threshold = float(os.environ.get('RC_GUNICORN_MEMORY_USAGE_RECOVERY_THRESHOLD', '')
222 _memory_usage_recovery_threshold = float(os.environ.get('RC_GUNICORN_MEMORY_USAGE_RECOVERY_THRESHOLD', '')
198 or _memory_usage_recovery_threshold)
223 or _memory_usage_recovery_threshold)
199
224
200 return MemoryCheckConfig(_memory_max_usage, _memory_usage_check_interval, _memory_usage_recovery_threshold)
225 return MemoryCheckConfig(_memory_max_usage, _memory_usage_check_interval, _memory_usage_recovery_threshold)
201
226
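# Illustrative usage only, not executed by gunicorn: the RC_GUNICORN_*
# environment variables shown above override both the module defaults and
# any .ini values.
def _demo_memory_params():
    os.environ['RC_GUNICORN_MEMORY_MAX_USAGE'] = str(512 * 1024 * 1024)  # 512 MB
    conf = get_memory_usage_params()
    assert conf.max_usage == 536870912  # the env override wins
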
202
227
203 def _time_with_offset(check_interval):
228 def _time_with_offset(check_interval):
204 return time.time() - random.randint(0, int(check_interval / 2))
229 return time.time() - random.randint(0, int(check_interval / 2))
205
230
206
231
207 def pre_fork(server, worker):
232 def pre_fork(server, worker):
208 pass
233 pass
209
234
210
235
211 def post_fork(server, worker):
236 def post_fork(server, worker):
212
237
213 memory_conf = get_memory_usage_params()
238 memory_conf = get_memory_usage_params()
214 _memory_max_usage = memory_conf.max_usage
239 _memory_max_usage = memory_conf.max_usage
215 _memory_usage_check_interval = memory_conf.check_interval
240 _memory_usage_check_interval = memory_conf.check_interval
216 _memory_usage_recovery_threshold = memory_conf.recovery_threshold
241 _memory_usage_recovery_threshold = memory_conf.recovery_threshold
217
242
218 worker._memory_max_usage = int(os.environ.get('RC_GUNICORN_MEMORY_MAX_USAGE', '')
243 worker._memory_max_usage = int(os.environ.get('RC_GUNICORN_MEMORY_MAX_USAGE', '')
219 or _memory_max_usage)
244 or _memory_max_usage)
220 worker._memory_usage_check_interval = int(os.environ.get('RC_GUNICORN_MEMORY_USAGE_CHECK_INTERVAL', '')
245 worker._memory_usage_check_interval = int(os.environ.get('RC_GUNICORN_MEMORY_USAGE_CHECK_INTERVAL', '')
221 or _memory_usage_check_interval)
246 or _memory_usage_check_interval)
222 worker._memory_usage_recovery_threshold = float(os.environ.get('RC_GUNICORN_MEMORY_USAGE_RECOVERY_THRESHOLD', '')
247 worker._memory_usage_recovery_threshold = float(os.environ.get('RC_GUNICORN_MEMORY_USAGE_RECOVERY_THRESHOLD', '')
223 or _memory_usage_recovery_threshold)
248 or _memory_usage_recovery_threshold)
224
249
225 # register memory last check time, with some random offset so we don't recycle all
250 # register memory last check time, with some random offset so we don't recycle all
226 # at once
251 # at once
227 worker._last_memory_check_time = _time_with_offset(_memory_usage_check_interval)
252 worker._last_memory_check_time = _time_with_offset(_memory_usage_check_interval)
228
253
229 if _memory_max_usage:
254 if _memory_max_usage:
230 server.log.info("pid=[%-10s] WORKER spawned with max memory set at %s", worker.pid,
255 server.log.info("pid=[%-10s] WORKER spawned with max memory set at %s", worker.pid,
231 _format_data_size(_memory_max_usage))
256 _format_data_size(_memory_max_usage))
232 else:
257 else:
233 server.log.info("pid=[%-10s] WORKER spawned", worker.pid)
258 server.log.info("pid=[%-10s] WORKER spawned", worker.pid)
234
259
235
260
236 def pre_exec(server):
261 def pre_exec(server):
237 server.log.info("Forked child, re-executing.")
262 server.log.info("Forked child, re-executing.")
238
263
239
264
240 def on_starting(server):
265 def on_starting(server):
241 server_lbl = '{} {}'.format(server.proc_name, server.address)
266 server_lbl = '{} {}'.format(server.proc_name, server.address)
242 server.log.info("Server %s is starting.", server_lbl)
267 server.log.info("Server %s is starting.", server_lbl)
243 server.log.info('Config:')
268 server.log.info('Config:')
244 server.log.info(f"\n{server.cfg}")
269 server.log.info(f"\n{server.cfg}")
245 server.log.info(get_memory_usage_params())
270 server.log.info(get_memory_usage_params())
246
271
247
272
248 def when_ready(server):
273 def when_ready(server):
249 server.log.info("Server %s is ready. Spawning workers", server)
274 server.log.info("Server %s is ready. Spawning workers", server)
250
275
251
276
252 def on_reload(server):
277 def on_reload(server):
253 pass
278 pass
254
279
255
280
256 def _format_data_size(size, unit="B", precision=1, binary=True):
281 def _format_data_size(size, unit="B", precision=1, binary=True):
257 """Format a number using SI units (kilo, mega, etc.).
282 """Format a number using SI units (kilo, mega, etc.).
258
283
259 ``size``: The number as a float or int.
284 ``size``: The number as a float or int.
260
285
261 ``unit``: The unit name in plural form. Examples: "bytes", "B".
286 ``unit``: The unit name in plural form. Examples: "bytes", "B".
262
287
263 ``precision``: How many digits to the right of the decimal point. Default
288 ``precision``: How many digits to the right of the decimal point. Default
264 is 1. 0 suppresses the decimal point.
289 is 1. 0 suppresses the decimal point.
265
290
266 ``binary``: If false, use base-10 decimal prefixes (kilo = K = 1000).
291 ``binary``: If false, use base-10 decimal prefixes (kilo = K = 1000).
267 If true, use base-2 binary prefixes (kibi = Ki = 1024).
292 If true, use base-2 binary prefixes (kibi = Ki = 1024).
268
293
269 ``full_name``: If false (default), use the prefix abbreviation ("k" or
294 ``full_name``: If false (default), use the prefix abbreviation ("k" or
270 "Ki"). If true, use the full prefix ("kilo" or "kibi"). If false,
295 "Ki"). If true, use the full prefix ("kilo" or "kibi"). If false,
271 use abbreviation ("k" or "Ki").
296 use abbreviation ("k" or "Ki").
272
297
273 """
298 """
274
299
275 if not binary:
300 if not binary:
276 base = 1000
301 base = 1000
277 multiples = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y')
302 multiples = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y')
278 else:
303 else:
279 base = 1024
304 base = 1024
280 multiples = ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi')
305 multiples = ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi')
281
306
282 sign = ""
307 sign = ""
283 if size > 0:
308 if size > 0:
284 m = int(math.log(size, base))
309 m = int(math.log(size, base))
285 elif size < 0:
310 elif size < 0:
286 sign = "-"
311 sign = "-"
287 size = -size
312 size = -size
288 m = int(math.log(size, base))
313 m = int(math.log(size, base))
289 else:
314 else:
290 m = 0
315 m = 0
291 if m > 8:
316 if m > 8:
292 m = 8
317 m = 8
293
318
294 if m == 0:
319 if m == 0:
295 precision = '%.0f'
320 precision = '%.0f'
296 else:
321 else:
297 precision = '%%.%df' % precision
322 precision = '%%.%df' % precision
298
323
299 size = precision % (size / math.pow(base, m))
324 size = precision % (size / math.pow(base, m))
300
325
301 return '%s%s %s%s' % (sign, size.strip(), multiples[m], unit)
326 return '%s%s %s%s' % (sign, size.strip(), multiples[m], unit)
302
327
303
328
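# Illustrative sanity check only, not used by gunicorn; the expected strings
# follow from the binary/base-1024 default of _format_data_size.
def _demo_format_data_size():
    assert _format_data_size(268435456) == '256.0 MiB'  # 256 * 1024 * 1024 bytes
    assert _format_data_size(1536) == '1.5 KiB'
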
304 def _check_memory_usage(worker):
329 def _check_memory_usage(worker):
305 _memory_max_usage = worker._memory_max_usage
330 _memory_max_usage = worker._memory_max_usage
306 if not _memory_max_usage:
331 if not _memory_max_usage:
307 return
332 return
308
333
309 _memory_usage_check_interval = worker._memory_usage_check_interval
334 _memory_usage_check_interval = worker._memory_usage_check_interval
310 _memory_usage_recovery_threshold = _memory_max_usage * worker._memory_usage_recovery_threshold
335 _memory_usage_recovery_threshold = _memory_max_usage * worker._memory_usage_recovery_threshold
311
336
312 elapsed = time.time() - worker._last_memory_check_time
337 elapsed = time.time() - worker._last_memory_check_time
313 if elapsed > _memory_usage_check_interval:
338 if elapsed > _memory_usage_check_interval:
314 mem_usage = _get_process_rss()
339 mem_usage = _get_process_rss()
315 if mem_usage and mem_usage > _memory_max_usage:
340 if mem_usage and mem_usage > _memory_max_usage:
316 worker.log.info(
341 worker.log.info(
317 "memory usage %s > %s, forcing gc",
342 "memory usage %s > %s, forcing gc",
318 _format_data_size(mem_usage), _format_data_size(_memory_max_usage))
343 _format_data_size(mem_usage), _format_data_size(_memory_max_usage))
319 # Try to clean it up by forcing a full collection.
344 # Try to clean it up by forcing a full collection.
320 gc.collect()
345 gc.collect()
321 mem_usage = _get_process_rss()
346 mem_usage = _get_process_rss()
322 if mem_usage > _memory_usage_recovery_threshold:
347 if mem_usage > _memory_usage_recovery_threshold:
323 # Didn't clean up enough, we'll have to terminate.
348 # Didn't clean up enough, we'll have to terminate.
324 worker.log.warning(
349 worker.log.warning(
325 "memory usage %s > %s after gc, quitting",
350 "memory usage %s > %s after gc, quitting",
326 _format_data_size(mem_usage), _format_data_size(_memory_max_usage))
351 _format_data_size(mem_usage), _format_data_size(_memory_max_usage))
327 # This will cause worker to auto-restart itself
352 # This will cause worker to auto-restart itself
328 worker.alive = False
353 worker.alive = False
329 worker._last_memory_check_time = time.time()
354 worker._last_memory_check_time = time.time()
330
355
331
356
332 def worker_int(worker):
357 def worker_int(worker):
333 worker.log.info("pid=[%-10s] worker received INT or QUIT signal", worker.pid)
358 worker.log.info("pid=[%-10s] worker received INT or QUIT signal", worker.pid)
334
359
335 # get traceback info, when a worker crashes
360 # get traceback info, when a worker crashes
336 def get_thread_id(t_id):
361 def get_thread_id(t_id):
337 id2name = dict([(th.ident, th.name) for th in threading.enumerate()])
362 id2name = dict([(th.ident, th.name) for th in threading.enumerate()])
338 return id2name.get(t_id, "unknown_thread_id")
363 return id2name.get(t_id, "unknown_thread_id")
339
364
340 code = []
365 code = []
341 for thread_id, stack in sys._current_frames().items(): # noqa
366 for thread_id, stack in sys._current_frames().items(): # noqa
342 code.append(
367 code.append(
343 "\n# Thread: %s(%d)" % (get_thread_id(thread_id), thread_id))
368 "\n# Thread: %s(%d)" % (get_thread_id(thread_id), thread_id))
344 for fname, lineno, name, line in traceback.extract_stack(stack):
369 for fname, lineno, name, line in traceback.extract_stack(stack):
345 code.append('File: "%s", line %d, in %s' % (fname, lineno, name))
370 code.append('File: "%s", line %d, in %s' % (fname, lineno, name))
346 if line:
371 if line:
347 code.append(" %s" % (line.strip()))
372 code.append(" %s" % (line.strip()))
348 worker.log.debug("\n".join(code))
373 worker.log.debug("\n".join(code))
349
374
350
375
351 def worker_abort(worker):
376 def worker_abort(worker):
352 worker.log.info("pid=[%-10s] worker received SIGABRT signal", worker.pid)
377 worker.log.info("pid=[%-10s] worker received SIGABRT signal", worker.pid)
353
378
354
379
355 def worker_exit(server, worker):
380 def worker_exit(server, worker):
356 worker.log.info("pid=[%-10s] worker exit", worker.pid)
381 worker.log.info("pid=[%-10s] worker exit", worker.pid)
357
382
358
383
359 def child_exit(server, worker):
384 def child_exit(server, worker):
360 worker.log.info("pid=[%-10s] worker child exit", worker.pid)
385 worker.log.info("pid=[%-10s] worker child exit", worker.pid)
361
386
362
387
363 def pre_request(worker, req):
388 def pre_request(worker, req):
364 worker.start_time = time.time()
389 worker.start_time = time.time()
365 worker.log.debug(
390 worker.log.debug(
366 "GNCRN PRE WORKER [cnt:%s]: %s %s", worker.nr, req.method, req.path)
391 "GNCRN PRE WORKER [cnt:%s]: %s %s", worker.nr, req.method, req.path)
367
392
368
393
369 def post_request(worker, req, environ, resp):
394 def post_request(worker, req, environ, resp):
370 total_time = time.time() - worker.start_time
395 total_time = time.time() - worker.start_time
371 # Gunicorn sometimes has problems with reading the status_code
396 # Gunicorn sometimes has problems with reading the status_code
372 status_code = getattr(resp, 'status_code', '')
397 status_code = getattr(resp, 'status_code', '')
373 worker.log.debug(
398 worker.log.debug(
374 "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.4fs",
399 "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.4fs",
375 worker.nr, req.method, req.path, status_code, total_time)
400 worker.nr, req.method, req.path, status_code, total_time)
376 _check_memory_usage(worker)
401 _check_memory_usage(worker)
377
402
378
403
379 def _filter_proxy(ip):
404 def _filter_proxy(ip):
380 """
405 """
381 Passed in IP addresses in HEADERS can be in a special format of multiple
406 Passed in IP addresses in HEADERS can be in a special format of multiple
382 ips. Those comma separated IPs are passed from various proxies in the
407 ips. Those comma separated IPs are passed from various proxies in the
383 chain of request processing. The left-most being the original client.
408 chain of request processing. The left-most being the original client.
384 We only care about the first IP, which came from the original client.
409 We only care about the first IP, which came from the original client.
385
410
386 :param ip: ip string from headers
411 :param ip: ip string from headers
387 """
412 """
388 if ',' in ip:
413 if ',' in ip:
389 _ips = ip.split(',')
414 _ips = ip.split(',')
390 _first_ip = _ips[0].strip()
415 _first_ip = _ips[0].strip()
391 return _first_ip
416 return _first_ip
392 return ip
417 return ip
393
418
394
419
395 def _filter_port(ip):
420 def _filter_port(ip):
396 """
421 """
397 Removes a port from ip, there are 4 main cases to handle here.
422 Removes a port from ip, there are 4 main cases to handle here.
398 - ipv4 eg. 127.0.0.1
423 - ipv4 eg. 127.0.0.1
399 - ipv6 eg. ::1
424 - ipv6 eg. ::1
400 - ipv4+port eg. 127.0.0.1:8080
425 - ipv4+port eg. 127.0.0.1:8080
401 - ipv6+port eg. [::1]:8080
426 - ipv6+port eg. [::1]:8080
402
427
403 :param ip:
428 :param ip:
404 """
429 """
405 def is_ipv6(ip_addr):
430 def is_ipv6(ip_addr):
406 if hasattr(socket, 'inet_pton'):
431 if hasattr(socket, 'inet_pton'):
407 try:
432 try:
408 socket.inet_pton(socket.AF_INET6, ip_addr)
433 socket.inet_pton(socket.AF_INET6, ip_addr)
409 except socket.error:
434 except socket.error:
410 return False
435 return False
411 else:
436 else:
412 return False
437 return False
413 return True
438 return True
414
439
415 if ':' not in ip: # must be ipv4 pure ip
440 if ':' not in ip: # must be ipv4 pure ip
416 return ip
441 return ip
417
442
418 if '[' in ip and ']' in ip: # ipv6 with port
443 if '[' in ip and ']' in ip: # ipv6 with port
419 return ip.split(']')[0][1:].lower()
444 return ip.split(']')[0][1:].lower()
420
445
421 # must be ipv6 or ipv4 with port
446 # must be ipv6 or ipv4 with port
422 if is_ipv6(ip):
447 if is_ipv6(ip):
423 return ip
448 return ip
424 else:
449 else:
425 ip, _port = ip.split(':')[:2] # means ipv4+port
450 ip, _port = ip.split(':')[:2] # means ipv4+port
426 return ip
451 return ip
427
452
428
453
429 def get_ip_addr(environ):
454 def get_ip_addr(environ):
430 proxy_key = 'HTTP_X_REAL_IP'
455 proxy_key = 'HTTP_X_REAL_IP'
431 proxy_key2 = 'HTTP_X_FORWARDED_FOR'
456 proxy_key2 = 'HTTP_X_FORWARDED_FOR'
432 def_key = 'REMOTE_ADDR'
457 def_key = 'REMOTE_ADDR'
433
458
434 def _filters(x):
459 def _filters(x):
435 return _filter_port(_filter_proxy(x))
460 return _filter_port(_filter_proxy(x))
436
461
437 ip = environ.get(proxy_key)
462 ip = environ.get(proxy_key)
438 if ip:
463 if ip:
439 return _filters(ip)
464 return _filters(ip)
440
465
441 ip = environ.get(proxy_key2)
466 ip = environ.get(proxy_key2)
442 if ip:
467 if ip:
443 return _filters(ip)
468 return _filters(ip)
444
469
445 ip = environ.get(def_key, '0.0.0.0')
470 ip = environ.get(def_key, '0.0.0.0')
446 return _filters(ip)
471 return _filters(ip)
447
472
448
473
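# Illustrative sketch only, not used by gunicorn: X-Real-IP takes precedence
# over X-Forwarded-For, proxy chains keep only the left-most (client) address,
# and ports are stripped for both IPv4 and bracketed IPv6.
def _demo_get_ip_addr():
    assert get_ip_addr({'HTTP_X_FORWARDED_FOR': '10.0.0.1:443, 192.168.1.1'}) == '10.0.0.1'
    assert get_ip_addr({'HTTP_X_REAL_IP': '[::1]:8080'}) == '::1'
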
449 class RhodeCodeLogger(Logger):
474 class RhodeCodeLogger(Logger):
450 """
475 """
451 Custom Logger that allows some customization that gunicorn doesn't allow
476 Custom Logger that allows some customization that gunicorn doesn't allow
452 """
477 """
453
478
454 datefmt = r"%Y-%m-%d %H:%M:%S"
479 datefmt = r"%Y-%m-%d %H:%M:%S"
455
480
456 def __init__(self, cfg):
481 def __init__(self, cfg):
457 Logger.__init__(self, cfg)
482 Logger.__init__(self, cfg)
458
483
459 def now(self):
484 def now(self):
460 """ return date in RhodeCode Log format """
485 """ return date in RhodeCode Log format """
461 now = time.time()
486 now = time.time()
462 msecs = int((now - int(now)) * 1000)
487 msecs = int((now - int(now)) * 1000)
463 return time.strftime(self.datefmt, time.localtime(now)) + '.{0:03d}'.format(msecs)
488 return time.strftime(self.datefmt, time.localtime(now)) + '.{0:03d}'.format(msecs)
464
489
465 def atoms(self, resp, req, environ, request_time):
490 def atoms(self, resp, req, environ, request_time):
466 """ Gets atoms for log formatting.
491 """ Gets atoms for log formatting.
467 """
492 """
468 status = resp.status
493 status = resp.status
469 if isinstance(status, str):
494 if isinstance(status, str):
470 status = status.split(None, 1)[0]
495 status = status.split(None, 1)[0]
471 atoms = {
496 atoms = {
472 'h': get_ip_addr(environ),
497 'h': get_ip_addr(environ),
473 'l': '-',
498 'l': '-',
474 'u': self._get_user(environ) or '-',
499 'u': self._get_user(environ) or '-',
475 't': self.now(),
500 't': self.now(),
476 'r': "%s %s %s" % (environ['REQUEST_METHOD'],
501 'r': "%s %s %s" % (environ['REQUEST_METHOD'],
477 environ['RAW_URI'],
502 environ['RAW_URI'],
478 environ["SERVER_PROTOCOL"]),
503 environ["SERVER_PROTOCOL"]),
479 's': status,
504 's': status,
480 'm': environ.get('REQUEST_METHOD'),
505 'm': environ.get('REQUEST_METHOD'),
481 'U': environ.get('PATH_INFO'),
506 'U': environ.get('PATH_INFO'),
482 'q': environ.get('QUERY_STRING'),
507 'q': environ.get('QUERY_STRING'),
483 'H': environ.get('SERVER_PROTOCOL'),
508 'H': environ.get('SERVER_PROTOCOL'),
484 'b': getattr(resp, 'sent', None) is not None and str(resp.sent) or '-',
509 'b': getattr(resp, 'sent', None) is not None and str(resp.sent) or '-',
485 'B': getattr(resp, 'sent', None),
510 'B': getattr(resp, 'sent', None),
486 'f': environ.get('HTTP_REFERER', '-'),
511 'f': environ.get('HTTP_REFERER', '-'),
487 'a': environ.get('HTTP_USER_AGENT', '-'),
512 'a': environ.get('HTTP_USER_AGENT', '-'),
488 'T': request_time.seconds,
513 'T': request_time.seconds,
489 'D': (request_time.seconds * 1000000) + request_time.microseconds,
514 'D': (request_time.seconds * 1000000) + request_time.microseconds,
490 'M': (request_time.seconds * 1000) + int(request_time.microseconds/1000),
515 'M': (request_time.seconds * 1000) + int(request_time.microseconds/1000),
491 'L': "%d.%06d" % (request_time.seconds, request_time.microseconds),
516 'L': "%d.%06d" % (request_time.seconds, request_time.microseconds),
492 'p': "<%s>" % os.getpid()
517 'p': "<%s>" % os.getpid()
493 }
518 }
494
519
495 # add request headers
520 # add request headers
496 if hasattr(req, 'headers'):
521 if hasattr(req, 'headers'):
497 req_headers = req.headers
522 req_headers = req.headers
498 else:
523 else:
499 req_headers = req
524 req_headers = req
500
525
501 if hasattr(req_headers, "items"):
526 if hasattr(req_headers, "items"):
502 req_headers = req_headers.items()
527 req_headers = req_headers.items()
503
528
504 atoms.update({"{%s}i" % k.lower(): v for k, v in req_headers})
529 atoms.update({"{%s}i" % k.lower(): v for k, v in req_headers})
505
530
506 resp_headers = resp.headers
531 resp_headers = resp.headers
507 if hasattr(resp_headers, "items"):
532 if hasattr(resp_headers, "items"):
508 resp_headers = resp_headers.items()
533 resp_headers = resp_headers.items()
509
534
510 # add response headers
535 # add response headers
511 atoms.update({"{%s}o" % k.lower(): v for k, v in resp_headers})
536 atoms.update({"{%s}o" % k.lower(): v for k, v in resp_headers})
512
537
513 # add environ variables
538 # add environ variables
514 environ_variables = environ.items()
539 environ_variables = environ.items()
515 atoms.update({"{%s}e" % k.lower(): v for k, v in environ_variables})
540 atoms.update({"{%s}e" % k.lower(): v for k, v in environ_variables})
516
541
517 return atoms
542 return atoms
518
543
519
544
520 logger_class = RhodeCodeLogger
545 logger_class = RhodeCodeLogger
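The custom atoms built above (request headers as {name}i, response headers as {name}o and WSGI environ entries as {name}e, next to gunicorn's standard atoms) only show up in the logs if the access log format references them. A minimal sketch of how that could look in this same gunicorn config file, assuming x-request-id is a header your proxy actually sets:

    # Illustrative only: access log format wired to RhodeCodeLogger's atoms;
    # %({x-request-id}i)s reads a request header, %(L)s is the request time in seconds.
    access_log_format = (
        '%(t)s %(h)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s" %(L)s %({x-request-id}i)s'
    )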
@@ -1,824 +1,880 b''
1
1
2 ; #########################################
2 ; #########################################
3 ; RHODECODE COMMUNITY EDITION CONFIGURATION
3 ; RHODECODE COMMUNITY EDITION CONFIGURATION
4 ; #########################################
4 ; #########################################
5
5
6 [DEFAULT]
6 [DEFAULT]
7 ; Debug flag sets all loggers to debug, and enables request tracking
7 ; Debug flag sets all loggers to debug, and enables request tracking
8 debug = false
8 debug = false
9
9
10 ; ########################################################################
10 ; ########################################################################
11 ; EMAIL CONFIGURATION
11 ; EMAIL CONFIGURATION
12 ; These settings will be used by the RhodeCode mailing system
12 ; These settings will be used by the RhodeCode mailing system
13 ; ########################################################################
13 ; ########################################################################
14
14
15 ; prefix all emails subjects with given prefix, helps filtering out emails
15 ; prefix all emails subjects with given prefix, helps filtering out emails
16 #email_prefix = [RhodeCode]
16 #email_prefix = [RhodeCode]
17
17
18 ; email FROM address all mails will be sent
18 ; email FROM address all mails will be sent
19 #app_email_from = rhodecode-noreply@localhost
19 #app_email_from = rhodecode-noreply@localhost
20
20
21 #smtp_server = mail.server.com
21 #smtp_server = mail.server.com
22 #smtp_username =
22 #smtp_username =
23 #smtp_password =
23 #smtp_password =
24 #smtp_port =
24 #smtp_port =
25 #smtp_use_tls = false
25 #smtp_use_tls = false
26 #smtp_use_ssl = true
26 #smtp_use_ssl = true
27
27
28 [server:main]
28 [server:main]
29 ; COMMON HOST/IP CONFIG, This applies mostly to develop setup,
29 ; COMMON HOST/IP CONFIG, This applies mostly to develop setup,
30 ; Host port for gunicorn are controlled by gunicorn_conf.py
30 ; Host port for gunicorn are controlled by gunicorn_conf.py
31 host = 127.0.0.1
31 host = 127.0.0.1
32 port = 10020
32 port = 10020
33
33
34
34
35 ; ###########################
35 ; ###########################
36 ; GUNICORN APPLICATION SERVER
36 ; GUNICORN APPLICATION SERVER
37 ; ###########################
37 ; ###########################
38
38
39 ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini
39 ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini
40
40
41 ; Module to use, this setting shouldn't be changed
41 ; Module to use, this setting shouldn't be changed
42 use = egg:gunicorn#main
42 use = egg:gunicorn#main
43
43
44 ; Prefix middleware for RhodeCode.
44 ; Prefix middleware for RhodeCode.
45 ; recommended when using proxy setup.
45 ; recommended when using proxy setup.
46 ; allows setting RhodeCode under a prefix on the server.
46 ; allows setting RhodeCode under a prefix on the server.
47 ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
47 ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
48 ; And set your prefix like: `prefix = /custom_prefix`
48 ; And set your prefix like: `prefix = /custom_prefix`
49 ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need
49 ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need
50 ; to make your cookies only work on prefix url
50 ; to make your cookies only work on prefix url
51 [filter:proxy-prefix]
51 [filter:proxy-prefix]
52 use = egg:PasteDeploy#prefix
52 use = egg:PasteDeploy#prefix
53 prefix = /
53 prefix = /
54
54
55 [app:main]
55 [app:main]
56 ; The %(here)s variable will be replaced with the absolute path of parent directory
56 ; The %(here)s variable will be replaced with the absolute path of parent directory
57 ; of this file
57 ; of this file
58 ; Each option in the app:main can be overridden by an environment variable
58 ; Each option in the app:main can be overridden by an environment variable
59 ;
59 ;
60 ;To override an option:
60 ;To override an option:
61 ;
61 ;
62 ;RC_<KeyName>
62 ;RC_<KeyName>
63 ;Everything should be uppercase, . and - should be replaced by _.
63 ;Everything should be uppercase, . and - should be replaced by _.
64 ;For example, if you have these configuration settings:
64 ;For example, if you have these configuration settings:
65 ;rc_cache.repo_object.backend = foo
65 ;rc_cache.repo_object.backend = foo
66 ;can be overridden by
66 ;can be overridden by
67 ;export RC_CACHE_REPO_OBJECT_BACKEND=foo
67 ;export RC_CACHE_REPO_OBJECT_BACKEND=foo
68
68
69 use = egg:rhodecode-enterprise-ce
69 use = egg:rhodecode-enterprise-ce
70
70
71 ; enable proxy prefix middleware, defined above
71 ; enable proxy prefix middleware, defined above
72 #filter-with = proxy-prefix
72 #filter-with = proxy-prefix
73
73
74 ; encryption key used to encrypt social plugin tokens,
74 ; encryption key used to encrypt social plugin tokens,
75 ; remote_urls with credentials etc, if not set it defaults to
75 ; remote_urls with credentials etc, if not set it defaults to
76 ; `beaker.session.secret`
76 ; `beaker.session.secret`
77 #rhodecode.encrypted_values.secret =
77 #rhodecode.encrypted_values.secret =
78
78
79 ; decryption strict mode (enabled by default). It controls if decryption raises
79 ; decryption strict mode (enabled by default). It controls if decryption raises
80 ; `SignatureVerificationError` in case of wrong key, or damaged encryption data.
80 ; `SignatureVerificationError` in case of wrong key, or damaged encryption data.
81 #rhodecode.encrypted_values.strict = false
81 #rhodecode.encrypted_values.strict = false
82
82
83 ; Pick algorithm for encryption. Either fernet (more secure) or aes (default)
83 ; Pick algorithm for encryption. Either fernet (more secure) or aes (default)
84 ; fernet is safer, and we strongly recommend switching to it.
84 ; fernet is safer, and we strongly recommend switching to it.
85 ; Due to backward compatibility aes is used as default.
85 ; Due to backward compatibility aes is used as default.
86 #rhodecode.encrypted_values.algorithm = fernet
86 #rhodecode.encrypted_values.algorithm = fernet
87
87
88 ; Return gzipped responses from RhodeCode (static files/application)
88 ; Return gzipped responses from RhodeCode (static files/application)
89 gzip_responses = false
89 gzip_responses = false
90
90
91 ; Auto-generate javascript routes file on startup
91 ; Auto-generate javascript routes file on startup
92 generate_js_files = false
92 generate_js_files = false
93
93
94 ; System global default language.
94 ; System global default language.
95 ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh
95 ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh
96 lang = en
96 lang = en
97
97
98 ; Perform a full repository scan and import on each server start.
98 ; Perform a full repository scan and import on each server start.
99 ; Setting this to true could lead to a very long startup time.
99 ; Setting this to true could lead to a very long startup time.
100 startup.import_repos = false
100 startup.import_repos = false
101
101
102 ; URL at which the application is running. This is used for Bootstrapping
102 ; URL at which the application is running. This is used for Bootstrapping
103 ; requests in context when no web request is available. Used in ishell, or
103 ; requests in context when no web request is available. Used in ishell, or
104 ; SSH calls. Set this for events to receive proper url for SSH calls.
104 ; SSH calls. Set this for events to receive proper url for SSH calls.
105 app.base_url = http://rhodecode.local
105 app.base_url = http://rhodecode.local
106
106
107 ; Host at which the Service API is running.
107 ; Host at which the Service API is running.
108 app.service_api.host = http://rhodecode.local:10020
108 app.service_api.host = http://rhodecode.local:10020
109
109
110 ; Secret for Service API authentication.
110 ; Secret for Service API authentication.
111 app.service_api.token =
111 app.service_api.token =
112
112
113 ; Unique application ID. Should be a random unique string for security.
113 ; Unique application ID. Should be a random unique string for security.
114 app_instance_uuid = rc-production
114 app_instance_uuid = rc-production
115
115
116 ; Cut off limit for large diffs (size in bytes). If overall diff size on
116 ; Cut off limit for large diffs (size in bytes). If overall diff size on
117 ; commit or pull request exceeds this limit, the diff will be displayed
117 ; commit or pull request exceeds this limit, the diff will be displayed
118 ; partially. E.g 512000 == 512Kb
118 ; partially. E.g 512000 == 512Kb
119 cut_off_limit_diff = 512000
119 cut_off_limit_diff = 512000
120
120
121 ; Cut off limit for large files inside diffs (size in bytes). Each individual
121 ; Cut off limit for large files inside diffs (size in bytes). Each individual
122 ; file inside diff which exceeds this limit will be displayed partially.
122 ; file inside diff which exceeds this limit will be displayed partially.
123 ; E.g 128000 == 128Kb
123 ; E.g 128000 == 128Kb
124 cut_off_limit_file = 128000
124 cut_off_limit_file = 128000
125
125
126 ; Use cached version of vcs repositories everywhere. Recommended to be `true`
126 ; Use cached version of vcs repositories everywhere. Recommended to be `true`
127 vcs_full_cache = true
127 vcs_full_cache = true
128
128
129 ; Force https in RhodeCode, fixes https redirects, assumes it's always https.
129 ; Force https in RhodeCode, fixes https redirects, assumes it's always https.
130 ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache
130 ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache
131 force_https = false
131 force_https = false
132
132
133 ; use Strict-Transport-Security headers
133 ; use Strict-Transport-Security headers
134 use_htsts = false
134 use_htsts = false
135
135
136 ; Set to true if your repos are exposed using the dumb protocol
136 ; Set to true if your repos are exposed using the dumb protocol
137 git_update_server_info = false
137 git_update_server_info = false
138
138
139 ; RSS/ATOM feed options
139 ; RSS/ATOM feed options
140 rss_cut_off_limit = 256000
140 rss_cut_off_limit = 256000
141 rss_items_per_page = 10
141 rss_items_per_page = 10
142 rss_include_diff = false
142 rss_include_diff = false
143
143
144 ; gist URL alias, used to create nicer URLs for gists. This should be a
144 ; gist URL alias, used to create nicer URLs for gists. This should be a
145 ; URL that rewrites to _admin/gists/{gistid}.
145 ; URL that rewrites to _admin/gists/{gistid}.
146 ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
146 ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
147 ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
147 ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
148 gist_alias_url =
148 gist_alias_url =
149
149
150 ; List of views (using glob pattern syntax) that AUTH TOKENS could be
150 ; List of views (using glob pattern syntax) that AUTH TOKENS could be
151 ; used for access.
151 ; used for access.
152 ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
152 ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
153 ; came from the logged-in user who owns this authentication token.
153 ; came from the logged-in user who owns this authentication token.
154 ; Additionally, the @TOKEN syntax can be used to bind the view to a specific
154 ; Additionally, the @TOKEN syntax can be used to bind the view to a specific
155 ; authentication token. Such a view is only accessible when used together
155 ; authentication token. Such a view is only accessible when used together
156 ; with this authentication token.
156 ; with this authentication token.
157 ; list of all views can be found under `/_admin/permissions/auth_token_access`
157 ; list of all views can be found under `/_admin/permissions/auth_token_access`
158 ; The list should be "," separated and on a single line.
158 ; The list should be "," separated and on a single line.
159 ; Most common views to enable:
159 ; Most common views to enable:
160
160
161 # RepoCommitsView:repo_commit_download
161 # RepoCommitsView:repo_commit_download
162 # RepoCommitsView:repo_commit_patch
162 # RepoCommitsView:repo_commit_patch
163 # RepoCommitsView:repo_commit_raw
163 # RepoCommitsView:repo_commit_raw
164 # RepoCommitsView:repo_commit_raw@TOKEN
164 # RepoCommitsView:repo_commit_raw@TOKEN
165 # RepoFilesView:repo_files_diff
165 # RepoFilesView:repo_files_diff
166 # RepoFilesView:repo_archivefile
166 # RepoFilesView:repo_archivefile
167 # RepoFilesView:repo_file_raw
167 # RepoFilesView:repo_file_raw
168 # GistView:*
168 # GistView:*
169 api_access_controllers_whitelist =
169 api_access_controllers_whitelist =
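For illustration only (this is not part of the shipped configuration): once a view such as RepoFilesView:repo_file_raw is whitelisted above, a client can authenticate a single request by appending the token to the URL. The host, repository path and token below are made-up placeholders:

    # Hypothetical call to a whitelisted view using ?auth_token=...
    import requests

    base_url = "https://rhodecode.example.com"            # placeholder host
    raw_url = f"{base_url}/myrepo/raw/tip/README.rst"     # placeholder repo/commit/path
    resp = requests.get(raw_url, params={"auth_token": "TOKEN_HASH"})
    resp.raise_for_status()
    print(resp.text[:200])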
170
170
171 ; Default encoding used to convert from and to unicode
171 ; Default encoding used to convert from and to unicode
172 ; can also be a comma-separated list of encodings in case of mixed encodings
172 ; can also be a comma-separated list of encodings in case of mixed encodings
173 default_encoding = UTF-8
173 default_encoding = UTF-8
174
174
175 ; instance-id prefix
175 ; instance-id prefix
176 ; a prefix key for this instance used for cache invalidation when running
176 ; a prefix key for this instance used for cache invalidation when running
177 ; multiple instances of RhodeCode, make sure it's globally unique for
177 ; multiple instances of RhodeCode, make sure it's globally unique for
178 ; all running RhodeCode instances. Leave empty if you don't use it
178 ; all running RhodeCode instances. Leave empty if you don't use it
179 instance_id =
179 instance_id =
180
180
181 ; Fallback authentication plugin. Set this to a plugin ID to force the usage
181 ; Fallback authentication plugin. Set this to a plugin ID to force the usage
182 ; of an authentication plugin even if it is disabled by its settings.
182 ; of an authentication plugin even if it is disabled by its settings.
183 ; This could be useful if you are unable to log in to the system due to broken
183 ; This could be useful if you are unable to log in to the system due to broken
184 ; authentication settings. Then you can enable e.g. the internal RhodeCode auth
184 ; authentication settings. Then you can enable e.g. the internal RhodeCode auth
185 ; module to log in again and fix the settings.
185 ; module to log in again and fix the settings.
186 ; Available builtin plugin IDs (hash is part of the ID):
186 ; Available builtin plugin IDs (hash is part of the ID):
187 ; egg:rhodecode-enterprise-ce#rhodecode
187 ; egg:rhodecode-enterprise-ce#rhodecode
188 ; egg:rhodecode-enterprise-ce#pam
188 ; egg:rhodecode-enterprise-ce#pam
189 ; egg:rhodecode-enterprise-ce#ldap
189 ; egg:rhodecode-enterprise-ce#ldap
190 ; egg:rhodecode-enterprise-ce#jasig_cas
190 ; egg:rhodecode-enterprise-ce#jasig_cas
191 ; egg:rhodecode-enterprise-ce#headers
191 ; egg:rhodecode-enterprise-ce#headers
192 ; egg:rhodecode-enterprise-ce#crowd
192 ; egg:rhodecode-enterprise-ce#crowd
193
193
194 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
194 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
195
195
196 ; Flag to control loading of legacy plugins in py:/path format
196 ; Flag to control loading of legacy plugins in py:/path format
197 auth_plugin.import_legacy_plugins = true
197 auth_plugin.import_legacy_plugins = true
198
198
199 ; alternative return HTTP header for failed authentication. Default HTTP
199 ; alternative return HTTP header for failed authentication. Default HTTP
200 ; response is 401 HTTPUnauthorized. Currently HG clients have trouble
200 ; response is 401 HTTPUnauthorized. Currently HG clients have trouble
201 ; handling that, causing a series of failed authentication calls.
201 ; handling that, causing a series of failed authentication calls.
202 ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
202 ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
203 ; This will be served instead of default 401 on bad authentication
203 ; This will be served instead of default 401 on bad authentication
204 auth_ret_code =
204 auth_ret_code =
205
205
206 ; use special detection method when serving auth_ret_code, instead of serving
206 ; use special detection method when serving auth_ret_code, instead of serving
207 ; ret_code directly, use 401 initially (which triggers a credentials prompt)
207 ; ret_code directly, use 401 initially (which triggers a credentials prompt)
208 ; and then serve auth_ret_code to clients
208 ; and then serve auth_ret_code to clients
209 auth_ret_code_detection = false
209 auth_ret_code_detection = false
210
210
211 ; locking return code. When repository is locked return this HTTP code. 2XX
211 ; locking return code. When repository is locked return this HTTP code. 2XX
212 ; codes don't break the transactions while 4XX codes do
212 ; codes don't break the transactions while 4XX codes do
213 lock_ret_code = 423
213 lock_ret_code = 423
214
214
215 ; Filesystem location where repositories should be stored
215 ; Filesystem location where repositories should be stored
216 repo_store.path = /var/opt/rhodecode_repo_store
216 repo_store.path = /var/opt/rhodecode_repo_store
217
217
218 ; allows setting up custom hooks in the settings page
218 ; allows setting up custom hooks in the settings page
219 allow_custom_hooks_settings = true
219 allow_custom_hooks_settings = true
220
220
221 ; Generated license token required for EE edition license.
221 ; Generated license token required for EE edition license.
222 ; New generated token value can be found in Admin > settings > license page.
222 ; New generated token value can be found in Admin > settings > license page.
223 license_token =
223 license_token =
224
224
225 ; This flag hides sensitive information on the license page such as token, and license data
225 ; This flag hides sensitive information on the license page such as token, and license data
226 license.hide_license_info = false
226 license.hide_license_info = false
227
227
228 ; Import EE license from this license path
229 #license.import_path = %(here)s/rhodecode_enterprise.license
230
231 ; import license 'if-missing' or 'force' (always override)
232 ; if-missing means apply license if it doesn't exist. 'force' option always overrides it
233 license.import_path_mode = if-missing
234
228 ; supervisor connection uri, for managing supervisor and logs.
235 ; supervisor connection uri, for managing supervisor and logs.
229 supervisor.uri =
236 supervisor.uri =
230
237
231 ; supervisord group name/id we only want this RC instance to handle
238 ; supervisord group name/id we only want this RC instance to handle
232 supervisor.group_id = prod
239 supervisor.group_id = prod
233
240
234 ; Display extended labs settings
241 ; Display extended labs settings
235 labs_settings_active = true
242 labs_settings_active = true
236
243
237 ; Custom exception store path, defaults to TMPDIR
244 ; Custom exception store path, defaults to TMPDIR
238 ; This is used to store exception from RhodeCode in shared directory
245 ; This is used to store exception from RhodeCode in shared directory
239 #exception_tracker.store_path =
246 #exception_tracker.store_path =
240
247
241 ; Send email with exception details when it happens
248 ; Send email with exception details when it happens
242 #exception_tracker.send_email = false
249 #exception_tracker.send_email = false
243
250
244 ; Comma separated list of recipients for exception emails,
251 ; Comma separated list of recipients for exception emails,
245 ; e.g admin@rhodecode.com,devops@rhodecode.com
252 ; e.g admin@rhodecode.com,devops@rhodecode.com
246 ; Can be left empty, then emails will be sent to ALL super-admins
253 ; Can be left empty, then emails will be sent to ALL super-admins
247 #exception_tracker.send_email_recipients =
254 #exception_tracker.send_email_recipients =
248
255
249 ; optional prefix to Add to email Subject
256 ; optional prefix to Add to email Subject
250 #exception_tracker.email_prefix = [RHODECODE ERROR]
257 #exception_tracker.email_prefix = [RHODECODE ERROR]
251
258
252 ; File store configuration. This is used to store and serve uploaded files
259 ; NOTE: this setting IS DEPRECATED:
253 file_store.enabled = true
260 ; file_store backend is always enabled
261 #file_store.enabled = true
254
262
263 ; NOTE: this setting IS DEPRECATED:
264 ; file_store.backend = X -> use `file_store.backend.type = filesystem_v2` instead
255 ; Storage backend, available options are: local
265 ; Storage backend, available options are: local
256 file_store.backend = local
266 #file_store.backend = local
257
267
268 ; NOTE: this setting IS DEPRECATED:
269 ; file_store.storage_path = X -> use `file_store.filesystem_v2.storage_path = X` instead
258 ; path to store the uploaded binaries and artifacts
270 ; path to store the uploaded binaries and artifacts
259 file_store.storage_path = /var/opt/rhodecode_data/file_store
271 #file_store.storage_path = /var/opt/rhodecode_data/file_store
272
273 ; Artifacts file-store, is used to store comment attachments and artifacts uploads.
274 ; file_store backend type: filesystem_v1, filesystem_v2 or objectstore (s3-based) are available as options
275 ; filesystem_v1 is backwards compat with pre 5.1 storage changes
276 ; new installations should choose filesystem_v2 or objectstore (s3-based); pick a filesystem backend when migrating from
277 ; previous installations to keep the artifacts without the need for a migration
278 #file_store.backend.type = filesystem_v2
279
280 ; filesystem options...
281 #file_store.filesystem_v1.storage_path = /var/opt/rhodecode_data/artifacts_file_store
282
283 ; filesystem_v2 options...
284 #file_store.filesystem_v2.storage_path = /var/opt/rhodecode_data/artifacts_file_store
285 #file_store.filesystem_v2.shards = 8
260
286
287 ; objectstore options...
288 ; url for s3 compatible storage that allows to upload artifacts
289 ; e.g http://minio:9000
290 #file_store.backend.type = objectstore
291 #file_store.objectstore.url = http://s3-minio:9000
292
293 ; a top-level bucket to put all other shards in
294 ; objects will be stored in rhodecode-file-store/shard-N based on the bucket_shards number
295 #file_store.objectstore.bucket = rhodecode-file-store
296
297 ; number of sharded buckets to create to distribute archives across
298 ; default is 8 shards
299 #file_store.objectstore.bucket_shards = 8
300
301 ; key for s3 auth
302 #file_store.objectstore.key = s3admin
303
304 ; secret for s3 auth
305 #file_store.objectstore.secret = s3secret4
306
307 ;region for s3 storage
308 #file_store.objectstore.region = eu-central-1
261
309
262 ; Redis url to acquire/check generation of archives locks
310 ; Redis url to acquire/check generation of archives locks
263 archive_cache.locking.url = redis://redis:6379/1
311 archive_cache.locking.url = redis://redis:6379/1
264
312
265 ; Storage backend, only 'filesystem' and 'objectstore' are available now
313 ; Storage backend, only 'filesystem' and 'objectstore' are available now
266 archive_cache.backend.type = filesystem
314 archive_cache.backend.type = filesystem
267
315
268 ; url for s3 compatible storage that allows to upload artifacts
316 ; url for s3 compatible storage that allows to upload artifacts
269 ; e.g http://minio:9000
317 ; e.g http://minio:9000
270 archive_cache.objectstore.url = http://s3-minio:9000
318 archive_cache.objectstore.url = http://s3-minio:9000
271
319
272 ; key for s3 auth
320 ; key for s3 auth
273 archive_cache.objectstore.key = key
321 archive_cache.objectstore.key = key
274
322
275 ; secret for s3 auth
323 ; secret for s3 auth
276 archive_cache.objectstore.secret = secret
324 archive_cache.objectstore.secret = secret
277
325
278 ;region for s3 storage
326 ;region for s3 storage
279 archive_cache.objectstore.region = eu-central-1
327 archive_cache.objectstore.region = eu-central-1
280
328
281 ; number of sharded buckets to create to distribute archives across
329 ; number of sharded buckets to create to distribute archives across
282 ; default is 8 shards
330 ; default is 8 shards
283 archive_cache.objectstore.bucket_shards = 8
331 archive_cache.objectstore.bucket_shards = 8
284
332
285 ; a top-level bucket to put all other shards in
333 ; a top-level bucket to put all other shards in
286 ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number
334 ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number
287 archive_cache.objectstore.bucket = rhodecode-archive-cache
335 archive_cache.objectstore.bucket = rhodecode-archive-cache
288
336
289 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
337 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
290 archive_cache.objectstore.retry = false
338 archive_cache.objectstore.retry = false
291
339
292 ; number of seconds to wait for next try using retry
340 ; number of seconds to wait for next try using retry
293 archive_cache.objectstore.retry_backoff = 1
341 archive_cache.objectstore.retry_backoff = 1
294
342
295 ; how many times to retry a fetch from this backend
343 ; how many times to retry a fetch from this backend
296 archive_cache.objectstore.retry_attempts = 10
344 archive_cache.objectstore.retry_attempts = 10
297
345
298 ; Default is $cache_dir/archive_cache if not set
346 ; Default is $cache_dir/archive_cache if not set
299 ; Generated repo archives will be cached at this location
347 ; Generated repo archives will be cached at this location
300 ; and served from the cache during subsequent requests for the same archive of
348 ; and served from the cache during subsequent requests for the same archive of
301 ; the repository. This path is important to be shared across filesystems and with
349 ; the repository. This path is important to be shared across filesystems and with
302 ; RhodeCode and vcsserver
350 ; RhodeCode and vcsserver
303 archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache
351 archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache
304
352
305 ; The limit in GB sets how much data we cache before recycling the least recently used entries, defaults to 10 GB
353 ; The limit in GB sets how much data we cache before recycling the least recently used entries, defaults to 10 GB
306 archive_cache.filesystem.cache_size_gb = 40
354 archive_cache.filesystem.cache_size_gb = 40
307
355
308 ; Eviction policy used to clear out after cache_size_gb limit is reached
356 ; Eviction policy used to clear out after cache_size_gb limit is reached
309 archive_cache.filesystem.eviction_policy = least-recently-stored
357 archive_cache.filesystem.eviction_policy = least-recently-stored
310
358
311 ; By default cache uses sharding technique, this specifies how many shards are there
359 ; By default cache uses sharding technique, this specifies how many shards are there
312 ; default is 8 shards
360 ; default is 8 shards
313 archive_cache.filesystem.cache_shards = 8
361 archive_cache.filesystem.cache_shards = 8
314
362
315 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
363 ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time
316 archive_cache.filesystem.retry = false
364 archive_cache.filesystem.retry = false
317
365
318 ; number of seconds to wait for next try using retry
366 ; number of seconds to wait for next try using retry
319 archive_cache.filesystem.retry_backoff = 1
367 archive_cache.filesystem.retry_backoff = 1
320
368
321 ; how many times to retry a fetch from this backend
369 ; how many times to retry a fetch from this backend
322 archive_cache.filesystem.retry_attempts = 10
370 archive_cache.filesystem.retry_attempts = 10
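The retry settings above describe a simple loop: attempt the fetch, sleep retry_backoff seconds after a failure, and give up after retry_attempts tries. A minimal sketch of that behaviour (an illustration of the documented semantics, not the actual backend code):

    import time

    def fetch_with_retry(fetch, retry_attempts=10, retry_backoff=1):
        """Call fetch() up to retry_attempts times, sleeping retry_backoff seconds between tries."""
        last_exc = None
        for _ in range(retry_attempts):
            try:
                return fetch()
            except OSError as exc:      # e.g. a transient filesystem or object-store error
                last_exc = exc
                time.sleep(retry_backoff)
        raise last_exc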
323
371
324
372
325 ; #############
373 ; #############
326 ; CELERY CONFIG
374 ; CELERY CONFIG
327 ; #############
375 ; #############
328
376
329 ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini
377 ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini
330
378
331 use_celery = true
379 use_celery = true
332
380
333 ; path to store schedule database
381 ; path to store schedule database
334 #celerybeat-schedule.path =
382 #celerybeat-schedule.path =
335
383
336 ; connection url to the message broker (default redis)
384 ; connection url to the message broker (default redis)
337 celery.broker_url = redis://redis:6379/8
385 celery.broker_url = redis://redis:6379/8
338
386
339 ; results backend to get results for (default redis)
387 ; results backend to get results for (default redis)
340 celery.result_backend = redis://redis:6379/8
388 celery.result_backend = redis://redis:6379/8
341
389
342 ; rabbitmq example
390 ; rabbitmq example
343 #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost
391 #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost
344
392
345 ; maximum tasks to execute before worker restart
393 ; maximum tasks to execute before worker restart
346 celery.max_tasks_per_child = 20
394 celery.max_tasks_per_child = 20
347
395
348 ; tasks will never be sent to the queue, but executed locally instead.
396 ; tasks will never be sent to the queue, but executed locally instead.
349 celery.task_always_eager = false
397 celery.task_always_eager = false
350
398
351 ; #############
399 ; #############
352 ; DOGPILE CACHE
400 ; DOGPILE CACHE
353 ; #############
401 ; #############
354
402
355 ; Default cache dir for caches. Putting this into a ramdisk can boost performance.
403 ; Default cache dir for caches. Putting this into a ramdisk can boost performance.
356 ; eg. /tmpfs/data_ramdisk, however this directory might require a large amount of space
404 ; eg. /tmpfs/data_ramdisk, however this directory might require a large amount of space
357 cache_dir = /var/opt/rhodecode_data
405 cache_dir = /var/opt/rhodecode_data
358
406
359 ; *********************************************
407 ; *********************************************
360 ; `sql_cache_short` cache for heavy SQL queries
408 ; `sql_cache_short` cache for heavy SQL queries
361 ; Only supported backend is `memory_lru`
409 ; Only supported backend is `memory_lru`
362 ; *********************************************
410 ; *********************************************
363 rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
411 rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
364 rc_cache.sql_cache_short.expiration_time = 30
412 rc_cache.sql_cache_short.expiration_time = 30
365
413
366
414
367 ; *****************************************************
415 ; *****************************************************
368 ; `cache_repo_longterm` cache for repo object instances
416 ; `cache_repo_longterm` cache for repo object instances
369 ; Only supported backend is `memory_lru`
417 ; Only supported backend is `memory_lru`
370 ; *****************************************************
418 ; *****************************************************
371 rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
419 rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
372 ; by default we use 30 Days, cache is still invalidated on push
420 ; by default we use 30 Days, cache is still invalidated on push
373 rc_cache.cache_repo_longterm.expiration_time = 2592000
421 rc_cache.cache_repo_longterm.expiration_time = 2592000
374 ; max items in LRU cache, set to smaller number to save memory, and expire last used caches
422 ; max items in LRU cache, set to smaller number to save memory, and expire last used caches
375 rc_cache.cache_repo_longterm.max_size = 10000
423 rc_cache.cache_repo_longterm.max_size = 10000
376
424
377
425
378 ; *********************************************
426 ; *********************************************
379 ; `cache_general` cache for general purpose use
427 ; `cache_general` cache for general purpose use
380 ; for simplicity use rc.file_namespace backend,
428 ; for simplicity use rc.file_namespace backend,
381 ; for performance and scale use rc.redis
429 ; for performance and scale use rc.redis
382 ; *********************************************
430 ; *********************************************
383 rc_cache.cache_general.backend = dogpile.cache.rc.file_namespace
431 rc_cache.cache_general.backend = dogpile.cache.rc.file_namespace
384 rc_cache.cache_general.expiration_time = 43200
432 rc_cache.cache_general.expiration_time = 43200
385 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
433 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
386 #rc_cache.cache_general.arguments.filename = /tmp/cache_general_db
434 #rc_cache.cache_general.arguments.filename = /tmp/cache_general_db
387
435
388 ; alternative `cache_general` redis backend with distributed lock
436 ; alternative `cache_general` redis backend with distributed lock
389 #rc_cache.cache_general.backend = dogpile.cache.rc.redis
437 #rc_cache.cache_general.backend = dogpile.cache.rc.redis
390 #rc_cache.cache_general.expiration_time = 300
438 #rc_cache.cache_general.expiration_time = 300
391
439
392 ; redis_expiration_time needs to be greater than expiration_time
440 ; redis_expiration_time needs to be greater than expiration_time
393 #rc_cache.cache_general.arguments.redis_expiration_time = 7200
441 #rc_cache.cache_general.arguments.redis_expiration_time = 7200
394
442
395 #rc_cache.cache_general.arguments.host = localhost
443 #rc_cache.cache_general.arguments.host = localhost
396 #rc_cache.cache_general.arguments.port = 6379
444 #rc_cache.cache_general.arguments.port = 6379
397 #rc_cache.cache_general.arguments.db = 0
445 #rc_cache.cache_general.arguments.db = 0
398 #rc_cache.cache_general.arguments.socket_timeout = 30
446 #rc_cache.cache_general.arguments.socket_timeout = 30
399 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
447 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
400 #rc_cache.cache_general.arguments.distributed_lock = true
448 #rc_cache.cache_general.arguments.distributed_lock = true
401
449
402 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
450 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
403 #rc_cache.cache_general.arguments.lock_auto_renewal = true
451 #rc_cache.cache_general.arguments.lock_auto_renewal = true
404
452
405 ; *************************************************
453 ; *************************************************
406 ; `cache_perms` cache for permission tree, auth TTL
454 ; `cache_perms` cache for permission tree, auth TTL
407 ; for simplicity use rc.file_namespace backend,
455 ; for simplicity use rc.file_namespace backend,
408 ; for performance and scale use rc.redis
456 ; for performance and scale use rc.redis
409 ; *************************************************
457 ; *************************************************
410 rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
458 rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
411 rc_cache.cache_perms.expiration_time = 3600
459 rc_cache.cache_perms.expiration_time = 3600
412 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
460 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
413 #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms_db
461 #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms_db
414
462
415 ; alternative `cache_perms` redis backend with distributed lock
463 ; alternative `cache_perms` redis backend with distributed lock
416 #rc_cache.cache_perms.backend = dogpile.cache.rc.redis
464 #rc_cache.cache_perms.backend = dogpile.cache.rc.redis
417 #rc_cache.cache_perms.expiration_time = 300
465 #rc_cache.cache_perms.expiration_time = 300
418
466
419 ; redis_expiration_time needs to be greater than expiration_time
467 ; redis_expiration_time needs to be greater than expiration_time
420 #rc_cache.cache_perms.arguments.redis_expiration_time = 7200
468 #rc_cache.cache_perms.arguments.redis_expiration_time = 7200
421
469
422 #rc_cache.cache_perms.arguments.host = localhost
470 #rc_cache.cache_perms.arguments.host = localhost
423 #rc_cache.cache_perms.arguments.port = 6379
471 #rc_cache.cache_perms.arguments.port = 6379
424 #rc_cache.cache_perms.arguments.db = 0
472 #rc_cache.cache_perms.arguments.db = 0
425 #rc_cache.cache_perms.arguments.socket_timeout = 30
473 #rc_cache.cache_perms.arguments.socket_timeout = 30
426 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
474 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
427 #rc_cache.cache_perms.arguments.distributed_lock = true
475 #rc_cache.cache_perms.arguments.distributed_lock = true
428
476
429 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
477 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
430 #rc_cache.cache_perms.arguments.lock_auto_renewal = true
478 #rc_cache.cache_perms.arguments.lock_auto_renewal = true
431
479
432 ; ***************************************************
480 ; ***************************************************
433 ; `cache_repo` cache for file tree, Readme, RSS FEEDS
481 ; `cache_repo` cache for file tree, Readme, RSS FEEDS
434 ; for simplicity use rc.file_namespace backend,
482 ; for simplicity use rc.file_namespace backend,
435 ; for performance and scale use rc.redis
483 ; for performance and scale use rc.redis
436 ; ***************************************************
484 ; ***************************************************
437 rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
485 rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
438 rc_cache.cache_repo.expiration_time = 2592000
486 rc_cache.cache_repo.expiration_time = 2592000
439 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
487 ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
440 #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo_db
488 #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo_db
441
489
442 ; alternative `cache_repo` redis backend with distributed lock
490 ; alternative `cache_repo` redis backend with distributed lock
443 #rc_cache.cache_repo.backend = dogpile.cache.rc.redis
491 #rc_cache.cache_repo.backend = dogpile.cache.rc.redis
444 #rc_cache.cache_repo.expiration_time = 2592000
492 #rc_cache.cache_repo.expiration_time = 2592000
445
493
446 ; redis_expiration_time needs to be greater than expiration_time
494 ; redis_expiration_time needs to be greater than expiration_time
447 #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400
495 #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400
448
496
449 #rc_cache.cache_repo.arguments.host = localhost
497 #rc_cache.cache_repo.arguments.host = localhost
450 #rc_cache.cache_repo.arguments.port = 6379
498 #rc_cache.cache_repo.arguments.port = 6379
451 #rc_cache.cache_repo.arguments.db = 1
499 #rc_cache.cache_repo.arguments.db = 1
452 #rc_cache.cache_repo.arguments.socket_timeout = 30
500 #rc_cache.cache_repo.arguments.socket_timeout = 30
453 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
501 ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
454 #rc_cache.cache_repo.arguments.distributed_lock = true
502 #rc_cache.cache_repo.arguments.distributed_lock = true
455
503
456 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
504 ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen
457 #rc_cache.cache_repo.arguments.lock_auto_renewal = true
505 #rc_cache.cache_repo.arguments.lock_auto_renewal = true
458
506
459 ; ##############
507 ; ##############
460 ; BEAKER SESSION
508 ; BEAKER SESSION
461 ; ##############
509 ; ##############
462
510
463 ; beaker.session.type is the storage type used for logged-in users' sessions. Currently allowed
511 ; beaker.session.type is the storage type used for logged-in users' sessions. Currently allowed
464 ; types are file, ext:redis, ext:database, ext:memcached
512 ; types are file, ext:redis, ext:database, ext:memcached
465 ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session
513 ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session
466 #beaker.session.type = file
514 #beaker.session.type = file
467 #beaker.session.data_dir = %(here)s/data/sessions
515 #beaker.session.data_dir = %(here)s/data/sessions
468
516
469 ; Redis based sessions
517 ; Redis based sessions
470 beaker.session.type = ext:redis
518 beaker.session.type = ext:redis
471 beaker.session.url = redis://redis:6379/2
519 beaker.session.url = redis://redis:6379/2
472
520
473 ; DB based session, fast, and allows easy management over logged in users
521 ; DB based session, fast, and allows easy management over logged in users
474 #beaker.session.type = ext:database
522 #beaker.session.type = ext:database
475 #beaker.session.table_name = db_session
523 #beaker.session.table_name = db_session
476 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
524 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
477 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
525 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
478 #beaker.session.sa.pool_recycle = 3600
526 #beaker.session.sa.pool_recycle = 3600
479 #beaker.session.sa.echo = false
527 #beaker.session.sa.echo = false
480
528
481 beaker.session.key = rhodecode
529 beaker.session.key = rhodecode
482 beaker.session.secret = production-rc-uytcxaz
530 beaker.session.secret = production-rc-uytcxaz
483 beaker.session.lock_dir = /data_ramdisk/lock
531 beaker.session.lock_dir = /data_ramdisk/lock
484
532
485 ; Secure encrypted cookie. Requires AES and AES python libraries
533 ; Secure encrypted cookie. Requires AES and AES python libraries
486 ; you must disable beaker.session.secret to use this
534 ; you must disable beaker.session.secret to use this
487 #beaker.session.encrypt_key = key_for_encryption
535 #beaker.session.encrypt_key = key_for_encryption
488 #beaker.session.validate_key = validation_key
536 #beaker.session.validate_key = validation_key
489
537
490 ; Sets the session as invalid (also logging out the user) if it has not been
538 ; Sets the session as invalid (also logging out the user) if it has not been
491 ; accessed for the given amount of time in seconds
539 ; accessed for the given amount of time in seconds
492 beaker.session.timeout = 2592000
540 beaker.session.timeout = 2592000
493 beaker.session.httponly = true
541 beaker.session.httponly = true
494
542
495 ; Path to use for the cookie. Set to prefix if you use prefix middleware
543 ; Path to use for the cookie. Set to prefix if you use prefix middleware
496 #beaker.session.cookie_path = /custom_prefix
544 #beaker.session.cookie_path = /custom_prefix
497
545
498 ; Set https secure cookie
546 ; Set https secure cookie
499 beaker.session.secure = false
547 beaker.session.secure = false
500
548
501 ; default cookie expiration time in seconds, set to `true` to set expire
549 ; default cookie expiration time in seconds, set to `true` to set expire
502 ; at browser close
550 ; at browser close
503 #beaker.session.cookie_expires = 3600
551 #beaker.session.cookie_expires = 3600
504
552
505 ; #############################
553 ; #############################
506 ; SEARCH INDEXING CONFIGURATION
554 ; SEARCH INDEXING CONFIGURATION
507 ; #############################
555 ; #############################
508
556
509 ; Full text search indexer is available in rhodecode-tools under
557 ; Full text search indexer is available in rhodecode-tools under
510 ; `rhodecode-tools index` command
558 ; `rhodecode-tools index` command
511
559
512 ; WHOOSH Backend, doesn't require additional services to run
560 ; WHOOSH Backend, doesn't require additional services to run
513 ; it works well with a few dozen repos
561 ; it works well with a few dozen repos
514 search.module = rhodecode.lib.index.whoosh
562 search.module = rhodecode.lib.index.whoosh
515 search.location = %(here)s/data/index
563 search.location = %(here)s/data/index
516
564
517 ; ####################
565 ; ####################
518 ; CHANNELSTREAM CONFIG
566 ; CHANNELSTREAM CONFIG
519 ; ####################
567 ; ####################
520
568
521 ; channelstream enables persistent connections and live notification
569 ; channelstream enables persistent connections and live notification
522 ; in the system. It's also used by the chat system
570 ; in the system. It's also used by the chat system
523
571
524 channelstream.enabled = true
572 channelstream.enabled = true
525
573
526 ; server address for channelstream server on the backend
574 ; server address for channelstream server on the backend
527 channelstream.server = channelstream:9800
575 channelstream.server = channelstream:9800
528
576
529 ; location of the channelstream server from outside world
577 ; location of the channelstream server from outside world
530 ; use ws:// for http or wss:// for https. This address needs to be handled
578 ; use ws:// for http or wss:// for https. This address needs to be handled
531 ; by external HTTP server such as Nginx or Apache
579 ; by external HTTP server such as Nginx or Apache
532 ; see Nginx/Apache configuration examples in our docs
580 ; see Nginx/Apache configuration examples in our docs
533 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
581 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
534 channelstream.secret = ENV_GENERATED
582 channelstream.secret = ENV_GENERATED
535 channelstream.history.location = /var/opt/rhodecode_data/channelstream_history
583 channelstream.history.location = /var/opt/rhodecode_data/channelstream_history
536
584
537 ; Internal application path that Javascript uses to connect into.
585 ; Internal application path that Javascript uses to connect into.
538 ; If you use proxy-prefix the prefix should be added before /_channelstream
586 ; If you use proxy-prefix the prefix should be added before /_channelstream
539 channelstream.proxy_path = /_channelstream
587 channelstream.proxy_path = /_channelstream
540
588
541
589
542 ; ##############################
590 ; ##############################
543 ; MAIN RHODECODE DATABASE CONFIG
591 ; MAIN RHODECODE DATABASE CONFIG
544 ; ##############################
592 ; ##############################
545
593
546 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
594 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
547 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
595 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
548 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
596 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
549 ; pymysql is an alternative driver for MySQL, use in case of problems with default one
597 ; pymysql is an alternative driver for MySQL, use in case of problems with default one
550 #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode
598 #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode
551
599
552 sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
600 sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
553
601
554 ; see sqlalchemy docs for other advanced settings
602 ; see sqlalchemy docs for other advanced settings
555 ; print the sql statements to output
603 ; print the sql statements to output
556 sqlalchemy.db1.echo = false
604 sqlalchemy.db1.echo = false
557
605
558 ; recycle the connections after this amount of seconds
606 ; recycle the connections after this amount of seconds
559 sqlalchemy.db1.pool_recycle = 3600
607 sqlalchemy.db1.pool_recycle = 3600
560
608
561 ; the number of connections to keep open inside the connection pool.
609 ; the number of connections to keep open inside the connection pool.
562 ; 0 indicates no limit
610 ; 0 indicates no limit
563 ; the general calculus with gevent is:
611 ; the general calculus with gevent is:
564 ; if your system allows 500 concurrent greenlets (max_connections) that all do database access,
612 ; if your system allows 500 concurrent greenlets (max_connections) that all do database access,
565 ; then increase pool size + max overflow so that they add up to 500.
613 ; then increase pool size + max overflow so that they add up to 500.
566 #sqlalchemy.db1.pool_size = 5
614 #sqlalchemy.db1.pool_size = 5
567
615
568 ; The number of connections to allow in connection pool "overflow", that is
616 ; The number of connections to allow in connection pool "overflow", that is
569 ; connections that can be opened above and beyond the pool_size setting,
617 ; connections that can be opened above and beyond the pool_size setting,
570 ; which defaults to five.
618 ; which defaults to five.
571 #sqlalchemy.db1.max_overflow = 10
619 #sqlalchemy.db1.max_overflow = 10
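As a worked example of the sizing rule above (the numbers are illustrative, not recommendations): if the system allows 500 concurrent greenlets that all touch the database, one valid split is pool_size = 400 with max_overflow = 100, since the two must add up to 500:

    # Illustrative sizing check only; values are made up.
    max_greenlets = 500          # concurrent greenlets doing database access
    pool_size = 400
    max_overflow = 100
    assert pool_size + max_overflow == max_greenlets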
572
620
573 ; Connection check ping, used to detect broken database connections
621 ; Connection check ping, used to detect broken database connections
574 ; could be enabled to better handle 'MySQL server has gone away' errors
622 ; could be enabled to better handle 'MySQL server has gone away' errors
575 #sqlalchemy.db1.ping_connection = true
623 #sqlalchemy.db1.ping_connection = true
576
624
577 ; ##########
625 ; ##########
578 ; VCS CONFIG
626 ; VCS CONFIG
579 ; ##########
627 ; ##########
580 vcs.server.enable = true
628 vcs.server.enable = true
581 vcs.server = vcsserver:10010
629 vcs.server = vcsserver:10010
582
630
583 ; Web server connectivity protocol, responsible for web based VCS operations
631 ; Web server connectivity protocol, responsible for web based VCS operations
584 ; Available protocols are:
632 ; Available protocols are:
585 ; `http` - use http-rpc backend (default)
633 ; `http` - use http-rpc backend (default)
586 vcs.server.protocol = http
634 vcs.server.protocol = http
587
635
588 ; Push/Pull operations protocol, available options are:
636 ; Push/Pull operations protocol, available options are:
589 ; `http` - use http-rpc backend (default)
637 ; `http` - use http-rpc backend (default)
590 vcs.scm_app_implementation = http
638 vcs.scm_app_implementation = http
591
639
592 ; Push/Pull operations hooks protocol, available options are:
640 ; Push/Pull operations hooks protocol, available options are:
593 ; `http` - use http-rpc backend (default)
641 ; `http` - use http-rpc backend (default)
594 ; `celery` - use celery based hooks
642 ; `celery` - use celery based hooks
595 vcs.hooks.protocol = http
643 #DEPRECATED:vcs.hooks.protocol = http
644 vcs.hooks.protocol.v2 = celery
596
645
597 ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be
646 ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be
598 ; accessible via network.
647 ; accessible via network.
599 ; Use vcs.hooks.host = "*" to bind to current hostname (for Docker)
648 ; Use vcs.hooks.host = "*" to bind to current hostname (for Docker)
600 vcs.hooks.host = *
649 vcs.hooks.host = *
601
650
602 ; Start VCSServer with this instance as a subprocess, useful for development
651 ; Start VCSServer with this instance as a subprocess, useful for development
603 vcs.start_server = false
652 vcs.start_server = false
604
653
605 ; List of enabled VCS backends, available options are:
654 ; List of enabled VCS backends, available options are:
606 ; `hg` - mercurial
655 ; `hg` - mercurial
607 ; `git` - git
656 ; `git` - git
608 ; `svn` - subversion
657 ; `svn` - subversion
609 vcs.backends = hg, git, svn
658 vcs.backends = hg, git, svn
610
659
611 ; Wait this number of seconds before killing connection to the vcsserver
660 ; Wait this number of seconds before killing connection to the vcsserver
612 vcs.connection_timeout = 3600
661 vcs.connection_timeout = 3600
613
662
614 ; Cache flag to cache vcsserver remote calls locally
663 ; Cache flag to cache vcsserver remote calls locally
615 ; It uses cache_region `cache_repo`
664 ; It uses cache_region `cache_repo`
616 vcs.methods.cache = true
665 vcs.methods.cache = true
617
666
667 ; Filesystem location where Git lfs objects should be stored
668 vcs.git.lfs.storage_location = /var/opt/rhodecode_repo_store/.cache/git_lfs_store
669
670 ; Filesystem location where Mercurial largefile objects should be stored
671 vcs.hg.largefiles.storage_location = /var/opt/rhodecode_repo_store/.cache/hg_largefiles_store
672
618 ; ####################################################
673 ; ####################################################
619 ; Subversion proxy support (mod_dav_svn)
674 ; Subversion proxy support (mod_dav_svn)
620 ; Maps RhodeCode repo groups into SVN paths for Apache
675 ; Maps RhodeCode repo groups into SVN paths for Apache
621 ; ####################################################
676 ; ####################################################
622
677
623 ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
678 ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
624 ; Set a numeric version for your current SVN e.g 1.8, or 1.12
679 ; Set a numeric version for your current SVN e.g 1.8, or 1.12
625 ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
680 ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
626 #vcs.svn.compatible_version = 1.8
681 #vcs.svn.compatible_version = 1.8
627
682
628 ; Redis connection settings for svn integrations logic
683 ; Redis connection settings for svn integrations logic
629 ; This connection string needs to be the same on ce and vcsserver
684 ; This connection string needs to be the same on ce and vcsserver
630 vcs.svn.redis_conn = redis://redis:6379/0
685 vcs.svn.redis_conn = redis://redis:6379/0
631
686
632 ; Enable SVN proxy of requests over HTTP
687 ; Enable SVN proxy of requests over HTTP
633 vcs.svn.proxy.enabled = true
688 vcs.svn.proxy.enabled = true
634
689
635 ; host to connect to running SVN subsystem
690 ; host to connect to running SVN subsystem
636 vcs.svn.proxy.host = http://svn:8090
691 vcs.svn.proxy.host = http://svn:8090
637
692
638 ; Enable or disable the config file generation.
693 ; Enable or disable the config file generation.
639 svn.proxy.generate_config = true
694 svn.proxy.generate_config = true
640
695
641 ; Generate config file with `SVNListParentPath` set to `On`.
696 ; Generate config file with `SVNListParentPath` set to `On`.
642 svn.proxy.list_parent_path = true
697 svn.proxy.list_parent_path = true
643
698
644 ; Set location and file name of generated config file.
699 ; Set location and file name of generated config file.
645 svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf
700 svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf
646
701
647 ; alternative mod_dav config template. This needs to be a valid mako template
702 ; alternative mod_dav config template. This needs to be a valid mako template
648 ; Example template can be found in the source code:
703 ; Example template can be found in the source code:
649 ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
704 ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
650 #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako
705 #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako
651
706
652 ; Used as a prefix to the `Location` block in the generated config file.
707 ; Used as a prefix to the `Location` block in the generated config file.
653 ; In most cases it should be set to `/`.
708 ; In most cases it should be set to `/`.
654 svn.proxy.location_root = /
709 svn.proxy.location_root = /
655
710
656 ; Command to reload the mod dav svn configuration on change.
711 ; Command to reload the mod dav svn configuration on change.
657 ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
712 ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
658 ; Make sure user who runs RhodeCode process is allowed to reload Apache
713 ; Make sure user who runs RhodeCode process is allowed to reload Apache
659 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
714 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
660
715
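; Illustrative assumption, not a shipped default: if RhodeCode runs as the `rhodecode`
; user, a passwordless sudoers rule such as
;   rhodecode ALL=(root) NOPASSWD: /usr/sbin/service apache2 reload
; allows setting, for example: svn.proxy.reload_cmd = sudo /usr/sbin/service apache2 reload
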
661 ; If the timeout expires before the reload command finishes, the command will
716 ; If the timeout expires before the reload command finishes, the command will
662 ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
717 ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
663 #svn.proxy.reload_timeout = 10
718 #svn.proxy.reload_timeout = 10
664
719
665 ; ####################
720 ; ####################
666 ; SSH Support Settings
721 ; SSH Support Settings
667 ; ####################
722 ; ####################
668
723
669 ; Defines if a custom authorized_keys file should be created and written on
724 ; Defines if a custom authorized_keys file should be created and written on
670 ; any change of user ssh keys. Setting this to false also disables the possibility
725 ; any change of user ssh keys. Setting this to false also disables the possibility
671 ; of adding SSH keys by users from the web interface. Super admins can still
726 ; of adding SSH keys by users from the web interface. Super admins can still
672 ; manage SSH Keys.
727 ; manage SSH Keys.
673 ssh.generate_authorized_keyfile = true
728 ssh.generate_authorized_keyfile = true
674
729
675 ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
730 ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
676 # ssh.authorized_keys_ssh_opts =
731 # ssh.authorized_keys_ssh_opts =
677
732
678 ; Path to the authorized_keys file where the generated entries are placed.
733 ; Path to the authorized_keys file where the generated entries are placed.
679 ; It is possible to have multiple key files specified in `sshd_config` e.g.
734 ; It is possible to have multiple key files specified in `sshd_config` e.g.
680 ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
735 ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
681 ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode
736 ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode
682
737
683 ; Command to execute the SSH wrapper. The binary is available in the
738 ; Command to execute the SSH wrapper. The binary is available in the
684 ; RhodeCode installation directory.
739 ; RhodeCode installation directory.
685 ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
740 ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
686 ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
741 ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
687 ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
742 #DEPRECATED: ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper
743 ssh.wrapper_cmd.v2 = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2
688
744
689 ; Allow shell when executing the ssh-wrapper command
745 ; Allow shell when executing the ssh-wrapper command
690 ssh.wrapper_cmd_allow_shell = false
746 ssh.wrapper_cmd_allow_shell = false
691
747
692 ; Enables logging, and detailed output sent back to the client during SSH
748 ; Enables logging, and detailed output sent back to the client during SSH
693 ; operations. Useful for debugging, shouldn't be used in production.
749 ; operations. Useful for debugging, shouldn't be used in production.
694 ssh.enable_debug_logging = false
750 ssh.enable_debug_logging = false
695
751
696 ; Paths to binary executables; by default they are just the names, but we can
752 ; Paths to binary executables; by default they are just the names, but we can
697 ; override them if we want to use custom ones
753 ; override them if we want to use custom ones
698 ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg
754 ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg
699 ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git
755 ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git
700 ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve
756 ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve
701
757
702 ; Enables SSH key generator web interface. Disabling this still allows users
758 ; Enables SSH key generator web interface. Disabling this still allows users
703 ; to add their own keys.
759 ; to add their own keys.
704 ssh.enable_ui_key_generator = true
760 ssh.enable_ui_key_generator = true
705
761
706 ; Statsd client config, this is used to send metrics to statsd
762 ; Statsd client config, this is used to send metrics to statsd
707 ; We recommend setting up statsd_exporter and scraping the metrics with Prometheus
763 ; We recommend setting up statsd_exporter and scraping the metrics with Prometheus
708 #statsd.enabled = false
764 #statsd.enabled = false
709 #statsd.statsd_host = 0.0.0.0
765 #statsd.statsd_host = 0.0.0.0
710 #statsd.statsd_port = 8125
766 #statsd.statsd_port = 8125
711 #statsd.statsd_prefix =
767 #statsd.statsd_prefix =
712 #statsd.statsd_ipv6 = false
768 #statsd.statsd_ipv6 = false
713
769
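; Example values (assumption: a Prometheus statsd_exporter container named `statsd-exporter`
; on the same network; 9125 is its default statsd ingest port):
;   statsd.enabled = true
;   statsd.statsd_host = statsd-exporter
;   statsd.statsd_port = 9125
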
714 ; Configure logging automatically at server startup. Set to false
770 ; Configure logging automatically at server startup. Set to false
715 ; to use the custom logging config below.
771 ; to use the custom logging config below.
716 ; RC_LOGGING_FORMATTER
772 ; RC_LOGGING_FORMATTER
717 ; RC_LOGGING_LEVEL
773 ; RC_LOGGING_LEVEL
718 ; These env variables can control the logging settings when autoconfigure is used
774 ; These env variables can control the logging settings when autoconfigure is used
719
775
720 #logging.autoconfigure = true
776 #logging.autoconfigure = true
721
777
722 ; specify your own custom logging config file to configure logging
778 ; specify your own custom logging config file to configure logging
723 #logging.logging_conf_file = /path/to/custom_logging.ini
779 #logging.logging_conf_file = /path/to/custom_logging.ini
724
780
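; Note: the [loggers]/[handlers]/[formatters] sections further below follow the standard
; Python logging.config.fileConfig format, so a file passed via logging.logging_conf_file
; can reuse them as a starting point.
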
725 ; Dummy marker to add new entries after.
781 ; Dummy marker to add new entries after.
726 ; Add any custom entries below. Please don't remove this marker.
782 ; Add any custom entries below. Please don't remove this marker.
727 custom.conf = 1
783 custom.conf = 1
728
784
729
785
730 ; #####################
786 ; #####################
731 ; LOGGING CONFIGURATION
787 ; LOGGING CONFIGURATION
732 ; #####################
788 ; #####################
733
789
734 [loggers]
790 [loggers]
735 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
791 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
736
792
737 [handlers]
793 [handlers]
738 keys = console, console_sql
794 keys = console, console_sql
739
795
740 [formatters]
796 [formatters]
741 keys = generic, json, color_formatter, color_formatter_sql
797 keys = generic, json, color_formatter, color_formatter_sql
742
798
743 ; #######
799 ; #######
744 ; LOGGERS
800 ; LOGGERS
745 ; #######
801 ; #######
746 [logger_root]
802 [logger_root]
747 level = NOTSET
803 level = NOTSET
748 handlers = console
804 handlers = console
749
805
750 [logger_sqlalchemy]
806 [logger_sqlalchemy]
751 level = INFO
807 level = INFO
752 handlers = console_sql
808 handlers = console_sql
753 qualname = sqlalchemy.engine
809 qualname = sqlalchemy.engine
754 propagate = 0
810 propagate = 0
755
811
756 [logger_beaker]
812 [logger_beaker]
757 level = DEBUG
813 level = DEBUG
758 handlers =
814 handlers =
759 qualname = beaker.container
815 qualname = beaker.container
760 propagate = 1
816 propagate = 1
761
817
762 [logger_rhodecode]
818 [logger_rhodecode]
763 level = DEBUG
819 level = DEBUG
764 handlers =
820 handlers =
765 qualname = rhodecode
821 qualname = rhodecode
766 propagate = 1
822 propagate = 1
767
823
768 [logger_ssh_wrapper]
824 [logger_ssh_wrapper]
769 level = DEBUG
825 level = DEBUG
770 handlers =
826 handlers =
771 qualname = ssh_wrapper
827 qualname = ssh_wrapper
772 propagate = 1
828 propagate = 1
773
829
774 [logger_celery]
830 [logger_celery]
775 level = DEBUG
831 level = DEBUG
776 handlers =
832 handlers =
777 qualname = celery
833 qualname = celery
778
834
779
835
780 ; ########
836 ; ########
781 ; HANDLERS
837 ; HANDLERS
782 ; ########
838 ; ########
783
839
784 [handler_console]
840 [handler_console]
785 class = StreamHandler
841 class = StreamHandler
786 args = (sys.stderr, )
842 args = (sys.stderr, )
787 level = INFO
843 level = INFO
788 ; To enable JSON formatted logs replace 'generic/color_formatter' with 'json'
844 ; To enable JSON formatted logs replace 'generic/color_formatter' with 'json'
789 ; This allows sending properly formatted logs to grafana loki or elasticsearch
845 ; This allows sending properly formatted logs to grafana loki or elasticsearch
790 formatter = generic
846 formatter = generic
791
847
792 [handler_console_sql]
848 [handler_console_sql]
793 ; "level = DEBUG" logs SQL queries and results.
849 ; "level = DEBUG" logs SQL queries and results.
794 ; "level = INFO" logs SQL queries.
850 ; "level = INFO" logs SQL queries.
795 ; "level = WARN" logs neither. (Recommended for production systems.)
851 ; "level = WARN" logs neither. (Recommended for production systems.)
796 class = StreamHandler
852 class = StreamHandler
797 args = (sys.stderr, )
853 args = (sys.stderr, )
798 level = WARN
854 level = WARN
799 ; To enable JSON formatted logs replace 'generic/color_formatter_sql' with 'json'
855 ; To enable JSON formatted logs replace 'generic/color_formatter_sql' with 'json'
800 ; This allows sending properly formatted logs to grafana loki or elasticsearch
856 ; This allows sending properly formatted logs to grafana loki or elasticsearch
801 formatter = generic
857 formatter = generic
802
858
803 ; ##########
859 ; ##########
804 ; FORMATTERS
860 ; FORMATTERS
805 ; ##########
861 ; ##########
806
862
807 [formatter_generic]
863 [formatter_generic]
808 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
864 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
809 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
865 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
810 datefmt = %Y-%m-%d %H:%M:%S
866 datefmt = %Y-%m-%d %H:%M:%S
811
867
812 [formatter_color_formatter]
868 [formatter_color_formatter]
813 class = rhodecode.lib.logging_formatter.ColorFormatter
869 class = rhodecode.lib.logging_formatter.ColorFormatter
814 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
870 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
815 datefmt = %Y-%m-%d %H:%M:%S
871 datefmt = %Y-%m-%d %H:%M:%S
816
872
817 [formatter_color_formatter_sql]
873 [formatter_color_formatter_sql]
818 class = rhodecode.lib.logging_formatter.ColorFormatterSql
874 class = rhodecode.lib.logging_formatter.ColorFormatterSql
819 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
875 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
820 datefmt = %Y-%m-%d %H:%M:%S
876 datefmt = %Y-%m-%d %H:%M:%S
821
877
822 [formatter_json]
878 [formatter_json]
823 format = %(timestamp)s %(levelname)s %(name)s %(message)s %(req_id)s
879 format = %(timestamp)s %(levelname)s %(name)s %(message)s %(req_id)s
824 class = rhodecode.lib._vendor.jsonlogger.JsonFormatter
880 class = rhodecode.lib._vendor.jsonlogger.JsonFormatter
@@ -1,33 +1,39 b''
1 FROM python:3.12.0-bullseye
1 FROM python:3.12.0-bullseye
2
2
3 WORKDIR /project
3 WORKDIR /project
4
4
5 RUN apt-get update \
5 RUN apt-get update \
6 && apt-get install --no-install-recommends --yes \
6 && apt-get install --no-install-recommends --yes \
7 curl \
7 curl \
8 zip \
8 zip \
9 graphviz \
9 graphviz \
10 dvipng \
10 dvipng \
11 imagemagick \
11 imagemagick \
12 make \
12 make \
13 latexmk \
13 latexmk \
14 texlive-latex-recommended \
14 texlive-latex-recommended \
15 texlive-latex-extra \
15 texlive-latex-extra \
16 texlive-xetex \
16 texlive-xetex \
17 fonts-freefont-otf \
17 fonts-freefont-otf \
18 texlive-fonts-recommended \
18 texlive-fonts-recommended \
19 texlive-lang-greek \
19 texlive-lang-greek \
20 tex-gyre \
20 tex-gyre \
21 && apt-get autoremove \
21 && apt-get autoremove \
22 && apt-get clean \
22 && apt-get clean \
23 && rm -rf /var/lib/apt/lists/*
23 && rm -rf /var/lib/apt/lists/*
24
24
25 RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
26 unzip awscliv2.zip && \
27 ./aws/install && \
28 rm -rf ./aws && \
29 rm awscliv2.zip
30
25 RUN \
31 RUN \
26 python3 -m pip install --no-cache-dir --upgrade pip && \
32 python3 -m pip install --no-cache-dir --upgrade pip && \
27 python3 -m pip install --no-cache-dir Sphinx Pillow
33 python3 -m pip install --no-cache-dir Sphinx Pillow
28
34
29 ADD requirements_docs.txt /project
35 ADD requirements_docs.txt /project
30 RUN \
36 RUN \
31 python3 -m pip install -r requirements_docs.txt
37 python3 -m pip install -r requirements_docs.txt
32
38
33 CMD ["sphinx-build", "-M", "html", ".", "_build"]
39 CMD ["sphinx-build", "-M", "html", ".", "_build"]
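A typical way to use this image (the tag name here is only an example) is to build it from the
docs directory and mount that directory as the Sphinx project, so the rendered HTML lands in
``_build`` on the host::

    docker build -t rhodecode-docs .
    docker run --rm -v "$(pwd):/project" rhodecode-docs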
@@ -1,172 +1,168 b''
1 .. _system-overview-ref:
1 .. _system-overview-ref:
2
2
3 System Overview
3 System Overview
4 ===============
4 ===============
5
5
6 Latest Version
6 Latest Version
7 --------------
7 --------------
8
8
9 * |release| on Unix and Windows systems.
9 * |release| on Unix and Windows systems.
10
10
11 System Architecture
11 System Architecture
12 -------------------
12 -------------------
13
13
14 The following diagram shows a typical production architecture.
14 The following diagram shows a typical production architecture.
15
15
16 .. image:: ../images/architecture-diagram.png
16 .. image:: ../images/architecture-diagram.png
17 :align: center
17 :align: center
18
18
19 Supported Operating Systems
19 Supported Operating Systems
20 ---------------------------
20 ---------------------------
21
21
22 Linux
22 Linux
23 ^^^^^
23 ^^^^^
24
24
25 * Ubuntu 14.04+
25 * Ubuntu 14.04+
26 * CentOS 6.2, 7 and 8
26 * CentOS 6.2, 7 and 8
27 * RHEL 6.2, 7 and 8
27 * RHEL 6.2, 7 and 8
28 * Debian 7.8
28 * Debian 7.8
29 * RedHat Fedora
29 * RedHat Fedora
30 * Arch Linux
30 * Arch Linux
31 * SUSE Linux
31 * SUSE Linux
32
32
33 Windows
33 Windows
34 ^^^^^^^
34 ^^^^^^^
35
35
36 * Windows Vista Ultimate 64bit
36 * Windows Vista Ultimate 64bit
37 * Windows 7 Ultimate 64bit
37 * Windows 7 Ultimate 64bit
38 * Windows 8 Professional 64bit
38 * Windows 8 Professional 64bit
39 * Windows 8.1 Enterprise 64bit
39 * Windows 8.1 Enterprise 64bit
40 * Windows Server 2008 64bit
40 * Windows Server 2008 64bit
41 * Windows Server 2008-R2 64bit
41 * Windows Server 2008-R2 64bit
42 * Windows Server 2012 64bit
42 * Windows Server 2012 64bit
43
43
44 Supported Databases
44 Supported Databases
45 -------------------
45 -------------------
46
46
47 * SQLite
47 * SQLite
48 * MySQL
48 * MySQL
49 * MariaDB
49 * MariaDB
50 * PostgreSQL
50 * PostgreSQL
51
51
52 Supported Browsers
52 Supported Browsers
53 ------------------
53 ------------------
54
54
55 * Chrome
55 * Chrome
56 * Safari
56 * Safari
57 * Firefox
57 * Firefox
58 * Internet Explorer 10 & 11
58 * Internet Explorer 10 & 11
59
59
60 System Requirements
60 System Requirements
61 -------------------
61 -------------------
62
62
63 |RCE| performs best on machines with ultra-fast hard disks. Generally disk
63 |RCE| performs best on machines with ultra-fast hard disks. Generally disk
64 performance is more important than CPU performance. In a corporate production
64 performance is more important than CPU performance. In a corporate production
65 environment handling 1000s of users and |repos| you should deploy on a 12+
65 environment handling 1000s of users and |repos| you should deploy on a 12+
66 core 64GB RAM server. In short, the more RAM the better.
66 core 64GB RAM server. In short, the more RAM the better.
67
67
68
68
69 For example:
69 For example:
70
70
71 - for a team of 1 - 5 active users you can run on a 1GB RAM machine with 1 CPU
71 - for a team of 1 - 5 active users you can run on a 1GB RAM machine with 1 CPU
72 - above 250 active users, |RCE| needs at least 8GB of memory.
72 - above 250 active users, |RCE| needs at least 8GB of memory.
73 Number of CPUs is less important, but it is recommended to have at least 2-3 CPUs
73 Number of CPUs is less important, but it is recommended to have at least 2-3 CPUs
74
74
75
75
76 .. _config-rce-files:
76 .. _config-rce-files:
77
77
78 Configuration Files
78 Configuration Files
79 -------------------
79 -------------------
80
80
81 * :file:`config/_shared/rhodecode.ini`
81 * :file:`config/_shared/rhodecode.ini`
82 * :file:`/home/{user}/.rccontrol/{instance-id}/search_mapping.ini`
82 * :file:`/home/{user}/.rccontrol/{instance-id}/search_mapping.ini`
83 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini`
83 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini`
84 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.ini`
84 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.ini`
85 * :file:`/home/{user}/.rccontrol.ini`
85 * :file:`/home/{user}/.rccontrol.ini`
86 * :file:`/home/{user}/.rhoderc`
86 * :file:`/home/{user}/.rhoderc`
87 * :file:`/home/{user}/.rccontrol/cache/MANIFEST`
87 * :file:`/home/{user}/.rccontrol/cache/MANIFEST`
88
88
89 For more information, see the :ref:`config-files` section.
89 For more information, see the :ref:`config-files` section.
90
90
91 Log Files
91 Log Files
92 ---------
92 ---------
93
93
94 * :file:`/home/{user}/.rccontrol/{instance-id}/enterprise.log`
94 * :file:`/home/{user}/.rccontrol/{instance-id}/enterprise.log`
95 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.log`
95 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.log`
96 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.log`
96 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.log`
97 * :file:`/tmp/rccontrol.log`
97 * :file:`/tmp/rccontrol.log`
98 * :file:`/tmp/rhodecode_tools.log`
98 * :file:`/tmp/rhodecode_tools.log`
99
99
100 Storage Files
100 Storage Files
101 -------------
101 -------------
102
102
103 * :file:`/home/{user}/.rccontrol/{instance-id}/data/index/{index-file.toc}`
103 * :file:`/home/{user}/.rccontrol/{instance-id}/data/index/{index-file.toc}`
104 * :file:`/home/{user}/repos/.rc_gist_store`
104 * :file:`/home/{user}/repos/.rc_gist_store`
105 * :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.db`
105 * :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.db`
106 * :file:`/opt/rhodecode/store/{unique-hash}`
106 * :file:`/opt/rhodecode/store/{unique-hash}`
107
107
108 Default Repositories Location
108 Default Repositories Location
109 -----------------------------
109 -----------------------------
110
110
111 * :file:`/home/{user}/repos`
111 * :file:`/home/{user}/repos`
112
112
113 Connection Methods
113 Connection Methods
114 ------------------
114 ------------------
115
115
116 * HTTPS
116 * HTTPS
117 * SSH
117 * SSH
118 * |RCE| API
118 * |RCE| API
119
119
120 Internationalization Support
120 Internationalization Support
121 ----------------------------
121 ----------------------------
122
122
123 Currently available in the following languages, see `Transifex`_ for the
123 Currently available in the following languages, see `Transifex`_ for the
124 latest details. If you want a new language added, please contact us. To
124 latest details. If you want a new language added, please contact us. To
125 configure your language settings, see the :ref:`set-lang` section.
125 configure your language settings, see the :ref:`set-lang` section.
126
126
127 .. hlist::
127 .. hlist::
128
128
129 * Belorussian
129 * Belorussian
130 * Chinese
130 * Chinese
131 * French
131 * French
132 * German
132 * German
133 * Italian
133 * Italian
134 * Japanese
134 * Japanese
135 * Portuguese
135 * Portuguese
136 * Polish
136 * Polish
137 * Russian
137 * Russian
138 * Spanish
138 * Spanish
139
139
140 Licensing Information
140 Licensing Information
141 ---------------------
141 ---------------------
142
142
143 * See licensing information `here`_
143 * See licensing information `here`_
144
144
145 Peer-to-peer Failover Support
145 Peer-to-peer Failover Support
146 -----------------------------
146 -----------------------------
147
147
148 * Yes
148 * Yes
149
149
150 Additional Binaries
151 -------------------
152
153 * Yes, see :ref:`rhodecode-nix-ref` for full details.
154
150
155 Remote Connectivity
151 Remote Connectivity
156 -------------------
152 -------------------
157
153
158 * Available
154 * Available
159
155
160 Executable Files
156 Executable Files
161 ----------------
157 ----------------
162
158
163 Windows: :file:`RhodeCode-installer-{version}.exe`
159 Windows: :file:`RhodeCode-installer-{version}.exe`
164
160
165 Deprecated Support
161 Deprecated Support
166 ------------------
162 ------------------
167
163
168 - Internet Explorer 8 support deprecated since version 3.7.0.
164 - Internet Explorer 8 support deprecated since version 3.7.0.
169 - Internet Explorer 9 support deprecated since version 3.8.0.
165 - Internet Explorer 9 support deprecated since version 3.8.0.
170
166
171 .. _here: https://rhodecode.com/licenses/
167 .. _here: https://rhodecode.com/licenses/
172 .. _Transifex: https://explore.transifex.com/rhodecode/RhodeCode/
168 .. _Transifex: https://explore.transifex.com/rhodecode/RhodeCode/
@@ -1,88 +1,90 b''
1 .. _auth-saml-bulk-enroll-users-ref:
1 .. _auth-saml-bulk-enroll-users-ref:
2
2
3
3
4 Bulk enroll multiple existing users
4 Bulk enroll multiple existing users
5 -----------------------------------
5 -----------------------------------
6
6
7
7
8 RhodeCode supports standard SAML 2.0 SSO for the web-application part.
8 RhodeCode supports standard SAML 2.0 SSO for the web-application part.
9 Below is an example of how to enroll a list of all or some users to use SAML authentication.
9 Below is an example of how to enroll a list of all or some users to use SAML authentication.
10 This method simply enables SAML authentication for many users at once.
10 This method simply enables SAML authentication for many users at once.
11
11
12
12
13 From the server where RhodeCode Enterprise is running, run ishell on the instance to which we
13 From the server where RhodeCode Enterprise is running, run ishell on the instance to which we
14 want to apply the SAML migration::
14 want to apply the SAML migration::
15
15
16 rccontrol ishell enterprise-1
16 ./rcstack cli ishell
17
17
18 Follow these steps to enable SAML authentication for multiple users.
18 Follow these steps to enable SAML authentication for multiple users.
19
19
20
20
21 1) Create a user_id => attribute mapping
21 1) Create a user_id => attribute mapping
22
22
23
23
24 `saml2user` is a mapping of the external ID from a SAML provider such as OneLogin, DuoSecurity, or Google.
24 `saml2user` is a mapping of the external ID from a SAML provider such as OneLogin, DuoSecurity, or Google.
25 This mapping consists of a local rhodecode user_id mapped to the set of required attributes needed to bind the SAML
25 This mapping consists of a local rhodecode user_id mapped to the set of required attributes needed to bind the SAML
26 account to the internal rhodecode user.
26 account to the internal rhodecode user.
27 For example, 123 is the local rhodecode user_id, and '48253211' is the OneLogin ID.
27 For example, 123 is the local rhodecode user_id, and '48253211' is the OneLogin ID.
28 For other providers you would have to figure out what the user-id is; sometimes it is the email, e.g. for Google.
28 For other providers you would have to figure out what the user-id is; sometimes it is the email, e.g. for Google.
29 Most importantly, this ID needs to be unique for each user.
29 Most importantly, this ID needs to be unique for each user.
30
30
31 .. code-block:: python
31 .. code-block:: python
32
32
33 In [1]: saml2user = {
33 In [1]: saml2user = {
34 ...: # OneLogin, uses externalID available to read from in the UI
34 ...: # OneLogin, uses externalID available to read from in the UI
35 ...: 123: {'id': '48253211'},
35 ...: 123: {'id': '48253211'},
36 ...: # for Google/DuoSecurity email is also an option for unique ID
36 ...: # for Google/DuoSecurity email is also an option for unique ID
37 ...: 124: {'id': 'email@domain.com'},
37 ...: 124: {'id': 'email@domain.com'},
38 ...: }
38 ...: }
39
39
40
40
41 2) Import the plugin you want to run migration for.
41 2) Import the plugin you want to run migration for.
42
42
43 From available options pick only one and run the `import` statement
43 From available options pick only one and run the `import` statement
44
44
45 .. code-block:: python
45 .. code-block:: python
46
46
47 # for Duo Security
47 # for Duo Security
48 In [2]: from rc_auth_plugins.auth_duo_security import RhodeCodeAuthPlugin
48 In [2]: from rc_auth_plugins.auth_duo_security import RhodeCodeAuthPlugin
49 # for Azure Entra
50 In [2]: from rc_auth_plugins.auth_azure import RhodeCodeAuthPlugin
49 # for OneLogin
51 # for OneLogin
50 In [2]: from rc_auth_plugins.auth_onelogin import RhodeCodeAuthPlugin
52 In [2]: from rc_auth_plugins.auth_onelogin import RhodeCodeAuthPlugin
51 # generic SAML plugin
53 # generic SAML plugin
52 In [2]: from rc_auth_plugins.auth_saml import RhodeCodeAuthPlugin
54 In [2]: from rc_auth_plugins.auth_saml import RhodeCodeAuthPlugin
53
55
54 3) Run the migration based on saml2user mapping.
56 3) Run the migration based on saml2user mapping.
55
57
56 Enter the following at the ishell prompt
58 Enter the following at the ishell prompt
57
59
58 .. code-block:: python
60 .. code-block:: python
59
61
60 In [3]: for user in User.get_all():
62 In [3]: for user in User.get_all():
61 ...: existing_identity = ExternalIdentity().query().filter(ExternalIdentity.local_user_id == user.user_id).scalar()
63 ...: existing_identity = ExternalIdentity().query().filter(ExternalIdentity.local_user_id == user.user_id).scalar()
62 ...: attrs = saml2user.get(user.user_id)
64 ...: attrs = saml2user.get(user.user_id)
63 ...: provider = RhodeCodeAuthPlugin.uid
65 ...: provider = RhodeCodeAuthPlugin.uid
64 ...: if existing_identity:
66 ...: if existing_identity:
65 ...: print('Identity for user `{}` already exists, skipping'.format(user.username))
67 ...: print(f'Identity for user `{user.username}` already exists, skipping')
66 ...: continue
68 ...: continue
67 ...: if attrs:
69 ...: if attrs:
68 ...: external_id = attrs['id']
70 ...: external_id = attrs['id']
69 ...: new_external_identity = ExternalIdentity()
71 ...: new_external_identity = ExternalIdentity()
70 ...: new_external_identity.external_id = external_id
72 ...: new_external_identity.external_id = external_id
71 ...: new_external_identity.external_username = '{}-saml-{}'.format(user.username, user.user_id)
73 ...: new_external_identity.external_username = f'{user.username}-saml-{user.user_id}'
72 ...: new_external_identity.provider_name = provider
74 ...: new_external_identity.provider_name = provider
73 ...: new_external_identity.local_user_id = user.user_id
75 ...: new_external_identity.local_user_id = user.user_id
74 ...: new_external_identity.access_token = ''
76 ...: new_external_identity.access_token = ''
75 ...: new_external_identity.token_secret = ''
77 ...: new_external_identity.token_secret = ''
76 ...: new_external_identity.alt_token = ''
78 ...: new_external_identity.alt_token = ''
77 ...: Session().add(new_external_identity)
79 ...: Session().add(new_external_identity)
78 ...: Session().commit()
80 ...: Session().commit()
79 ...: print('Set user `{}` external identity bound to ExternalID:{}'.format(user.username, external_id))
81 ...: print(f'Set user `{user.username}` external identity bound to ExternalID:{external_id}')
80
82
81 .. note::
83 .. note::
82
84
83 The saml2user mapping can be really big and hard to maintain in ishell. It's also possible
85 The saml2user mapping can be really big and hard to maintain in ishell. It's also possible
84 to load it from a JSON file prepared beforehand and stored on disk. To do so run::
86 to load it from a JSON file prepared beforehand and stored on disk. To do so run::
85
87
86 import json
88 import json
87 saml2user = json.loads(open('/path/to/saml2user.json','rb').read())
89 saml2user = json.loads(open('/path/to/saml2user.json','rb').read())
88
90
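Since JSON object keys are always strings, while the migration loop above looks the mapping up
by integer ``user_id``, it is worth converting the keys after loading. A minimal sketch (the
file path is only an example):

.. code-block:: python

    import json

    # example contents of /path/to/saml2user.json:
    # {"123": {"id": "48253211"}, "124": {"id": "email@domain.com"}}
    with open('/path/to/saml2user.json', 'rb') as f:
        raw = json.load(f)

    # JSON keys come back as strings; convert them to integer user_ids
    # so that saml2user.get(user.user_id) matches in the loop above
    saml2user = {int(user_id): attrs for user_id, attrs in raw.items()}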
@@ -1,105 +1,161 b''
1 .. _config-saml-duosecurity-ref:
1 .. _config-saml-duosecurity-ref:
2
2
3
3
4 SAML 2.0 with Duo Security
4 SAML 2.0 with Duo Security
5 --------------------------
5 --------------------------
6
6
7 **This plugin is available only in EE Edition.**
7 **This plugin is available only in EE Edition.**
8
8
9 |RCE| supports SAML 2.0 Authentication with Duo Security provider. This allows
9 |RCE| supports SAML 2.0 Authentication with Duo Security provider. This allows
10 users to log-in to RhodeCode via SSO mechanism of external identity provider
10 users to log-in to RhodeCode via SSO mechanism of external identity provider
11 such as Duo. The login can be triggered either by the external IDP, or internally
11 such as Duo. The login can be triggered either by the external IDP, or internally
12 by clicking specific authentication button on the log-in page.
12 by clicking specific authentication button on the log-in page.
13
13
14
14
15 Configuration steps
15 Configuration steps
16 ^^^^^^^^^^^^^^^^^^^
16 ^^^^^^^^^^^^^^^^^^^
17
17
18 To configure Duo Security SAML authentication, use the following steps:
18 To configure Duo Security SAML authentication, use the following steps:
19
19
20 1. From the |RCE| interface, select
20 1. From the |RCE| interface, select
21 :menuselection:`Admin --> Authentication`
21 :menuselection:`Admin --> Authentication`
22 2. Activate the `Duo Security` plugin and select :guilabel:`Save`
22 2. Activate the `Duo Security` plugin and select :guilabel:`Save`
23 3. Go to newly available menu option called `Duo Security` on the left side.
23 3. Go to newly available menu option called `Duo Security` on the left side.
24 4. Check the `enabled` check box in the plugin configuration section,
24 4. Check the `enabled` check box in the plugin configuration section,
25 and fill in the required SAML information and :guilabel:`Save`, for more details,
25 and fill in the required SAML information and :guilabel:`Save`, for more details,
26 see :ref:`config-saml-duosecurity`
26 see :ref:`config-saml-duosecurity`
27
27
28
28
29 .. _config-saml-duosecurity:
29 .. _config-saml-duosecurity:
30
30
31
31
32 Example SAML Duo Security configuration
32 Example SAML Duo Security configuration
33 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
33 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
34
34
35 Example configuration for SAML 2.0 with Duo Security provider::
35 Example configuration for SAML 2.0 with Duo Security provider
36
37
38 Enabled
39 `True`:
36
40
37 *option*: `enabled` => `True`
41 .. note::
38 # Enable or disable this authentication plugin.
42 Enable or disable this authentication plugin.
43
44
45 Auth Cache TTL
46 `30`:
39
47
40 *option*: `cache_ttl` => `0`
48 .. note::
41 # Amount of seconds to cache the authentication and permissions check response call for this plugin.
49 Amount of seconds to cache the authentication and permissions check response call for this plugin.
42 # Useful for expensive calls like LDAP to improve the performance of the system (0 means disabled).
50 Useful for expensive calls like LDAP to improve the performance of the system (0 means disabled).
51
52 Debug
53 `True`:
43
54
44 *option*: `debug` => `True`
55 .. note::
45 # Enable or disable debug mode that shows SAML errors in the RhodeCode logs.
56 Enable or disable debug mode that shows SAML errors in the RhodeCode logs.
57
58
59 Auth button name
60 `Duo Security`:
46
61
47 *option*: `entity_id` => `http://rc-app.com/dag/saml2/idp/metadata.php`
62 .. note::
48 # Identity Provider entity/metadata URI.
63 Alternative authentication display name. E.g. AzureAuth, CorporateID etc.
49 # E.g. https://duo-gateway.com/dag/saml2/idp/metadata.php
64
65
66 Entity ID
67 `https://my-duo-gateway.com/dag/saml2/idp/metadata.php`:
68
69 .. note::
70 Identity Provider entity/metadata URI.
71 E.g. https://duo-gateway.com/dag/saml2/idp/metadata.php
72
73 SSO URL
74 `https://duo-gateway.com/dag/saml2/idp/SSOService.php?spentityid=<metadata_entity_id>`:
50
75
51 *option*: `sso_service_url` => `http://rc-app.com/dag/saml2/idp/SSOService.php?spentityid=http://rc.local.pl/_admin/auth/duosecurity/saml-metadata`
76 .. note::
52 # SSO (SingleSignOn) endpoint URL of the IdP. This can be used to initialize login
77 SSO (SingleSignOn) endpoint URL of the IdP. This can be used to initialize the login. Also known as the Login URL.
53 # E.g. https://duo-gateway.com/dag/saml2/idp/SSOService.php?spentityid=<metadata_entity_id>
78 E.g. http://rc-app.com/dag/saml2/idp/SSOService.php?spentityid=https://docker-dev/_admin/auth/duosecurity/saml-metadata
79
80 SLO URL
81 `https://duo-gateway.com/dag/saml2/idp/SingleLogoutService.php?ReturnTo=<return_url>`:
54
82
55 *option*: `slo_service_url` => `http://rc-app.com/dag/saml2/idp/SingleLogoutService.php?ReturnTo=http://rc-app.com/dag/module.php/duosecurity/logout.php`
83 .. note::
56 # SLO (SingleLogout) endpoint URL of the IdP.
84 SLO (SingleLogout) endpoint URL of the IdP. Also known as the Logout URL.
57 # E.g. https://duo-gateway.com/dag/saml2/idp/SingleLogoutService.php?ReturnTo=http://duo-gateway.com/_admin/saml/sign-out-endpoint
85 E.g. http://rc-app.com/dag/saml2/idp/SingleLogoutService.php?ReturnTo=https://docker-dev/_admin/auth/duosecurity/saml-sign-out-endpoint
58
86
59 *option*: `x509cert` => `<CERTIFICATE_STRING>`
87 x509cert
60 # Identity provider public x509 certificate. It will be converted to single-line format without headers
88 `<CERTIFICATE_STRING>`:
61
89
62 *option*: `name_id_format` => `sha-1`
90 .. note::
63 # The format that specifies how the NameID is sent to the service provider.
91 Identity provider public x509 certificate. It will be converted to single-line format without headers.
92 Download the raw base64 encoded certificate from the Identity provider and paste it here.
93
94 SAML Signature
95 `sha-256`:
96
97 .. note::
98 Type of Algorithm to use for verification of SAML signature on Identity provider side.
99
100 SAML Digest
101 `sha-256`:
64
102
65 *option*: `signature_algo` => `sha-256`
103 .. note::
66 # Type of Algorithm to use for verification of SAML signature on Identity provider side
104 Type of Algorithm to use for verification of SAML digest on Identity provider side.
105
106 Service Provider Cert Dir
107 `/etc/rhodecode/conf/saml_ssl/`:
67
108
68 *option*: `digest_algo` => `sha-256`
109 .. note::
69 # Type of Algorithm to use for verification of SAML digest on Identity provider side
110 Optional directory to store service provider certificate and private keys.
111 Expected certs for the SP should be stored in this folder as:
112
113 * sp.key Private Key
114 * sp.crt Public cert
115 * sp_new.crt Future Public cert
116
117 Also you can use other cert to sign the metadata of the SP using the:
70
118
71 *option*: `cert_dir` => `/etc/saml/`
119 * metadata.key
72 # Optional directory to store service provider certificate and private keys.
120 * metadata.crt
73 # Expected certs for the SP should be stored in this folder as:
121
74 # * sp.key Private Key
122 Expected NameID Format
75 # * sp.crt Public cert
123 `nameid-format:emailAddress`:
76 # * sp_new.crt Future Public cert
124
77 #
125 .. note::
78 # Also you can use other cert to sign the metadata of the SP using the:
126 The format that specifies how the NameID is sent to the service provider.
79 # * metadata.key
127
80 # * metadata.crt
128 User ID Attribute
129 `PersonImmutableID`:
81
130
82 *option*: `user_id_attribute` => `PersonImmutableID`
131 .. note::
83 # User ID Attribute name. This defines which attribute in SAML response will be used to link accounts via unique id.
132 User ID Attribute name. This defines which attribute in SAML response will be used to link accounts via unique id.
84 # Ensure this is returned from DuoSecurity for example via duo_username
133 Ensure this is returned from DuoSecurity for example via duo_username.
134
135 Username Attribute
136 `User.username`:
85
137
86 *option*: `username_attribute` => `User.username`
138 .. note::
87 # Username Attribute name. This defines which attribute in SAML response will map to an username.
139 Username Attribute name. This defines which attribute in SAML response will map to a username.
88
140
89 *option*: `email_attribute` => `User.email`
141 Email Attribute
90 # Email Attribute name. This defines which attribute in SAML response will map to an email address.
142 `User.email`:
143
144 .. note::
145 Email Attribute name. This defines which attribute in SAML response will map to an email address.
146
91
147
92
148
93 Below is an example setup from the DUO Administration page that can be used with the above config.
149 Below is an example setup from the DUO Administration page that can be used with the above config.
94
150
95 .. image:: ../images/saml-duosecurity-service-provider-example.png
151 .. image:: ../images/saml-duosecurity-service-provider-example.png
96 :alt: DUO Security SAML setup example
152 :alt: DUO Security SAML setup example
97 :scale: 50 %
153 :scale: 50 %
98
154
99
155
100 Below is an example attribute mapping set for the IdP, required by the above config.
156 Below is an example attribute mapping set for the IdP, required by the above config.
101
157
102
158
103 .. image:: ../images/saml-duosecurity-attributes-example.png
159 .. image:: ../images/saml-duosecurity-attributes-example.png
104 :alt: DUO Security SAML setup example
160 :alt: DUO Security SAML setup example
105 :scale: 50 % No newline at end of file
161 :scale: 50 %
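The ``x509cert`` value above ends up stored as one long line; the plugin performs that
conversion itself, but if you want to prepare the value manually from a certificate file
downloaded from Duo, a minimal sketch (the file name is only an example) could be:

.. code-block:: python

    def pem_to_single_line(path):
        """Strip the BEGIN/END CERTIFICATE headers and newlines from a PEM file."""
        with open(path) as f:
            lines = f.read().splitlines()
        return ''.join(line.strip() for line in lines
                       if line.strip() and 'CERTIFICATE' not in line)

    print(pem_to_single_line('duo_idp_cert.pem'))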
@@ -1,19 +1,20 b''
1 .. _config-saml-generic-ref:
1 .. _config-saml-generic-ref:
2
2
3
3
4 SAML 2.0 Authentication
4 SAML 2.0 Authentication
5 -----------------------
5 -----------------------
6
6
7
7
8 **This plugin is available only in EE Edition.**
8 **This plugin is available only in EE Edition.**
9
9
10 RhodeCode supports standard SAML 2.0 SSO for the web-application part.
10 RhodeCode supports standard SAML 2.0 SSO for the web-application part.
11
11
12 Please check the following example providers for reference:
12 Please check the following example providers for reference:
13
13
14 .. toctree::
14 .. toctree::
15
15
16 auth-saml-duosecurity
16 auth-saml-duosecurity
17 auth-saml-onelogin
17 auth-saml-onelogin
18 auth-saml-azure
18 auth-saml-bulk-enroll-users
19 auth-saml-bulk-enroll-users
19
20
@@ -1,106 +1,161 b''
1 .. _config-saml-onelogin-ref:
1 .. _config-saml-onelogin-ref:
2
2
3
3
4 SAML 2.0 with One Login
4 SAML 2.0 with One Login
5 -----------------------
5 -----------------------
6
6
7 **This plugin is available only in EE Edition.**
7 **This plugin is available only in EE Edition.**
8
8
9 |RCE| supports SAML 2.0 Authentication with OneLogin provider. This allows
9 |RCE| supports SAML 2.0 Authentication with OneLogin provider. This allows
10 users to log-in to RhodeCode via SSO mechanism of external identity provider
10 users to log-in to RhodeCode via SSO mechanism of external identity provider
11 such as OneLogin. The login can be triggered either by the external IDP, or internally
11 such as OneLogin. The login can be triggered either by the external IDP, or internally
12 by clicking specific authentication button on the log-in page.
12 by clicking specific authentication button on the log-in page.
13
13
14
14
15 Configuration steps
15 Configuration steps
16 ^^^^^^^^^^^^^^^^^^^
16 ^^^^^^^^^^^^^^^^^^^
17
17
18 To configure OneLogin SAML authentication, use the following steps:
18 To configure OneLogin SAML authentication, use the following steps:
19
19
20 1. From the |RCE| interface, select
20 1. From the |RCE| interface, select
21 :menuselection:`Admin --> Authentication`
21 :menuselection:`Admin --> Authentication`
22 2. Activate the `OneLogin` plugin and select :guilabel:`Save`
22 2. Activate the `OneLogin` plugin and select :guilabel:`Save`
23 3. Go to newly available menu option called `OneLogin` on the left side.
23 3. Go to newly available menu option called `OneLogin` on the left side.
24 4. Check the `enabled` check box in the plugin configuration section,
24 4. Check the `enabled` check box in the plugin configuration section,
25 and fill in the required SAML information and :guilabel:`Save`, for more details,
25 and fill in the required SAML information and :guilabel:`Save`, for more details,
26 see :ref:`config-saml-onelogin`
26 see :ref:`config-saml-onelogin`
27
27
28
28
29 .. _config-saml-onelogin:
29 .. _config-saml-onelogin:
30
30
31
31
32 Example SAML OneLogin configuration
32 Example SAML OneLogin configuration
33 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
33 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
34
34
35 Example configuration for SAML 2.0 with OneLogin provider::
35 Example configuration for SAML 2.0 with OneLogin provider
36
37
38 Enabled
39 `True`:
36
40
37 *option*: `enabled` => `True`
41 .. note::
38 # Enable or disable this authentication plugin.
42 Enable or disable this authentication plugin.
43
44
45 Auth Cache TTL
46 `30`:
39
47
40 *option*: `cache_ttl` => `0`
48 .. note::
41 # Amount of seconds to cache the authentication and permissions check response call for this plugin.
49 Amount of seconds to cache the authentication and permissions check response call for this plugin.
42 # Useful for expensive calls like LDAP to improve the performance of the system (0 means disabled).
50 Useful for expensive calls like LDAP to improve the performance of the system (0 means disabled).
51
52 Debug
53 `True`:
43
54
44 *option*: `debug` => `True`
55 .. note::
45 # Enable or disable debug mode that shows SAML errors in the RhodeCode logs.
56 Enable or disable debug mode that shows SAML errors in the RhodeCode logs.
57
58
59 Auth button name
60 `OneLogin`:
46
61
47 *option*: `entity_id` => `https://app.onelogin.com/saml/metadata/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx`
62 .. note::
48 # Identity Provider entity/metadata URI.
63 Alternative authentication display name. E.g. AzureAuth, CorporateID etc.
49 # E.g. https://app.onelogin.com/saml/metadata/<onelogin_connector_id>
64
65
66 Entity ID
67 `https://app.onelogin.com/saml/metadata/<onelogin_connector_id>`:
68
69 .. note::
70 Identity Provider entity/metadata URI.
71 E.g. https://app.onelogin.com/saml/metadata/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
72
73 SSO URL
74 `https://app.onelogin.com/trust/saml2/http-post/sso/<onelogin_connector_id>`:
50
75
51 *option*: `sso_service_url` => `https://customer-domain.onelogin.com/trust/saml2/http-post/sso/xxxxxx`
76 .. note::
52 # SSO (SingleSignOn) endpoint URL of the IdP. This can be used to initialize login
77 SSO (SingleSignOn) endpoint URL of the IdP. This can be used to initialize the login. Also known as the Login URL.
53 # E.g. https://app.onelogin.com/trust/saml2/http-post/sso/<onelogin_connector_id>
78 E.g. https://app.onelogin.com/trust/saml2/http-post/sso/<onelogin_connector_id>
79
80 SLO URL
81 `https://app.onelogin.com/trust/saml2/http-redirect/slo/<onelogin_connector_id>`:
54
82
55 *option*: `slo_service_url` => `https://customer-domain.onelogin.com/trust/saml2/http-redirect/slo/xxxxxx`
83 .. note::
56 # SLO (SingleLogout) endpoint URL of the IdP.
84 SLO (SingleLogout) endpoint URL of the IdP. Also known as the Logout URL.
57 # E.g. https://app.onelogin.com/trust/saml2/http-redirect/slo/<onelogin_connector_id>
85 E.g. https://app.onelogin.com/trust/saml2/http-redirect/slo/<onelogin_connector_id>
58
86
59 *option*: `x509cert` => `<CERTIFICATE_STRING>`
87 x509cert
60 # Identity provider public x509 certificate. It will be converted to single-line format without headers
88 `<CERTIFICATE_STRING>`:
61
89
62 *option*: `name_id_format` => `sha-1`
90 .. note::
63 # The format that specifies how the NameID is sent to the service provider.
91 Identity provider public x509 certificate. It will be converted to single-line format without headers.
92 Download the raw base64 encoded certificate from the Identity provider and paste it here.
93
94 SAML Signature
95 `sha-256`:
96
97 .. note::
98 Type of Algorithm to use for verification of SAML signature on Identity provider side.
99
100 SAML Digest
101 `sha-256`:
64
102
65 *option*: `signature_algo` => `sha-256`
103 .. note::
66 # Type of Algorithm to use for verification of SAML signature on Identity provider side
104 Type of Algorithm to use for verification of SAML digest on Identity provider side.
105
106 Service Provider Cert Dir
107 `/etc/rhodecode/conf/saml_ssl/`:
67
108
68 *option*: `digest_algo` => `sha-256`
109 .. note::
69 # Type of Algorithm to use for verification of SAML digest on Identity provider side
110 Optional directory to store service provider certificate and private keys.
111 Expected certs for the SP should be stored in this folder as:
112
113 * sp.key Private Key
114 * sp.crt Public cert
115 * sp_new.crt Future Public cert
70
116
71 *option*: `cert_dir` => `/etc/saml/`
117 Also you can use other cert to sign the metadata of the SP using the:
72 # Optional directory to store service provider certificate and private keys.
118
73 # Expected certs for the SP should be stored in this folder as:
119 * metadata.key
74 # * sp.key Private Key
120 * metadata.crt
75 # * sp.crt Public cert
121
76 # * sp_new.crt Future Public cert
122 Expected NameID Format
77 #
123 `nameid-format:emailAddress`:
78 # Also you can use other cert to sign the metadata of the SP using the:
124
79 # * metadata.key
125 .. note::
80 # * metadata.crt
126 The format that specifies how the NameID is sent to the service provider.
127
128 User ID Attribute
129 `PersonImmutableID`:
81
130
82 *option*: `user_id_attribute` => `PersonImmutableID`
131 .. note::
83 # User ID Attribute name. This defines which attribute in SAML response will be used to link accounts via unique id.
132 User ID Attribute name. This defines which attribute in SAML response will be used to link accounts via unique id.
84 # Ensure this is returned from OneLogin for example via Internal ID
133 Ensure this is returned from OneLogin, for example via the Internal ID.
134
135 Username Attribute
136 `User.username`:
85
137
86 *option*: `username_attribute` => `User.username`
138 .. note::
87 # Username Attribute name. This defines which attribute in SAML response will map to an username.
139 Username Attribute name. This defines which attribute in SAML response will map to a username.
88
140
89 *option*: `email_attribute` => `User.email`
141 Email Attribute
90 # Email Attribute name. This defines which attribute in SAML response will map to an email address.
142 `User.email`:
143
144 .. note::
145 Email Attribute name. This defines which attribute in SAML response will map to an email address.
91
146
92
147
93
148
94 Below is an example setup for OneLogin SAML authentication that can be used with the above config.
149 Below is an example setup for OneLogin SAML authentication that can be used with the above config.
95
150
96 .. image:: ../images/saml-onelogin-config-example.png
151 .. image:: ../images/saml-onelogin-config-example.png
97 :alt: OneLogin SAML setup example
152 :alt: OneLogin SAML setup example
98 :scale: 50 %
153 :scale: 50 %
99
154
100
155
101 Below is an example attribute mapping set for the IdP, required by the above config.
156 Below is an example attribute mapping set for the IdP, required by the above config.
102
157
103
158
104 .. image:: ../images/saml-onelogin-attributes-example.png
159 .. image:: ../images/saml-onelogin-attributes-example.png
105 :alt: OneLogin SAML setup example
160 :alt: OneLogin SAML setup example
106 :scale: 50 % No newline at end of file
161 :scale: 50 %
@@ -1,34 +1,35 b''
1 .. _authentication-ref:
1 .. _authentication-ref:
2
2
3 Authentication Options
3 Authentication Options
4 ======================
4 ======================
5
5
6 |RCE| provides built-in authentication against its own database. This is
6 |RCE| provides built-in authentication against its own database. This is
7 implemented using ``RhodeCode Internal`` plugin. This plugin is enabled by default.
7 implemented using ``RhodeCode Internal`` plugin. This plugin is enabled by default.
8 Additionally, |RCE| provides a Pluggable Authentication System. This gives the
8 Additionally, |RCE| provides a Pluggable Authentication System. This gives the
9 administrator greater control over how users authenticate with the system.
9 administrator greater control over how users authenticate with the system.
10
10
11 .. important::
11 .. important::
12
12
13 You can disable the built-in |RCE| authentication plugin
13 You can disable the built-in |RCE| authentication plugin
14 ``RhodeCode Internal`` and force all authentication to go
14 ``RhodeCode Internal`` and force all authentication to go
15 through your authentication plugin of choice e.g LDAP only.
15 through your authentication plugin of choice e.g LDAP only.
16 However, if you do this and your external authentication tool fails,
16 However, if you do this and your external authentication tool fails,
17 accessing |RCE| will be blocked unless a fallback plugin is
17 accessing |RCE| will be blocked unless a fallback plugin is
18 enabled via :file:`rhodecode.ini`
18 enabled via :file:`rhodecode.ini`
19
19
20
20
21 |RCE| comes with the following user authentication management plugins:
21 |RCE| comes with the following user authentication management plugins:
22
22
23
23
24 .. toctree::
24 .. toctree::
25
25
26 auth-token
26 auth-token
27 auth-ldap
27 auth-ldap
28 auth-ldap-groups
28 auth-ldap-groups
29 auth-saml-generic
29 auth-saml-generic
30 auth-saml-onelogin
30 auth-saml-onelogin
31 auth-saml-duosecurity
31 auth-saml-duosecurity
32 auth-saml-azure
32 auth-crowd
33 auth-crowd
33 auth-pam
34 auth-pam
34 ssh-connection
35 ssh-connection
@@ -1,243 +1,14 b''
1 .. _dev-setup:
1 .. _dev-setup:
2
2
3 ===================
3 ===================
4 Development setup
4 Development setup
5 ===================
5 ===================
6
6
7
7 Please refer to the RCstack documentation for instructions on setting up a development environment:
8 RhodeCode Enterprise runs inside a Nix managed environment. This ensures build
8 https://docs.rhodecode.com/rcstack/dev/dev-setup.html
9 environment dependencies are correctly declared and installed during setup.
10 It also enables atomic upgrades, rollbacks, and multiple instances of RhodeCode
11 Enterprise running with isolation.
12
13 To set up RhodeCode Enterprise inside the Nix environment, use the following steps:
14
15
16
17 Setup Nix Package Manager
18 -------------------------
19
20 To install the Nix Package Manager, please run::
21
22 $ curl https://releases.nixos.org/nix/nix-2.3.4/install | sh
23
24 or go to https://nixos.org/nix/ and follow the installation instructions.
25 Once this is correctly set up on your system, you should be able to use the
26 following commands:
27
28 * `nix-env`
29
30 * `nix-shell`
31
32
33 .. tip::
34
35 Update your channels frequently by running ``nix-channel --update``.
36
37 .. note::
38
39 To uninstall nix run the following:
40
41 remove the . "$HOME/.nix-profile/etc/profile.d/nix.sh" line in your ~/.profile or ~/.bash_profile
42 rm -rf $HOME/{.nix-channels,.nix-defexpr,.nix-profile,.config/nixpkgs}
43 sudo rm -rf /nix
44
45 Switch nix to the latest STABLE channel
46 ---------------------------------------
47
48 run::
49
50 nix-channel --add https://nixos.org/channels/nixos-20.03 nixpkgs
51
52 Followed by::
53
54 nix-channel --update
55 nix-env -i nix-2.3.4
56
57
58 Install required binaries
59 -------------------------
60
61 We need some handy tools first.
62
63 run::
64
65 nix-env -i nix-prefetch-hg
66 nix-env -i nix-prefetch-git
67
68
69 Speed up JS build by installing PhantomJS
70 -----------------------------------------
71
72 PhantomJS will be downloaded each time nix-shell is invoked. To speed this by
73 setting already downloaded version do this::
74
75 nix-env -i phantomjs-2.1.1
76
77 # and set nix bin path
78 export PATH=$PATH:~/.nix-profile/bin
79
80
81 Clone the required repositories
82 -------------------------------
83
84 After Nix is set up, clone the RhodeCode Enterprise Community Edition and
85 RhodeCode VCSServer repositories into the same directory.
86 RhodeCode currently is using Mercurial Version Control System, please make sure
87 you have it installed before continuing.
88
89 To obtain the required sources, use the following commands::
90
91 mkdir rhodecode-develop && cd rhodecode-develop
92 hg clone -u default https://code.rhodecode.com/rhodecode-enterprise-ce
93 hg clone -u default https://code.rhodecode.com/rhodecode-vcsserver
94
95 .. note::
96
97 If you cannot clone the repository, please contact us via support@rhodecode.com
98
99
100 Install some required libraries
101 -------------------------------
102
103 There are some required drivers and dev libraries that we need to install to
104 test RhodeCode under different types of databases. For example in Ubuntu we
105 need to install the following.
106
107 required libraries::
108
109 # svn related
110 sudo apt-get install libapr1-dev libaprutil1-dev
111 sudo apt-get install libsvn-dev
112 # libcurl required too
113 sudo apt-get install libcurl4-openssl-dev
114 # mysql/pg server for development, optional
115 sudo apt-get install mysql-server libmysqlclient-dev
116 sudo apt-get install postgresql postgresql-contrib libpq-dev
117
118
119
120 Enter the Development Shell
121 ---------------------------
122
123 The final step is to start the development shells. To do this, run the
124 following command from inside the cloned repository::
125
126 # first, the vcsserver
127 cd ~/rhodecode-vcsserver
128 nix-shell
129
130 # then enterprise sources
131 cd ~/rhodecode-enterprise-ce
132 nix-shell
133
134 .. note::
135
136 On the first run, this will take a while to download and optionally compile
137 a few things. The following runs will be faster. The development shell works
138 fine on both MacOS and Linux platforms.
139
140
141 Create config.nix for development
142 ---------------------------------
143
144 In order to run proper tests and setup linking across projects, a config.nix
145 file needs to be setup::
146
147 # create config
148 mkdir -p ~/.nixpkgs
149 touch ~/.nixpkgs/config.nix
150
151 # put the below content into the ~/.nixpkgs/config.nix file
152 # adjusts, the path to where you cloned your repositories.
153
154 {
155 rc = {
156 sources = {
157 rhodecode-vcsserver = "/home/dev/rhodecode-vcsserver";
158 rhodecode-enterprise-ce = "/home/dev/rhodecode-enterprise-ce";
159 rhodecode-enterprise-ee = "/home/dev/rhodecode-enterprise-ee";
160 };
161 };
162 }
163
164
165
166 Creating a Development Configuration
167 ------------------------------------
168
169 To create a development environment for RhodeCode Enterprise,
170 use the following steps:
171
172 1. Create a copy of vcsserver config:
173 `cp ~/rhodecode-vcsserver/configs/development.ini ~/rhodecode-vcsserver/configs/dev.ini`
174 2. Create a copy of rhodecode config:
175 `cp ~/rhodecode-enterprise-ce/configs/development.ini ~/rhodecode-enterprise-ce/configs/dev.ini`
176 3. Adjust the configuration settings as needed (a sketch of typical adjustments follows below).
177
178 .. note::
179
180 It is recommended to use the name `dev.ini` since it is included in the .hgignore file.
181
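As a minimal sketch of step 3, assuming the default SQLite database and a locally running vcsserver, the options most commonly touched in `dev.ini` look roughly like this; the exact option names and the vcsserver port should be double-checked against the comments in the generated file::

    # database the development instance should use
    sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30

    # host:port where the vcsserver started from the other nix-shell listens
    vcs.server = localhost:9900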
182
183 Set up the Development Database
184 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
185
186 To create a development database, use the following example. This is a one-time
187 operation executed from the nix-shell of the rhodecode-enterprise-ce sources::
188
189 rc-setup-app dev.ini \
190 --user=admin --password=secret \
191 --email=admin@example.com \
192 --repos=~/my_dev_repos
193
194
195 Compile CSS and JavaScript
196 ^^^^^^^^^^^^^^^^^^^^^^^^^^
197
198 To use the application's frontend and prepare it for production deployment,
199 you will need to compile the CSS and JavaScript with Grunt.
200 This is easily done from within the nix-shell using the following command::
201
202 make web-build
203
204 When developing new features you will need to recompile after any
205 changes made to the CSS or JavaScript files::
206
207 grunt watch
208
209 This prepares the development versions of the files (with comments and whitespace preserved).
210
211 Start the Development Servers
212 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
213
214 From the rhodecode-vcsserver directory, start the development server in another
215 nix-shell, using the following command::
216
217 pserve configs/dev.ini
218
219 In the second nix-shell, the one you opened for the rhodecode-enterprise-ce sources, you may
220 now start CE with the following command::
221
222
223 pserve --reload configs/dev.ini
224
225 .. note::
226
227 The `--reload` flag will automatically reload the server when a source file changes.
228
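To confirm that both processes came up, you can probe the ports they bind; the placeholders below stand for whatever host/port values your two dev.ini files configure, which pserve also prints on startup::

    curl -I http://127.0.0.1:<ce-port-from-dev.ini>
    curl -I http://127.0.0.1:<vcsserver-port-from-dev.ini>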
229
230 Run the Environment Tests
231 ^^^^^^^^^^^^^^^^^^^^^^^^^
232
233 Please make sure that the tests are passing to verify that your environment is
234 set up correctly. RhodeCode uses py.test to run tests.
235 While your instance is running, start a new nix-shell and simply run
236 ``make test`` to run the basic test suite.
237
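If you only want to exercise part of the suite, py.test can also be invoked directly from the same nix-shell. The path and keyword below are only examples of the usual in-repository layout; adjust them to the area you are working on::

    # run only tests matching a keyword, stopping at the first failure
    py.test -x -k "your_keyword" rhodecode/tests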
238
239 Need Help?
240 ^^^^^^^^^^
241
242 Join us on Slack via https://rhodecode.com/join or post questions in our
243 Community Portal at https://community.rhodecode.com
@@ -1,92 +1,104 b''
1 |RCE|
1 |RCE|
2 =====
2 =====
3
3
4 |RCE| is a high-performance source code management and collaboration system.
4 |RCE| is a high-performance source code management and collaboration system.
5 It enables you to develop projects securely behind the firewall while
5 It enables you to develop projects securely behind the firewall while
6 providing collaboration tools that work with |git|, |hg|,
6 providing collaboration tools that work with |git|, |hg|,
7 and |svn| |repos|. The user interface allows you to create, edit,
7 and |svn| |repos|. The user interface allows you to create, edit,
8 and commit files and |repos| while managing their security permissions.
8 and commit files and |repos| while managing their security permissions.
9
9
10 |RCE| provides the following features:
10 |RCE| provides the following features:
11
11
12 * Source code management.
12 * Source code management.
13 * Extended permissions management.
13 * Extended permissions management.
14 * Integrated code collaboration tools.
14 * Integrated code collaboration tools.
15 * Integrated code review and notifications.
15 * Integrated code review and notifications.
16 * Scalability provided by multi-node setup.
16 * Scalability provided by multi-node setup.
17 * Fully programmable automation API.
17 * Fully programmable automation API.
18 * Web-based hook management.
18 * Web-based hook management.
19 * Native |svn| support.
19 * Native |svn| support.
20 * Migration from existing databases.
20 * Migration from existing databases.
21 * |RCE| SDK.
21 * |RCE| SDK.
22 * Built-in analytics
22 * Built-in analytics
23 * Built in integrations including: Slack, Webhooks (used for Jenkins/TeamCity and other CIs), Jira, Redmine, Hipchat
23 * Built in integrations including: Slack, Webhooks (used for Jenkins/TeamCity and other CIs), Jira, Redmine, Hipchat
24 * Pluggable authentication system.
24 * Pluggable authentication system.
25 * Support for AD, |LDAP|, Crowd, CAS, PAM.
25 * Support for AD, |LDAP|, Crowd, CAS, PAM.
26 * Support for external authentication via Oauth Google, Github, Bitbucket, Twitter.
26 * Support for external authentication via Oauth Google, Github, Bitbucket, Twitter.
27 * Debug modes of operation.
27 * Debug modes of operation.
28 * Private and public gists.
28 * Private and public gists.
29 * Gists with limited lifetimes and within instance only sharing.
29 * Gists with limited lifetimes and within instance only sharing.
30 * Fully integrated code search function.
30 * Fully integrated code search function.
31 * Always on SSL connectivity.
31 * Always on SSL connectivity.
32
32
33 .. only:: html
33 .. only:: html
34
34
35 Table of Contents
35 Table of Contents
36 -----------------
36 -----------------
37
37
38 .. toctree::
38 .. toctree::
39 :maxdepth: 1
39 :maxdepth: 1
40 :caption: Documentation directory
41
42 Back to documentation directory <https://docs.rhodecode.com/>
43
44 .. toctree::
45 :maxdepth: 1
46 :caption: RhodeCode RCstack Documentation
47
48 RhodeCode RCstack Installer <https://docs.rhodecode.com/rcstack/>
49
50 .. toctree::
51 :maxdepth: 1
40 :caption: Admin Documentation
52 :caption: Admin Documentation
41
53
42 install/quick-start
54 install/quick-start
43 install/install-database
55 install/install-database
44 install/install-steps
56 install/install-steps
45 admin/system-overview
57 admin/system-overview
46 admin/system-admin
58 admin/system-admin
47 admin/user-admin
59 admin/user-admin
48 admin/repo-admin
60 admin/repo-admin
49 admin/security-tips
61 admin/security-tips
50 auth/auth
62 auth/auth
51 issue-trackers/issue-trackers
63 issue-trackers/issue-trackers
52 admin/lab-settings
64 admin/lab-settings
53
65
54 .. toctree::
66 .. toctree::
55 :maxdepth: 1
67 :maxdepth: 1
56 :caption: Feature Documentation
68 :caption: Feature Documentation
57
69
58 collaboration/collaboration
70 collaboration/collaboration
59 collaboration/review-notifications
71 collaboration/review-notifications
60 collaboration/pull-requests
72 collaboration/pull-requests
61 code-review/code-review
73 code-review/code-review
62 integrations/integrations
74 integrations/integrations
63
75
64 .. toctree::
76 .. toctree::
65 :maxdepth: 1
77 :maxdepth: 1
66 :caption: User Documentation
78 :caption: User Documentation
67
79
68 usage/basic-usage
80 usage/basic-usage
69 tutorials/tutorials
81 tutorials/tutorials
70
82
71 .. toctree::
83 .. toctree::
72 :maxdepth: 1
84 :maxdepth: 1
73 :caption: Developer Documentation
85 :caption: Developer Documentation
74
86
75 api/api
87 api/api
76 tools/rhodecode-tools
88 tools/rhodecode-tools
77 extensions/extensions-hooks
89 extensions/extensions-hooks
78 contributing/contributing
90 contributing/contributing
79
91
80 .. toctree::
92 .. toctree::
81 :maxdepth: 2
93 :maxdepth: 2
82 :caption: RhodeCode rcstack Documentation
94 :caption: RhodeCode rcstack Documentation
83
95
84 RhodeCode Installer <https://docs.rhodecode.com/rcstack/>
96 RhodeCode Installer <https://docs.rhodecode.com/rcstack/>
85
97
86 .. toctree::
98 .. toctree::
87 :maxdepth: 1
99 :maxdepth: 1
88 :caption: About
100 :caption: About
89
101
90 release-notes/release-notes
102 release-notes/release-notes
91 known-issues/known-issues
103 known-issues/known-issues
92 admin/glossary
104 admin/glossary
@@ -1,92 +1,93 b''
1 .. _quick-start:
1 .. _quick-start:
2
2
3 Quick Start Installation Guide
3 Quick Start Installation Guide
4 ==============================
4 ==============================
5
5
6 .. important::
6 .. important::
7
7
8 These are quick start instructions. To optimize your |RCE|,
8 These are quick start instructions. To optimize your |RCE|,
9 |RCC|, and |RCT| usage, read the more detailed instructions in our guides.
9 |RCC|, and |RCT| usage, read the more detailed instructions in our guides.
10 For detailed installation instructions, see
10 For detailed installation instructions, see
11 :ref:`RhodeCode rcstack Documentation <rcstack:installation>`
11 :ref:`RhodeCode rcstack Documentation <rcstack:installation>`
12
12
13
13
14
14
15 To get |RCE| up and running, run through the below steps:
15 To get |RCE| up and running, run through the below steps:
16
16
17 1. Register to get the latest |RCC| installer instruction from `rhodecode.com/download`_.
17 1. Register to get the latest |RCC| installer instruction from `rhodecode.com/download`_.
18 If you don't have an account, sign up at `rhodecode.com/register`_.
18 If you don't have an account, sign up at `rhodecode.com/register`_.
19
19
20 2. Run the |RCS| installer and start init process.
20 2. Run the |RCS| installer and start init process.
21 following example:
21 following example:
22
22
23 .. code-block:: bash
23 .. code-block:: bash
24
24
25 mkdir docker-rhodecode && cd docker-rhodecode
25 mkdir docker-rhodecode && cd docker-rhodecode
26 curl -L -s -o rcstack https://dls.rhodecode.com/get-rcstack && chmod +x rcstack
26 curl -L -s -o rcstack https://dls.rhodecode.com/get-rcstack && chmod +x rcstack
27
27
28 ./rcstack init
28 ./rcstack init
29
29
30
30
31 .. important::
31 .. important::
32
32
33 We recommend running RhodeCode as a non-root user, such as `rhodecode`;
33 We recommend running RhodeCode as a non-root user, such as `rhodecode`;
34 this user must have a proper home directory and sudo permissions (to start Docker)
34 this user must have a proper home directory and sudo permissions (to start Docker)
35 Either log in as that user to install the software, or do it as root
35 Either log in as that user to install the software, or do it as root
36 with `sudo -i -u rhodecode ./rcstack init`
36 with `sudo -i -u rhodecode ./rcstack init`
37
37
38
38
39 3. Follow instructions on |RCS| documentation pages
39 3. Follow instructions on |RCS| documentation pages
40
40
41 :ref:`Quick install tutorial <rcstack:quick_installation>`
41 :ref:`Quick install tutorial <rcstack:quick_installation>`
42
42
43 4. Check stack status
43 4. Check stack status
44
44
45 .. code-block:: bash
45 .. code-block:: bash
46
46
47 ./rcstack status
47 ./rcstack status
48
48
49
49
50 Output should look similar to this:
50 Output should look similar to this:
51
51
52 .. code-block:: bash
52 .. code-block:: bash
53
53
54 ---
54 ---
55 CONTAINER ID IMAGE STATUS NAMES PORTS
55 CONTAINER ID IMAGE STATUS NAMES PORTS
56 ef54fc528e3a traefik:v2.9.5 Up 2 hours rc_cluster_router-traefik-1 0.0.0.0:80->80/tcp, :::80->80/tcp
56 ef54fc528e3a traefik:v2.9.5 Up 2 hours rc_cluster_router-traefik-1 0.0.0.0:80->80/tcp, :::80->80/tcp
57 f3ea0539e8b0 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-rhodecode-1 0.0.0.0:10020->10020/tcp, :::10020->10020/tcp
57 f3ea0539e8b0 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-rhodecode-1 0.0.0.0:10020->10020/tcp, :::10020->10020/tcp
58 2be52ba58ffe rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-vcsserver-1
58 2be52ba58ffe rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-vcsserver-1
59 7cd730ad3263 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-celery-1
59 7cd730ad3263 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-celery-1
60 dfa231342c87 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-celery-beat-1
60 dfa231342c87 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-celery-beat-1
61 d3d76ce2de96 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-sshd-1
61 d3d76ce2de96 rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-sshd-1
62 daaac329414b rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-svn-1
62 daaac329414b rhodecode/rhodecode-ee:4.28.0 Up 2 hours (healthy) rc_cluster_apps-svn-1
63 7b8504fb9acb nginx:1.23.2 Up 2 hours (healthy) rc_cluster_services-nginx-1 80/tcp
63 7b8504fb9acb nginx:1.23.2 Up 2 hours (healthy) rc_cluster_services-nginx-1 80/tcp
64 7279c25feb6b elasticsearch:6.8.23 Up 2 hours (healthy) rc_cluster_services-elasticsearch-1 9200/tcp, 9300/tcp
64 7279c25feb6b elasticsearch:6.8.23 Up 2 hours (healthy) rc_cluster_services-elasticsearch-1 9200/tcp, 9300/tcp
65 19fb93587493 redis:7.0.5 Up 2 hours (healthy) rc_cluster_services-redis-1 6379/tcp
65 19fb93587493 redis:7.0.5 Up 2 hours (healthy) rc_cluster_services-redis-1 6379/tcp
66 fb77fb6496c6 channelstream/channelstream:0.7.1 Up 2 hours (healthy) rc_cluster_services-channelstream-1 8000/tcp
66 fb77fb6496c6 channelstream/channelstream:0.7.1 Up 2 hours (healthy) rc_cluster_services-channelstream-1 8000/tcp
67 cb6c5c022f5b postgres:14.6 Up 2 hours (healthy) rc_cluster_services-database-1 5432/tcp
67 cb6c5c022f5b postgres:14.6 Up 2 hours (healthy) rc_cluster_services-database-1 5432/tcp
68
68
69
69 At this point you should be able to access:
70 At this point you should be able to access:
70
71
71 - RhodeCode instance at your domain entered, e.g http://rhodecode.local, the default access
72 - RhodeCode instance at your domain entered, e.g http://rhodecode.local, the default access
72 credentials are generated and stored inside .runtime.env.
73 credentials are generated and stored inside .runtime.env.
73 For example::
74 For example::
74
75
75 RHODECODE_USER_NAME=admin
76 RHODECODE_USER_NAME=admin
76 RHODECODE_USER_PASS=super-secret-password
77 RHODECODE_USER_PASS=super-secret-password
77
78
78
79
80
79 .. note::
81 .. note::
80
82
81 Recommended post quick start install instructions:
83 Recommended post quick start install instructions:
82
84
83 * Read the documentation
85 * Read the documentation
84 * Carry out the :ref:`rhodecode-post-install-ref`
86 * Carry out the :ref:`rhodecode-post-install-ref`
85 * Set up :ref:`indexing-ref`
87 * Set up :ref:`indexing-ref`
86 * Familiarise yourself with the :ref:`rhodecode-admin-ref` section.
88 * Familiarise yourself with the :ref:`rhodecode-admin-ref` section.
87
89
88 .. _rhodecode.com/download/: https://rhodecode.com/download/
89 .. _rhodecode.com: https://rhodecode.com/
90 .. _rhodecode.com: https://rhodecode.com/
90 .. _rhodecode.com/register: https://rhodecode.com/register/
91 .. _rhodecode.com/register: https://rhodecode.com/register/
91 .. _rhodecode.com/download: https://rhodecode.com/download/
92 .. _rhodecode.com/download: https://rhodecode.com/download/
92
93
@@ -1,32 +1,21 b''
1 .. _install-sqlite-database:
1 .. _install-sqlite-database:
2
2
3 SQLite
3 SQLite (Deprecated)
4 ------
4 -------------------
5
5
6 .. important::
6 .. important::
7
7
8 We do not recommend using SQLite in a large development environment
8 As of 5.x, SQLite is no longer supported, we advise to migrate to MySQL or PostgreSQL.
9 as it has an internal locking mechanism which can become a performance
10 bottleneck when there are more than 5 concurrent users.
11
9
12 |RCE| installs SQLite as the default database if you do not specify another
13 during installation. SQLite is suitable for small teams,
14 projects with a low load, and evaluation purposes since it is built into
15 |RCE| and does not require any additional database server.
16
17 Using MySQL or PostgreSQL in an large setup gives you much greater
18 performance, and while migration tools exist to move from one database type
19 to another, it is better to get it right first time and to immediately use
20 MySQL or PostgreSQL when you deploy |RCE| in a production environment.
21
10
22 Migrating From SQLite to PostgreSQL
11 Migrating From SQLite to PostgreSQL
23 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
24
13
25 If you started working with SQLite and now need to migrate your database
14 If you started working with SQLite and now need to migrate your database
26 to PostgreSQL, you can contact support@rhodecode.com for some help. We have a
15 to PostgreSQL, you can contact support@rhodecode.com for some help. We have a
27 set of scripts that enable SQLite to PostgreSQL migration. These scripts have
16 set of scripts that enable SQLite to PostgreSQL migration. These scripts have
28 been tested, and work with PostgreSQL 9.1+.
17 been tested, and work with PostgreSQL 9.1+.
29
18
30 .. note::
19 .. note::
31
20
32 There are no SQLite to MySQL or MariaDB scripts available.
21 There are no SQLite to MySQL or MariaDB scripts available.
@@ -1,95 +1,117 b''
1 .. _known-issues:
1 .. _known-issues:
2
2
3 Known Issues
3 Known Issues
4 ============
4 ============
5
5
6 Windows Upload
6 Windows Upload
7 --------------
7 --------------
8
8
9 There can be an issue with uploading files from web interface on Windows,
9 There can be an issue with uploading files from web interface on Windows,
10 and afterwards users cannot properly clone or synchronize with the repository.
10 and afterwards users cannot properly clone or synchronize with the repository.
11
11
12 Early testing shows that often uploading files via HTML forms on Windows
12 Early testing shows that often uploading files via HTML forms on Windows
13 includes the full path of the file being uploaded and not the name of the file.
13 includes the full path of the file being uploaded and not the name of the file.
14
14
15 Old Format of Git Repositories
15 Old Format of Git Repositories
16 ------------------------------
16 ------------------------------
17
17
18 There is an issue when trying to import old |git| format |repos| into recent
18 There is an issue when trying to import old |git| format |repos| into recent
19 versions of |RCE|. This issue can occur when importing from external |git|
19 versions of |RCE|. This issue can occur when importing from external |git|
20 repositories or from older versions of |RCE| (<=2.2.7).
20 repositories or from older versions of |RCE| (<=2.2.7).
21
21
22 To convert the old version into a current version, clone the old
22 To convert the old version into a current version, clone the old
23 |repo| into a local machine using a recent |git| client, then push it to a new
23 |repo| into a local machine using a recent |git| client, then push it to a new
24 |repo| inside |RCE|.
24 |repo| inside |RCE|.
25
25
26
26
27 VCS Server Memory Consumption
27 VCS Server Memory Consumption
28 -----------------------------
28 -----------------------------
29
29
30 The VCS Server cache grows without limits if not configured correctly. This
30 The VCS Server cache grows without limits if not configured correctly. This
31 applies to |RCE| versions prior to the 3.3.2 releases, as 3.3.2
31 applies to |RCE| versions prior to the 3.3.2 releases, as 3.3.2
32 shipped with the optimal configuration as default. See the
32 shipped with the optimal configuration as default. See the
33 :ref:`vcs-server-maintain` section for details.
33 :ref:`vcs-server-maintain` section for details.
34
34
35 To fix this issue, upgrade to |RCE| 3.3.2 or greater, and if you discover
35 To fix this issue, upgrade to |RCE| 3.3.2 or greater, and if you discover
36 memory consumption issues check the VCS Server settings.
36 memory consumption issues check the VCS Server settings.
37
37
38 Newer Operating system locales
38 Newer Operating system locales
39 ------------------------------
39 ------------------------------
40
40
41 |RCC| has a know problem with locales, due to changes in glibc 2.27+ which affects
41 |RCC| has a know problem with locales, due to changes in glibc 2.27+ which affects
42 the local-archive format, which is now incompatible with our used glibc 2.26.
42 the local-archive format, which is now incompatible with our used glibc 2.26.
43
43
44 Mostly affected are:
44 Mostly affected are:
45
45 - Fedora 23+
46 - Fedora 23+
46 - Ubuntu 18.04
47 - Ubuntu 18.04
47 - CentOS / RHEL 8
48 - CentOS / RHEL 8
48
49
49 To work around this problem, you need set path to ``$LOCAL_ARCHIVE`` to the
50 To work around this problem, you need set path to ``$LOCAL_ARCHIVE`` to the
50 locale package in older pre glibc 2.27 format, or set `LC_ALL=C` in your enviroment.
51 locale package in older pre glibc 2.27 format, or set `LC_ALL=C` in your enviroment.
51
52
52 To use the pre 2.27 locale-archive fix follow these steps:
53 To use the pre 2.27 locale-archive fix follow these steps:
53
54
54 1. Download the pre 2.27 locale-archive package
55 1. Download the pre 2.27 locale-archive package
55
56
56 .. code-block:: bash
57 .. code-block:: bash
57
58
58 wget https://dls.rhodecode.com/assets/locale-archive
59 wget https://dls.rhodecode.com/assets/locale-archive
59
60
60
61
61 2. Point ``$LOCAL_ARCHIVE`` to the locale package.
62 2. Point ``$LOCAL_ARCHIVE`` to the locale package.
62
63
63 .. code-block:: bash
64 .. code-block:: bash
64
65
65 $ export LOCALE_ARCHIVE=/home/USER/locale-archive # change to your path
66 $ export LOCALE_ARCHIVE=/home/USER/locale-archive # change to your path
66
67
67 This should be added *both* in `enviroment` variable of `~/.rccontrol/supervisor/supervisord.ini`
68 This should be added *both* in `enviroment` variable of `~/.rccontrol/supervisor/supervisord.ini`
68 e.g
69 e.g
69
70
70 ```
71 ```
71 [supervisord]
72 [supervisord]
72 environment = HOME=/home/user/rhodecode,LOCALE_ARCHIVE=/YOUR-PATH/locale-archive`
73 environment = HOME=/home/user/rhodecode,LOCALE_ARCHIVE=/YOUR-PATH/locale-archive`
73 ```
74 ```
74
75
75 and in user .bashrc/.zshrc etc, or via a startup script that
76 and in user .bashrc/.zshrc etc, or via a startup script that
76 runs `rccontrol self-init`
77 runs `rccontrol self-init`
77
78
78 If you happen to be running |RCC| from systemd, use the following
79 If you happen to be running |RCC| from systemd, use the following
79 example to pass the correct locale information on boot.
80 example to pass the correct locale information on boot.
80
81
81 .. code-block:: ini
82 .. code-block:: ini
82
83
83 [Unit]
84 [Unit]
84 Description=Rhodecode
85 Description=Rhodecode
85 After=network.target
86 After=network.target
86
87
87 [Service]
88 [Service]
88 Type=forking
89 Type=forking
89 User=scm
90 User=scm
90 Environment="LOCALE_ARCHIVE=/YOUR-PATH/locale-archive"
91 Environment="LOCALE_ARCHIVE=/YOUR-PATH/locale-archive"
91 ExecStart=/YOUR-PATH/.rccontrol-profile/bin/rccontrol-self-init
92 ExecStart=/YOUR-PATH/.rccontrol-profile/bin/rccontrol-self-init
92
93
93 [Install]
94 [Install]
94 WantedBy=multi-user.target
95 WantedBy=multi-user.target
95
96
97
98 Merge gets stuck in "merging" status
99 --------------------------------------
100
101 Similar issues:
102
103 - Pull Request is duplicated and/or gets stuck in "creating" status.
104
105 Mostly affected are:
106
107 - Kubernetes AWS EKS setup with NFS as shared storage
108 - AWS EFS as shared storage
109
110 Workaround:
111
112 1. Manually clear the repo cache via the UI:
113 :menuselection:`Repository Settings --> Caches --> Invalidate repository cache`
114
115 2. Open the problematic PR and reset its status to "created".
116
117 Now you can merge the PR normally.
@@ -1,59 +1,58 b''
1 |RCE| 5.1.0 |RNS|
1 |RCE| 5.1.0 |RNS|
2 -----------------
2 -----------------
3
3
4 Release Date
4 Release Date
5 ^^^^^^^^^^^^
5 ^^^^^^^^^^^^
6
6
7 - 2024-07-18
7 - 2024-07-18
8
8
9
9
10 New Features
10 New Features
11 ^^^^^^^^^^^^
11 ^^^^^^^^^^^^
12
12
13 - We've introduced 2FA for users. Now alongside the external auth 2fa support RhodeCode allows to enable 2FA for users
13 - We've introduced 2FA for users. Now alongside the external auth 2FA support RhodeCode allows to enable 2FA for users.
14 2FA options will be available for each user individually, or enforced via authentication plugins like ldap, or internal.
14 2FA options will be available for each user individually, or enforced via authentication plugins like ldap, or internal.
15 - Email based log-in. RhodeCode now allows to log-in using email as well as username for main authentication type.
15 - Email based log-in. RhodeCode now allows to log-in using email as well as username for main authentication type.
16 - Ability to replace a file using web UI. Now one can replace an existing file from the web-ui.
16 - Ability to replace a file using web UI. Now one can replace an existing file from the web-ui.
17 - GIT LFS Sync automation. Remote push/pull commands now can also sync GIT LFS objects.
17 - GIT LFS Sync automation. Remote push/pull commands now can also sync GIT LFS objects.
18 - Added ability to remove or close branches from the web ui
18 - Added ability to remove or close branches from the web ui.
19 - Added ability to delete a branch automatically after merging PR for git repositories
19 - Added ability to delete a branch automatically after merging PR for git repositories.
20 - Added support for S3 based archive_cache based that allows storing cached archives in S3 compatible object store.
20 - Added support for S3 based archive_cache that allows storing cached archives in S3 compatible object store.
21
21
22
22
23 General
23 General
24 ^^^^^^^
24 ^^^^^^^
25
25
26 - Upgraded all dependency libraries to their latest available versions
26 - Upgraded all dependency libraries to their latest available versions.
27 - Repository storage is no longer controlled via DB settings, but .ini file. This allows easier automated deployments.
27 - Repository storage is no longer controlled via DB settings, but .ini file. This allows easier automated deployments.
28 - Bumped mercurial to 6.7.4
28 - Bumped mercurial to 6.7.4
29 - Mercurial: enable httppostarguments for better support of large repositories with lots of heads.
29 - Mercurial: enable httppostarguments for better support of large repositories with lots of heads.
30 - Added explicit db-migrate step to update hooks for 5.X release.
30 - Added explicit db-migrate step to update hooks for 5.X release.
31
31
32
32
33 Security
33 Security
34 ^^^^^^^^
34 ^^^^^^^^
35
35
36
36
37
37
38 Performance
38 Performance
39 ^^^^^^^^^^^
39 ^^^^^^^^^^^
40
40
41 - Introduced a full rewrite of ssh backend for performance. The result is 2-5x speed improvement for operation with ssh.
41 - Introduced a full rewrite of ssh backend for performance. The result is 2-5x speed improvement for operation with ssh.
42 enable new ssh wrapper by setting: `ssh.wrapper_cmd = /home/rhodecode/venv/bin/rc-ssh-wrapper-v2`
42 Enable new ssh wrapper by setting: `ssh.wrapper_cmd = /home/rhodecode/venv/bin/rc-ssh-wrapper-v2`
43 - Introduced a new hooks subsystem that is more scalable and faster, enable it by settings: `vcs.hooks.protocol = celery`
43 - Introduced a new hooks subsystem that is more scalable and faster, enable it by setting: `vcs.hooks.protocol = celery`
44
44
45
45
46 Fixes
46 Fixes
47 ^^^^^
47 ^^^^^
48
48
49 - Archives: Zip archive download breaks when a gitmodules file is present
49 - Archives: Zip archive download breaks when a gitmodules file is present.
50 - Branch permissions: fixed bug preventing to specify own rules from 4.X install
50 - Branch permissions: fixed bug preventing to specify own rules from 4.X install.
51 - SVN: refactored svn events, thus fixing support for it in dockerized env
51 - SVN: refactored svn events, thus fixing support for it in dockerized environment.
52 - Fixed empty server url in PR link after push from cli
52 - Fixed empty server url in PR link after push from cli.
53
53
54
54
55 Upgrade notes
55 Upgrade notes
56 ^^^^^^^^^^^^^
56 ^^^^^^^^^^^^^
57
57
58 - RhodeCode 5.1.0 is a mayor feature release after big 5.0.0 python3 migration. Happy to ship a first time feature
58 - RhodeCode 5.1.0 is a major feature release after big 5.0.0 python3 migration. Happy to ship a first time feature-rich release.
59 rich release
@@ -1,173 +1,175 b''
1 .. _rhodecode-release-notes-ref:
1 .. _rhodecode-release-notes-ref:
2
2
3 Release Notes
3 Release Notes
4 =============
4 =============
5
5
6 |RCE| 5.x Versions
6 |RCE| 5.x Versions
7 ------------------
7 ------------------
8
8
9 .. toctree::
9 .. toctree::
10 :maxdepth: 1
10 :maxdepth: 1
11
11
12
12 release-notes-5.2.0.rst
13 release-notes-5.1.2.rst
14 release-notes-5.1.1.rst
13 release-notes-5.1.0.rst
15 release-notes-5.1.0.rst
14 release-notes-5.0.3.rst
16 release-notes-5.0.3.rst
15 release-notes-5.0.2.rst
17 release-notes-5.0.2.rst
16 release-notes-5.0.1.rst
18 release-notes-5.0.1.rst
17 release-notes-5.0.0.rst
19 release-notes-5.0.0.rst
18
20
19
21
20 |RCE| 4.x Versions
22 |RCE| 4.x Versions
21 ------------------
23 ------------------
22
24
23 .. toctree::
25 .. toctree::
24 :maxdepth: 1
26 :maxdepth: 1
25
27
26 release-notes-4.27.1.rst
28 release-notes-4.27.1.rst
27 release-notes-4.27.0.rst
29 release-notes-4.27.0.rst
28 release-notes-4.26.0.rst
30 release-notes-4.26.0.rst
29 release-notes-4.25.2.rst
31 release-notes-4.25.2.rst
30 release-notes-4.25.1.rst
32 release-notes-4.25.1.rst
31 release-notes-4.25.0.rst
33 release-notes-4.25.0.rst
32 release-notes-4.24.1.rst
34 release-notes-4.24.1.rst
33 release-notes-4.24.0.rst
35 release-notes-4.24.0.rst
34 release-notes-4.23.2.rst
36 release-notes-4.23.2.rst
35 release-notes-4.23.1.rst
37 release-notes-4.23.1.rst
36 release-notes-4.23.0.rst
38 release-notes-4.23.0.rst
37 release-notes-4.22.0.rst
39 release-notes-4.22.0.rst
38 release-notes-4.21.0.rst
40 release-notes-4.21.0.rst
39 release-notes-4.20.1.rst
41 release-notes-4.20.1.rst
40 release-notes-4.20.0.rst
42 release-notes-4.20.0.rst
41 release-notes-4.19.3.rst
43 release-notes-4.19.3.rst
42 release-notes-4.19.2.rst
44 release-notes-4.19.2.rst
43 release-notes-4.19.1.rst
45 release-notes-4.19.1.rst
44 release-notes-4.19.0.rst
46 release-notes-4.19.0.rst
45 release-notes-4.18.3.rst
47 release-notes-4.18.3.rst
46 release-notes-4.18.2.rst
48 release-notes-4.18.2.rst
47 release-notes-4.18.1.rst
49 release-notes-4.18.1.rst
48 release-notes-4.18.0.rst
50 release-notes-4.18.0.rst
49 release-notes-4.17.4.rst
51 release-notes-4.17.4.rst
50 release-notes-4.17.3.rst
52 release-notes-4.17.3.rst
51 release-notes-4.17.2.rst
53 release-notes-4.17.2.rst
52 release-notes-4.17.1.rst
54 release-notes-4.17.1.rst
53 release-notes-4.17.0.rst
55 release-notes-4.17.0.rst
54 release-notes-4.16.2.rst
56 release-notes-4.16.2.rst
55 release-notes-4.16.1.rst
57 release-notes-4.16.1.rst
56 release-notes-4.16.0.rst
58 release-notes-4.16.0.rst
57 release-notes-4.15.2.rst
59 release-notes-4.15.2.rst
58 release-notes-4.15.1.rst
60 release-notes-4.15.1.rst
59 release-notes-4.15.0.rst
61 release-notes-4.15.0.rst
60 release-notes-4.14.1.rst
62 release-notes-4.14.1.rst
61 release-notes-4.14.0.rst
63 release-notes-4.14.0.rst
62 release-notes-4.13.3.rst
64 release-notes-4.13.3.rst
63 release-notes-4.13.2.rst
65 release-notes-4.13.2.rst
64 release-notes-4.13.1.rst
66 release-notes-4.13.1.rst
65 release-notes-4.13.0.rst
67 release-notes-4.13.0.rst
66 release-notes-4.12.4.rst
68 release-notes-4.12.4.rst
67 release-notes-4.12.3.rst
69 release-notes-4.12.3.rst
68 release-notes-4.12.2.rst
70 release-notes-4.12.2.rst
69 release-notes-4.12.1.rst
71 release-notes-4.12.1.rst
70 release-notes-4.12.0.rst
72 release-notes-4.12.0.rst
71 release-notes-4.11.6.rst
73 release-notes-4.11.6.rst
72 release-notes-4.11.5.rst
74 release-notes-4.11.5.rst
73 release-notes-4.11.4.rst
75 release-notes-4.11.4.rst
74 release-notes-4.11.3.rst
76 release-notes-4.11.3.rst
75 release-notes-4.11.2.rst
77 release-notes-4.11.2.rst
76 release-notes-4.11.1.rst
78 release-notes-4.11.1.rst
77 release-notes-4.11.0.rst
79 release-notes-4.11.0.rst
78 release-notes-4.10.6.rst
80 release-notes-4.10.6.rst
79 release-notes-4.10.5.rst
81 release-notes-4.10.5.rst
80 release-notes-4.10.4.rst
82 release-notes-4.10.4.rst
81 release-notes-4.10.3.rst
83 release-notes-4.10.3.rst
82 release-notes-4.10.2.rst
84 release-notes-4.10.2.rst
83 release-notes-4.10.1.rst
85 release-notes-4.10.1.rst
84 release-notes-4.10.0.rst
86 release-notes-4.10.0.rst
85 release-notes-4.9.1.rst
87 release-notes-4.9.1.rst
86 release-notes-4.9.0.rst
88 release-notes-4.9.0.rst
87 release-notes-4.8.0.rst
89 release-notes-4.8.0.rst
88 release-notes-4.7.2.rst
90 release-notes-4.7.2.rst
89 release-notes-4.7.1.rst
91 release-notes-4.7.1.rst
90 release-notes-4.7.0.rst
92 release-notes-4.7.0.rst
91 release-notes-4.6.1.rst
93 release-notes-4.6.1.rst
92 release-notes-4.6.0.rst
94 release-notes-4.6.0.rst
93 release-notes-4.5.2.rst
95 release-notes-4.5.2.rst
94 release-notes-4.5.1.rst
96 release-notes-4.5.1.rst
95 release-notes-4.5.0.rst
97 release-notes-4.5.0.rst
96 release-notes-4.4.2.rst
98 release-notes-4.4.2.rst
97 release-notes-4.4.1.rst
99 release-notes-4.4.1.rst
98 release-notes-4.4.0.rst
100 release-notes-4.4.0.rst
99 release-notes-4.3.1.rst
101 release-notes-4.3.1.rst
100 release-notes-4.3.0.rst
102 release-notes-4.3.0.rst
101 release-notes-4.2.1.rst
103 release-notes-4.2.1.rst
102 release-notes-4.2.0.rst
104 release-notes-4.2.0.rst
103 release-notes-4.1.2.rst
105 release-notes-4.1.2.rst
104 release-notes-4.1.1.rst
106 release-notes-4.1.1.rst
105 release-notes-4.1.0.rst
107 release-notes-4.1.0.rst
106 release-notes-4.0.1.rst
108 release-notes-4.0.1.rst
107 release-notes-4.0.0.rst
109 release-notes-4.0.0.rst
108
110
109 |RCE| 3.x Versions
111 |RCE| 3.x Versions
110 ------------------
112 ------------------
111
113
112 .. toctree::
114 .. toctree::
113 :maxdepth: 1
115 :maxdepth: 1
114
116
115 release-notes-3.8.4.rst
117 release-notes-3.8.4.rst
116 release-notes-3.8.3.rst
118 release-notes-3.8.3.rst
117 release-notes-3.8.2.rst
119 release-notes-3.8.2.rst
118 release-notes-3.8.1.rst
120 release-notes-3.8.1.rst
119 release-notes-3.8.0.rst
121 release-notes-3.8.0.rst
120 release-notes-3.7.1.rst
122 release-notes-3.7.1.rst
121 release-notes-3.7.0.rst
123 release-notes-3.7.0.rst
122 release-notes-3.6.1.rst
124 release-notes-3.6.1.rst
123 release-notes-3.6.0.rst
125 release-notes-3.6.0.rst
124 release-notes-3.5.2.rst
126 release-notes-3.5.2.rst
125 release-notes-3.5.1.rst
127 release-notes-3.5.1.rst
126 release-notes-3.5.0.rst
128 release-notes-3.5.0.rst
127 release-notes-3.4.1.rst
129 release-notes-3.4.1.rst
128 release-notes-3.4.0.rst
130 release-notes-3.4.0.rst
129 release-notes-3.3.4.rst
131 release-notes-3.3.4.rst
130 release-notes-3.3.3.rst
132 release-notes-3.3.3.rst
131 release-notes-3.3.2.rst
133 release-notes-3.3.2.rst
132 release-notes-3.3.1.rst
134 release-notes-3.3.1.rst
133 release-notes-3.3.0.rst
135 release-notes-3.3.0.rst
134 release-notes-3.2.3.rst
136 release-notes-3.2.3.rst
135 release-notes-3.2.2.rst
137 release-notes-3.2.2.rst
136 release-notes-3.2.1.rst
138 release-notes-3.2.1.rst
137 release-notes-3.2.0.rst
139 release-notes-3.2.0.rst
138 release-notes-3.1.1.rst
140 release-notes-3.1.1.rst
139 release-notes-3.1.0.rst
141 release-notes-3.1.0.rst
140 release-notes-3.0.2.rst
142 release-notes-3.0.2.rst
141 release-notes-3.0.1.rst
143 release-notes-3.0.1.rst
142 release-notes-3.0.0.rst
144 release-notes-3.0.0.rst
143
145
144 |RCE| 2.x Versions
146 |RCE| 2.x Versions
145 ------------------
147 ------------------
146
148
147 .. toctree::
149 .. toctree::
148 :maxdepth: 1
150 :maxdepth: 1
149
151
150 release-notes-2.2.8.rst
152 release-notes-2.2.8.rst
151 release-notes-2.2.7.rst
153 release-notes-2.2.7.rst
152 release-notes-2.2.6.rst
154 release-notes-2.2.6.rst
153 release-notes-2.2.5.rst
155 release-notes-2.2.5.rst
154 release-notes-2.2.4.rst
156 release-notes-2.2.4.rst
155 release-notes-2.2.3.rst
157 release-notes-2.2.3.rst
156 release-notes-2.2.2.rst
158 release-notes-2.2.2.rst
157 release-notes-2.2.1.rst
159 release-notes-2.2.1.rst
158 release-notes-2.2.0.rst
160 release-notes-2.2.0.rst
159 release-notes-2.1.0.rst
161 release-notes-2.1.0.rst
160 release-notes-2.0.2.rst
162 release-notes-2.0.2.rst
161 release-notes-2.0.1.rst
163 release-notes-2.0.1.rst
162 release-notes-2.0.0.rst
164 release-notes-2.0.0.rst
163
165
164 |RCE| 1.x Versions
166 |RCE| 1.x Versions
165 ------------------
167 ------------------
166
168
167 .. toctree::
169 .. toctree::
168 :maxdepth: 1
170 :maxdepth: 1
169
171
170 release-notes-1.7.2.rst
172 release-notes-1.7.2.rst
171 release-notes-1.7.1.rst
173 release-notes-1.7.1.rst
172 release-notes-1.7.0.rst
174 release-notes-1.7.0.rst
173 release-notes-1.6.0.rst
175 release-notes-1.6.0.rst
@@ -1,11 +1,11 b''
1 sphinx==7.2.6
1 sphinx==7.2.6
2
2
3 furo==2023.9.10
3 furo==2023.9.10
4 sphinx-press-theme==0.8.0
4 sphinx-press-theme==0.8.0
5 sphinx-rtd-theme==1.3.0
5 sphinx-rtd-theme==1.3.0
6
6
7 pygments==2.16.1
7 pygments==2.18.0
8
8
9 docutils<0.19
9 docutils<0.19
10 markupsafe==2.1.3
10 markupsafe==2.1.3
11 jinja2==3.1.2
11 jinja2==3.1.2
@@ -1,313 +1,299 b''
1 # deps, generated via pipdeptree --exclude setuptools,wheel,pipdeptree,pip -f | tr '[:upper:]' '[:lower:]'
1 # deps, generated via pipdeptree --exclude setuptools,wheel,pipdeptree,pip -f | tr '[:upper:]' '[:lower:]'
2
2
3 alembic==1.13.1
3 alembic==1.13.1
4 mako==1.2.4
4 mako==1.2.4
5 markupsafe==2.1.2
5 markupsafe==2.1.2
6 sqlalchemy==1.4.52
6 sqlalchemy==1.4.52
7 greenlet==3.0.3
7 greenlet==3.0.3
8 typing_extensions==4.9.0
8 typing_extensions==4.12.2
9 async-timeout==4.0.3
9 async-timeout==4.0.3
10 babel==2.12.1
10 babel==2.12.1
11 beaker==1.12.1
11 beaker==1.12.1
12 celery==5.3.6
12 celery==5.3.6
13 billiard==4.2.0
13 billiard==4.2.0
14 click==8.1.3
14 click==8.1.3
15 click-didyoumean==0.3.0
15 click-didyoumean==0.3.0
16 click==8.1.3
16 click==8.1.3
17 click-plugins==1.1.1
17 click-plugins==1.1.1
18 click==8.1.3
18 click==8.1.3
19 click-repl==0.2.0
19 click-repl==0.2.0
20 click==8.1.3
20 click==8.1.3
21 prompt-toolkit==3.0.38
21 prompt_toolkit==3.0.47
22 wcwidth==0.2.6
22 wcwidth==0.2.13
23 six==1.16.0
23 six==1.16.0
24 kombu==5.3.5
24 kombu==5.3.5
25 amqp==5.2.0
25 amqp==5.2.0
26 vine==5.1.0
26 vine==5.1.0
27 vine==5.1.0
27 vine==5.1.0
28 python-dateutil==2.8.2
28 python-dateutil==2.8.2
29 six==1.16.0
29 six==1.16.0
30 tzdata==2024.1
30 tzdata==2024.1
31 vine==5.1.0
31 vine==5.1.0
32 channelstream==0.7.1
32 channelstream==0.7.1
33 gevent==24.2.1
33 gevent==24.2.1
34 greenlet==3.0.3
34 greenlet==3.0.3
35 zope.event==5.0.0
35 zope.event==5.0.0
36 zope.interface==6.3.0
36 zope.interface==7.0.3
37 itsdangerous==1.1.0
37 itsdangerous==1.1.0
38 marshmallow==2.18.0
38 marshmallow==2.18.0
39 pyramid==2.0.2
39 pyramid==2.0.2
40 hupper==1.12
40 hupper==1.12
41 plaster==1.1.2
41 plaster==1.1.2
42 plaster-pastedeploy==1.0.1
42 plaster-pastedeploy==1.0.1
43 pastedeploy==3.1.0
43 pastedeploy==3.1.0
44 plaster==1.1.2
44 plaster==1.1.2
45 translationstring==1.4
45 translationstring==1.4
46 venusian==3.0.0
46 venusian==3.0.0
47 webob==1.8.7
47 webob==1.8.7
48 zope.deprecation==5.0.0
48 zope.deprecation==5.0.0
49 zope.interface==6.3.0
49 zope.interface==7.0.3
50 pyramid-jinja2==2.10
50 pyramid-jinja2==2.10
51 jinja2==3.1.2
51 jinja2==3.1.2
52 markupsafe==2.1.2
52 markupsafe==2.1.2
53 markupsafe==2.1.2
53 markupsafe==2.1.2
54 pyramid==2.0.2
54 pyramid==2.0.2
55 hupper==1.12
55 hupper==1.12
56 plaster==1.1.2
56 plaster==1.1.2
57 plaster-pastedeploy==1.0.1
57 plaster-pastedeploy==1.0.1
58 pastedeploy==3.1.0
58 pastedeploy==3.1.0
59 plaster==1.1.2
59 plaster==1.1.2
60 translationstring==1.4
60 translationstring==1.4
61 venusian==3.0.0
61 venusian==3.0.0
62 webob==1.8.7
62 webob==1.8.7
63 zope.deprecation==5.0.0
63 zope.deprecation==5.0.0
64 zope.interface==6.3.0
64 zope.interface==7.0.3
65 zope.deprecation==5.0.0
65 zope.deprecation==5.0.0
66 python-dateutil==2.8.2
66 python-dateutil==2.8.2
67 six==1.16.0
67 six==1.16.0
68 requests==2.28.2
68 requests==2.28.2
69 certifi==2022.12.7
69 certifi==2022.12.7
70 charset-normalizer==3.1.0
70 charset-normalizer==3.1.0
71 idna==3.4
71 idna==3.4
72 urllib3==1.26.14
72 urllib3==1.26.14
73 ws4py==0.5.1
73 ws4py==0.5.1
74 deform==2.0.15
74 deform==2.0.15
75 chameleon==3.10.2
75 chameleon==3.10.2
76 colander==2.0
76 colander==2.0
77 iso8601==1.1.0
77 iso8601==1.1.0
78 translationstring==1.4
78 translationstring==1.4
79 iso8601==1.1.0
79 iso8601==1.1.0
80 peppercorn==0.6
80 peppercorn==0.6
81 translationstring==1.4
81 translationstring==1.4
82 zope.deprecation==5.0.0
82 zope.deprecation==5.0.0
83 docutils==0.19
83 docutils==0.19
84 dogpile.cache==1.3.3
84 dogpile.cache==1.3.3
85 decorator==5.1.1
85 decorator==5.1.1
86 stevedore==5.1.0
86 stevedore==5.1.0
87 pbr==5.11.1
87 pbr==5.11.1
88 formencode==2.1.0
88 formencode==2.1.0
89 six==1.16.0
89 six==1.16.0
90 fsspec==2024.6.0
90 fsspec==2024.9.0
91 gunicorn==21.2.0
91 gunicorn==23.0.0
92 packaging==24.0
92 packaging==24.1
93 gevent==24.2.1
93 gevent==24.2.1
94 greenlet==3.0.3
94 greenlet==3.0.3
95 zope.event==5.0.0
95 zope.event==5.0.0
96 zope.interface==6.3.0
96 zope.interface==7.0.3
97 ipython==8.14.0
97 ipython==8.26.0
98 backcall==0.2.0
99 decorator==5.1.1
98 decorator==5.1.1
100 jedi==0.19.0
99 jedi==0.19.1
101 parso==0.8.3
100 parso==0.8.4
102 matplotlib-inline==0.1.6
101 matplotlib-inline==0.1.7
103 traitlets==5.9.0
102 traitlets==5.14.3
104 pexpect==4.8.0
103 pexpect==4.9.0
105 ptyprocess==0.7.0
104 ptyprocess==0.7.0
106 pickleshare==0.7.5
105 prompt_toolkit==3.0.47
107 prompt-toolkit==3.0.38
106 wcwidth==0.2.13
108 wcwidth==0.2.6
107 pygments==2.18.0
109 pygments==2.15.1
108 stack-data==0.6.3
110 stack-data==0.6.2
109 asttokens==2.4.1
111 asttokens==2.2.1
112 six==1.16.0
110 six==1.16.0
113 executing==1.2.0
111 executing==2.0.1
114 pure-eval==0.2.2
112 pure_eval==0.2.3
115 traitlets==5.9.0
113 traitlets==5.14.3
114 typing_extensions==4.12.2
116 markdown==3.4.3
115 markdown==3.4.3
117 msgpack==1.0.8
116 msgpack==1.0.8
118 mysqlclient==2.1.1
117 mysqlclient==2.1.1
119 nbconvert==7.7.3
118 nbconvert==7.7.3
120 beautifulsoup4==4.12.3
119 beautifulsoup4==4.12.3
121 soupsieve==2.5
120 soupsieve==2.5
122 bleach==6.1.0
121 bleach==6.1.0
123 six==1.16.0
122 six==1.16.0
124 webencodings==0.5.1
123 webencodings==0.5.1
125 defusedxml==0.7.1
124 defusedxml==0.7.1
126 jinja2==3.1.2
125 jinja2==3.1.2
127 markupsafe==2.1.2
126 markupsafe==2.1.2
128 jupyter_core==5.3.1
127 jupyter_core==5.3.1
129 platformdirs==3.10.0
128 platformdirs==3.10.0
130 traitlets==5.9.0
129 traitlets==5.14.3
131 jupyterlab-pygments==0.2.2
130 jupyterlab-pygments==0.2.2
132 markupsafe==2.1.2
131 markupsafe==2.1.2
133 mistune==2.0.5
132 mistune==2.0.5
134 nbclient==0.8.0
133 nbclient==0.8.0
135 jupyter_client==8.3.0
134 jupyter_client==8.3.0
136 jupyter_core==5.3.1
135 jupyter_core==5.3.1
137 platformdirs==3.10.0
136 platformdirs==3.10.0
138 traitlets==5.9.0
137 traitlets==5.14.3
139 python-dateutil==2.8.2
138 python-dateutil==2.8.2
140 six==1.16.0
139 six==1.16.0
141 pyzmq==25.0.0
140 pyzmq==25.0.0
142 tornado==6.2
141 tornado==6.2
143 traitlets==5.9.0
142 traitlets==5.14.3
144 jupyter_core==5.3.1
143 jupyter_core==5.3.1
145 platformdirs==3.10.0
144 platformdirs==3.10.0
146 traitlets==5.9.0
145 traitlets==5.14.3
147 nbformat==5.9.2
146 nbformat==5.9.2
148 fastjsonschema==2.18.0
147 fastjsonschema==2.18.0
149 jsonschema==4.18.6
148 jsonschema==4.18.6
150 attrs==22.2.0
149 attrs==22.2.0
151 pyrsistent==0.19.3
150 pyrsistent==0.19.3
152 jupyter_core==5.3.1
151 jupyter_core==5.3.1
153 platformdirs==3.10.0
152 platformdirs==3.10.0
154 traitlets==5.9.0
153 traitlets==5.14.3
155 traitlets==5.9.0
154 traitlets==5.14.3
156 traitlets==5.9.0
155 traitlets==5.14.3
157 nbformat==5.9.2
156 nbformat==5.9.2
158 fastjsonschema==2.18.0
157 fastjsonschema==2.18.0
159 jsonschema==4.18.6
158 jsonschema==4.18.6
160 attrs==22.2.0
159 attrs==22.2.0
161 pyrsistent==0.19.3
160 pyrsistent==0.19.3
162 jupyter_core==5.3.1
161 jupyter_core==5.3.1
163 platformdirs==3.10.0
162 platformdirs==3.10.0
164 traitlets==5.9.0
163 traitlets==5.14.3
165 traitlets==5.9.0
164 traitlets==5.14.3
166 pandocfilters==1.5.0
165 pandocfilters==1.5.0
167 pygments==2.15.1
166 pygments==2.18.0
168 tinycss2==1.2.1
167 tinycss2==1.2.1
169 webencodings==0.5.1
168 webencodings==0.5.1
170 traitlets==5.9.0
169 traitlets==5.14.3
171 orjson==3.10.3
170 orjson==3.10.7
172 paste==3.10.1
171 paste==3.10.1
173 premailer==3.10.0
172 premailer==3.10.0
174 cachetools==5.3.3
173 cachetools==5.3.3
175 cssselect==1.2.0
174 cssselect==1.2.0
176 cssutils==2.6.0
175 cssutils==2.6.0
177 lxml==4.9.3
176 lxml==5.3.0
178 requests==2.28.2
177 requests==2.28.2
179 certifi==2022.12.7
178 certifi==2022.12.7
180 charset-normalizer==3.1.0
179 charset-normalizer==3.1.0
181 idna==3.4
180 idna==3.4
182 urllib3==1.26.14
181 urllib3==1.26.14
183 psutil==5.9.8
182 psutil==5.9.8
184 psycopg2==2.9.9
183 psycopg2==2.9.9
185 py-bcrypt==0.4
184 py-bcrypt==0.4
186 pycmarkgfm==1.2.0
185 pycmarkgfm==1.2.0
187 cffi==1.16.0
186 cffi==1.16.0
188 pycparser==2.21
187 pycparser==2.21
189 pycryptodome==3.17
188 pycryptodome==3.17
190 pycurl==7.45.3
189 pycurl==7.45.3
191 pymysql==1.0.3
190 pymysql==1.0.3
192 pyotp==2.8.0
191 pyotp==2.8.0
193 pyparsing==3.1.1
192 pyparsing==3.1.1
194 pyramid-debugtoolbar==4.12.1
195 pygments==2.15.1
196 pyramid==2.0.2
197 hupper==1.12
198 plaster==1.1.2
199 plaster-pastedeploy==1.0.1
200 pastedeploy==3.1.0
201 plaster==1.1.2
202 translationstring==1.4
203 venusian==3.0.0
204 webob==1.8.7
205 zope.deprecation==5.0.0
206 zope.interface==6.3.0
207 pyramid-mako==1.1.0
208 mako==1.2.4
209 markupsafe==2.1.2
210 pyramid==2.0.2
211 hupper==1.12
212 plaster==1.1.2
213 plaster-pastedeploy==1.0.1
214 pastedeploy==3.1.0
215 plaster==1.1.2
216 translationstring==1.4
217 venusian==3.0.0
218 webob==1.8.7
219 zope.deprecation==5.0.0
220 zope.interface==6.3.0
221 pyramid-mailer==0.15.1
193 pyramid-mailer==0.15.1
222 pyramid==2.0.2
194 pyramid==2.0.2
223 hupper==1.12
195 hupper==1.12
224 plaster==1.1.2
196 plaster==1.1.2
225 plaster-pastedeploy==1.0.1
197 plaster-pastedeploy==1.0.1
226 pastedeploy==3.1.0
198 pastedeploy==3.1.0
227 plaster==1.1.2
199 plaster==1.1.2
228 translationstring==1.4
200 translationstring==1.4
229 venusian==3.0.0
201 venusian==3.0.0
230 webob==1.8.7
202 webob==1.8.7
231 zope.deprecation==5.0.0
203 zope.deprecation==5.0.0
232 zope.interface==6.3.0
204 zope.interface==7.0.3
233 repoze.sendmail==4.4.1
205 repoze.sendmail==4.4.1
234 transaction==3.1.0
206 transaction==5.0.0
235 zope.interface==6.3.0
207 zope.interface==7.0.3
236 zope.interface==6.3.0
208 zope.interface==7.0.3
237 transaction==3.1.0
209 transaction==5.0.0
238 zope.interface==6.3.0
210 zope.interface==7.0.3
211 pyramid-mako==1.1.0
212 mako==1.2.4
213 markupsafe==2.1.2
214 pyramid==2.0.2
215 hupper==1.12
216 plaster==1.1.2
217 plaster-pastedeploy==1.0.1
218 pastedeploy==3.1.0
219 plaster==1.1.2
220 translationstring==1.4
221 venusian==3.0.0
222 webob==1.8.7
223 zope.deprecation==5.0.0
224 zope.interface==7.0.3
239 python-ldap==3.4.3
225 python-ldap==3.4.3
240 pyasn1==0.4.8
226 pyasn1==0.4.8
241 pyasn1-modules==0.2.8
227 pyasn1-modules==0.2.8
242 pyasn1==0.4.8
228 pyasn1==0.4.8
243 python-memcached==1.59
229 python-memcached==1.59
244 six==1.16.0
230 six==1.16.0
245 python-pam==2.0.2
231 python-pam==2.0.2
246 python3-saml==1.15.0
232 python3-saml==1.16.0
247 isodate==0.6.1
233 isodate==0.6.1
248 six==1.16.0
234 six==1.16.0
249 lxml==4.9.3
235 lxml==5.3.0
250 xmlsec==1.3.13
236 xmlsec==1.3.14
251 lxml==4.9.3
237 lxml==5.3.0
252 pyyaml==6.0.1
238 pyyaml==6.0.1
253 redis==5.0.4
239 redis==5.1.0
254 async-timeout==4.0.3
240 async-timeout==4.0.3
255 regex==2022.10.31
241 regex==2022.10.31
256 routes==2.5.1
242 routes==2.5.1
257 repoze.lru==0.7
243 repoze.lru==0.7
258 six==1.16.0
244 six==1.16.0
259 s3fs==2024.6.0
245 s3fs==2024.9.0
260 aiobotocore==2.13.0
246 aiobotocore==2.13.0
261 aiohttp==3.9.5
247 aiohttp==3.9.5
262 aiosignal==1.3.1
248 aiosignal==1.3.1
263 frozenlist==1.4.1
249 frozenlist==1.4.1
264 attrs==22.2.0
250 attrs==22.2.0
265 frozenlist==1.4.1
251 frozenlist==1.4.1
266 multidict==6.0.5
252 multidict==6.0.5
267 yarl==1.9.4
253 yarl==1.9.4
268 idna==3.4
254 idna==3.4
269 multidict==6.0.5
255 multidict==6.0.5
270 aioitertools==0.11.0
256 aioitertools==0.11.0
271 botocore==1.34.106
257 botocore==1.34.106
272 jmespath==1.0.1
258 jmespath==1.0.1
273 python-dateutil==2.8.2
259 python-dateutil==2.8.2
274 six==1.16.0
260 six==1.16.0
275 urllib3==1.26.14
261 urllib3==1.26.14
276 wrapt==1.16.0
262 wrapt==1.16.0
277 aiohttp==3.9.5
263 aiohttp==3.9.5
278 aiosignal==1.3.1
264 aiosignal==1.3.1
279 frozenlist==1.4.1
265 frozenlist==1.4.1
280 attrs==22.2.0
266 attrs==22.2.0
281 frozenlist==1.4.1
267 frozenlist==1.4.1
282 multidict==6.0.5
268 multidict==6.0.5
283 yarl==1.9.4
269 yarl==1.9.4
284 idna==3.4
270 idna==3.4
285 multidict==6.0.5
271 multidict==6.0.5
286 fsspec==2024.6.0
272 fsspec==2024.9.0
287 simplejson==3.19.2
273 simplejson==3.19.2
288 sshpubkeys==3.3.1
274 sshpubkeys==3.3.1
289 cryptography==40.0.2
275 cryptography==40.0.2
290 cffi==1.16.0
276 cffi==1.16.0
291 pycparser==2.21
277 pycparser==2.21
292 ecdsa==0.18.0
278 ecdsa==0.18.0
293 six==1.16.0
279 six==1.16.0
294 sqlalchemy==1.4.52
280 sqlalchemy==1.4.52
295 greenlet==3.0.3
281 greenlet==3.0.3
296 typing_extensions==4.9.0
282 typing_extensions==4.12.2
297 supervisor==4.2.5
283 supervisor==4.2.5
298 tzlocal==4.3
284 tzlocal==4.3
299 pytz-deprecation-shim==0.1.0.post0
285 pytz-deprecation-shim==0.1.0.post0
300 tzdata==2024.1
286 tzdata==2024.1
301 tempita==0.5.2
287 tempita==0.5.2
302 unidecode==1.3.6
288 unidecode==1.3.6
303 urlobject==2.4.3
289 urlobject==2.4.3
304 waitress==3.0.0
290 waitress==3.0.0
305 webhelpers2==2.1
291 webhelpers2==2.1
306 markupsafe==2.1.2
292 markupsafe==2.1.2
307 six==1.16.0
293 six==1.16.0
308 whoosh==2.7.4
294 whoosh==2.7.4
309 zope.cachedescriptors==5.0.0
295 zope.cachedescriptors==5.0.0
310 qrcode==7.4.2
296 qrcode==7.4.2
311
297
312 ## uncomment to add the debug libraries
298 ## uncomment to add the debug libraries
313 #-r requirements_debug.txt
299 #-r requirements_debug.txt
@@ -1,28 +1,29 b''
1 ## special libraries we could extend the requirements.txt file with to add some
1 ## special libraries we could extend the requirements.txt file with to add some
2 ## custom libraries usefull for debug and memory tracing
2 ## custom libraries usefull for debug and memory tracing
3
3
4 objgraph
4 objgraph
5 memory-profiler
5 memory-profiler
6 pympler
6 pympler
7
7
8 ## debug
8 ## debug
9 ipdb
9 ipdb
10 ipython
10 ipython
11 rich
11 rich
12 pyramid-debugtoolbar
12
13
13 # format
14 # format
14 flake8
15 flake8
15 ruff
16 ruff
16
17
17 pipdeptree==2.7.1
18 pipdeptree==2.7.1
18 invoke==2.0.0
19 invoke==2.0.0
19 bumpversion==0.6.0
20 bumpversion==0.6.0
20 bump2version==1.0.1
21 bump2version==1.0.1
21
22
22 docutils-stubs
23 docutils-stubs
23 types-redis
24 types-redis
24 types-requests==2.31.0.6
25 types-requests==2.31.0.6
25 types-sqlalchemy
26 types-sqlalchemy
26 types-psutil
27 types-psutil
27 types-pycurl
28 types-pycurl
28 types-ujson
29 types-ujson
@@ -1,48 +1,48 b''
1 # test related requirements
1 # test related requirements
2 mock==5.1.0
2 mock==5.1.0
3 pytest-cov==4.1.0
3 pytest-cov==4.1.0
4 coverage==7.4.3
4 coverage==7.4.3
5 pytest==8.1.1
5 pytest==8.1.1
6 iniconfig==2.0.0
6 iniconfig==2.0.0
7 packaging==24.0
7 packaging==24.1
8 pluggy==1.4.0
8 pluggy==1.4.0
9 pytest-env==1.1.3
9 pytest-env==1.1.3
10 pytest==8.1.1
10 pytest==8.1.1
11 iniconfig==2.0.0
11 iniconfig==2.0.0
12 packaging==24.0
12 packaging==24.1
13 pluggy==1.4.0
13 pluggy==1.4.0
14 pytest-profiling==1.7.0
14 pytest-profiling==1.7.0
15 gprof2dot==2022.7.29
15 gprof2dot==2022.7.29
16 pytest==8.1.1
16 pytest==8.1.1
17 iniconfig==2.0.0
17 iniconfig==2.0.0
18 packaging==24.0
18 packaging==24.1
19 pluggy==1.4.0
19 pluggy==1.4.0
20 six==1.16.0
20 six==1.16.0
21 pytest-rerunfailures==13.0
21 pytest-rerunfailures==13.0
22 packaging==24.0
22 packaging==24.1
23 pytest==8.1.1
23 pytest==8.1.1
24 iniconfig==2.0.0
24 iniconfig==2.0.0
25 packaging==24.0
25 packaging==24.1
26 pluggy==1.4.0
26 pluggy==1.4.0
27 pytest-runner==6.0.1
27 pytest-runner==6.0.1
28 pytest-sugar==1.0.0
28 pytest-sugar==1.0.0
29 packaging==24.0
29 packaging==24.1
30 pytest==8.1.1
30 pytest==8.1.1
31 iniconfig==2.0.0
31 iniconfig==2.0.0
32 packaging==24.0
32 packaging==24.1
33 pluggy==1.4.0
33 pluggy==1.4.0
34 termcolor==2.4.0
34 termcolor==2.4.0
35 pytest-timeout==2.3.1
35 pytest-timeout==2.3.1
36 pytest==8.1.1
36 pytest==8.1.1
37 iniconfig==2.0.0
37 iniconfig==2.0.0
38 packaging==24.0
38 packaging==24.1
39 pluggy==1.4.0
39 pluggy==1.4.0
40 webtest==3.0.0
40 webtest==3.0.0
41 beautifulsoup4==4.12.3
41 beautifulsoup4==4.12.3
42 soupsieve==2.5
42 soupsieve==2.5
43 waitress==3.0.0
43 waitress==3.0.0
44 webob==1.8.7
44 webob==1.8.7
45
45
46 # RhodeCode test-data
46 # RhodeCode test-data
47 rc_testdata @ https://code.rhodecode.com/upstream/rc-testdata-dist/raw/77378e9097f700b4c1b9391b56199fe63566b5c9/rc_testdata-0.11.0.tar.gz#egg=rc_testdata
47 rc_testdata @ https://code.rhodecode.com/upstream/rc-testdata-dist/raw/77378e9097f700b4c1b9391b56199fe63566b5c9/rc_testdata-0.11.0.tar.gz#egg=rc_testdata
48 rc_testdata==0.11.0
48 rc_testdata==0.11.0
@@ -1,1 +1,1 b''
1 5.1.2 No newline at end of file
1 5.2.0
@@ -1,581 +1,581 b''
1 # Copyright (C) 2011-2023 RhodeCode GmbH
1 # Copyright (C) 2011-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import itertools
19 import itertools
20 import logging
20 import logging
21 import sys
21 import sys
22 import fnmatch
22 import fnmatch
23
23
24 import decorator
24 import decorator
25 import venusian
25 import venusian
26 from collections import OrderedDict
26 from collections import OrderedDict
27
27
28 from pyramid.exceptions import ConfigurationError
28 from pyramid.exceptions import ConfigurationError
29 from pyramid.renderers import render
29 from pyramid.renderers import render
30 from pyramid.response import Response
30 from pyramid.response import Response
31 from pyramid.httpexceptions import HTTPNotFound
31 from pyramid.httpexceptions import HTTPNotFound
32
32
33 from rhodecode.api.exc import (
33 from rhodecode.api.exc import (
34 JSONRPCBaseError, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
34 JSONRPCBaseError, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
35 from rhodecode.apps._base import TemplateArgs
35 from rhodecode.apps._base import TemplateArgs
36 from rhodecode.lib.auth import AuthUser
36 from rhodecode.lib.auth import AuthUser
37 from rhodecode.lib.base import get_ip_addr, attach_context_attributes
37 from rhodecode.lib.base import get_ip_addr, attach_context_attributes
38 from rhodecode.lib.exc_tracking import store_exception
38 from rhodecode.lib.exc_tracking import store_exception
39 from rhodecode.lib import ext_json
39 from rhodecode.lib import ext_json
40 from rhodecode.lib.utils2 import safe_str
40 from rhodecode.lib.utils2 import safe_str
41 from rhodecode.lib.plugins.utils import get_plugin_settings
41 from rhodecode.lib.plugins.utils import get_plugin_settings
42 from rhodecode.model.db import User, UserApiKeys
42 from rhodecode.model.db import User, UserApiKeys
43 from rhodecode.config.patches import inspect_getargspec
43
44
44 log = logging.getLogger(__name__)
45 log = logging.getLogger(__name__)
45
46
46 DEFAULT_RENDERER = 'jsonrpc_renderer'
47 DEFAULT_RENDERER = 'jsonrpc_renderer'
47 DEFAULT_URL = '/_admin/api'
48 DEFAULT_URL = '/_admin/api'
48 SERVICE_API_IDENTIFIER = 'service_'
49 SERVICE_API_IDENTIFIER = 'service_'
49
50
50
51
51 def find_methods(jsonrpc_methods, pattern):
52 def find_methods(jsonrpc_methods, pattern):
52 matches = OrderedDict()
53 matches = OrderedDict()
53 if not isinstance(pattern, (list, tuple)):
54 if not isinstance(pattern, (list, tuple)):
54 pattern = [pattern]
55 pattern = [pattern]
55
56
56 for single_pattern in pattern:
57 for single_pattern in pattern:
57 for method_name, method in filter(
58 for method_name, method in filter(
58 lambda x: not x[0].startswith(SERVICE_API_IDENTIFIER), jsonrpc_methods.items()
59 lambda x: not x[0].startswith(SERVICE_API_IDENTIFIER), jsonrpc_methods.items()
59 ):
60 ):
60 if fnmatch.fnmatch(method_name, single_pattern):
61 if fnmatch.fnmatch(method_name, single_pattern):
61 matches[method_name] = method
62 matches[method_name] = method
62 return matches
63 return matches
63
64
64
65
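# Illustrative usage sketch: find_methods filters the registered JSON-RPC
# methods by glob pattern; names starting with SERVICE_API_IDENTIFIER
# ('service_') are always skipped. The method names below are hypothetical
# placeholders, not real registered methods.
#
#   _methods = OrderedDict([('get_user', object()), ('get_repo', object())])
#   find_methods(_methods, 'get_*')     # -> both entries, in order
#   find_methods(_methods, ['*user*'])  # -> only 'get_user'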
65 class ExtJsonRenderer(object):
66 class ExtJsonRenderer(object):
66 """
67 """
67 Custom renderer that makes use of our ext_json lib
68 Custom renderer that makes use of our ext_json lib
68
69
69 """
70 """
70
71
71 def __init__(self):
72 def __init__(self):
72 self.serializer = ext_json.formatted_json
73 self.serializer = ext_json.formatted_json
73
74
74 def __call__(self, info):
75 def __call__(self, info):
75 """ Returns a plain JSON-encoded string with content-type
76 """ Returns a plain JSON-encoded string with content-type
76 ``application/json``. The content-type may be overridden by
77 ``application/json``. The content-type may be overridden by
77 setting ``request.response.content_type``."""
78 setting ``request.response.content_type``."""
78
79
79 def _render(value, system):
80 def _render(value, system):
80 request = system.get('request')
81 request = system.get('request')
81 if request is not None:
82 if request is not None:
82 response = request.response
83 response = request.response
83 ct = response.content_type
84 ct = response.content_type
84 if ct == response.default_content_type:
85 if ct == response.default_content_type:
85 response.content_type = 'application/json'
86 response.content_type = 'application/json'
86
87
87 return self.serializer(value)
88 return self.serializer(value)
88
89
89 return _render
90 return _render
90
91
91
92
92 def jsonrpc_response(request, result):
93 def jsonrpc_response(request, result):
93 rpc_id = getattr(request, 'rpc_id', None)
94 rpc_id = getattr(request, 'rpc_id', None)
94
95
95 ret_value = ''
96 ret_value = ''
96 if rpc_id:
97 if rpc_id:
97 ret_value = {'id': rpc_id, 'result': result, 'error': None}
98 ret_value = {'id': rpc_id, 'result': result, 'error': None}
98
99
99 # fetch the deprecation warning, and store it inside the result
100 # fetch the deprecation warning, and store it inside the result
100 deprecation = getattr(request, 'rpc_deprecation', None)
101 deprecation = getattr(request, 'rpc_deprecation', None)
101 if deprecation:
102 if deprecation:
102 ret_value['DEPRECATION_WARNING'] = deprecation
103 ret_value['DEPRECATION_WARNING'] = deprecation
103
104
104 raw_body = render(DEFAULT_RENDERER, ret_value, request=request)
105 raw_body = render(DEFAULT_RENDERER, ret_value, request=request)
105 content_type = 'application/json'
106 content_type = 'application/json'
106 content_type_header = 'Content-Type'
107 content_type_header = 'Content-Type'
107 headers = {
108 headers = {
108 content_type_header: content_type
109 content_type_header: content_type
109 }
110 }
110 return Response(
111 return Response(
111 body=raw_body,
112 body=raw_body,
112 content_type=content_type,
113 content_type=content_type,
113 headerlist=[(k, v) for k, v in headers.items()]
114 headerlist=[(k, v) for k, v in headers.items()]
114 )
115 )
115
116
116
117
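# For reference, a successful call that carried an "id" is rendered by
# jsonrpc_response() into a body shaped like the following (values are
# placeholders):
#
#   {"id": <rpc_id>, "result": <method return value>, "error": null}
#
# A "DEPRECATION_WARNING" key is added when the called method was wrapped
# with jsonrpc_deprecated_method.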
117 def jsonrpc_error(request, message, retid=None, code: int | None = None, headers: dict | None = None):
118 def jsonrpc_error(request, message, retid=None, code: int | None = None, headers: dict | None = None):
118 """
119 """
119 Generate a Response object with a JSON-RPC error body
120 Generate a Response object with a JSON-RPC error body
120 """
121 """
121 headers = headers or {}
122 headers = headers or {}
122 content_type = 'application/json'
123 content_type = 'application/json'
123 content_type_header = 'Content-Type'
124 content_type_header = 'Content-Type'
124 if content_type_header not in headers:
125 if content_type_header not in headers:
125 headers[content_type_header] = content_type
126 headers[content_type_header] = content_type
126
127
127 err_dict = {'id': retid, 'result': None, 'error': message}
128 err_dict = {'id': retid, 'result': None, 'error': message}
128 raw_body = render(DEFAULT_RENDERER, err_dict, request=request)
129 raw_body = render(DEFAULT_RENDERER, err_dict, request=request)
129
130
130 return Response(
131 return Response(
131 body=raw_body,
132 body=raw_body,
132 status=code,
133 status=code,
133 content_type=content_type,
134 content_type=content_type,
134 headerlist=[(k, v) for k, v in headers.items()]
135 headerlist=[(k, v) for k, v in headers.items()]
135 )
136 )
136
137
137
138
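# For reference, jsonrpc_error() renders the mirror-image envelope of
# jsonrpc_response(), e.g. (values are placeholders):
#
#   {"id": <rpc_id>, "result": null, "error": "<error message>"}
#
# optionally with a non-200 HTTP status when `code` is given.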
138 def exception_view(exc, request):
139 def exception_view(exc, request):
139 rpc_id = getattr(request, 'rpc_id', None)
140 rpc_id = getattr(request, 'rpc_id', None)
140
141
141 if isinstance(exc, JSONRPCError):
142 if isinstance(exc, JSONRPCError):
142 fault_message = safe_str(exc)
143 fault_message = safe_str(exc)
143 log.debug('json-rpc error rpc_id:%s "%s"', rpc_id, fault_message)
144 log.debug('json-rpc error rpc_id:%s "%s"', rpc_id, fault_message)
144 elif isinstance(exc, JSONRPCValidationError):
145 elif isinstance(exc, JSONRPCValidationError):
145 colander_exc = exc.colander_exception
146 colander_exc = exc.colander_exception
146 # TODO(marcink): think maybe of nicer way to serialize errors ?
147 # TODO(marcink): think maybe of nicer way to serialize errors ?
147 fault_message = colander_exc.asdict()
148 fault_message = colander_exc.asdict()
148 log.debug('json-rpc colander error rpc_id:%s "%s"', rpc_id, fault_message)
149 log.debug('json-rpc colander error rpc_id:%s "%s"', rpc_id, fault_message)
149 elif isinstance(exc, JSONRPCForbidden):
150 elif isinstance(exc, JSONRPCForbidden):
150 fault_message = 'Access was denied to this resource.'
151 fault_message = 'Access was denied to this resource.'
151 log.warning('json-rpc forbidden call rpc_id:%s "%s"', rpc_id, fault_message)
152 log.warning('json-rpc forbidden call rpc_id:%s "%s"', rpc_id, fault_message)
152 elif isinstance(exc, HTTPNotFound):
153 elif isinstance(exc, HTTPNotFound):
153 method = request.rpc_method
154 method = request.rpc_method
154 log.debug('json-rpc method `%s` not found in list of '
155 log.debug('json-rpc method `%s` not found in list of '
155 'api calls: %s, rpc_id:%s',
156 'api calls: %s, rpc_id:%s',
156 method, list(request.registry.jsonrpc_methods.keys()), rpc_id)
157 method, list(request.registry.jsonrpc_methods.keys()), rpc_id)
157
158
158 similar = 'none'
159 similar = 'none'
159 try:
160 try:
160 similar_patterns = [f'*{x}*' for x in method.split('_')]
161 similar_patterns = [f'*{x}*' for x in method.split('_')]
161 similar_found = find_methods(
162 similar_found = find_methods(
162 request.registry.jsonrpc_methods, similar_patterns)
163 request.registry.jsonrpc_methods, similar_patterns)
163 similar = ', '.join(similar_found.keys()) or similar
164 similar = ', '.join(similar_found.keys()) or similar
164 except Exception:
165 except Exception:
165 # make the whole above block safe
166 # make the whole above block safe
166 pass
167 pass
167
168
168 fault_message = f"No such method: {method}. Similar methods: {similar}"
169 fault_message = f"No such method: {method}. Similar methods: {similar}"
169 else:
170 else:
170 fault_message = 'undefined error'
171 fault_message = 'undefined error'
171 exc_info = exc.exc_info()
172 exc_info = exc.exc_info()
172 store_exception(id(exc_info), exc_info, prefix='rhodecode-api')
173 store_exception(id(exc_info), exc_info, prefix='rhodecode-api')
173
174
174 statsd = request.registry.statsd
175 statsd = request.registry.statsd
175 if statsd:
176 if statsd:
176 exc_type = f"{exc.__class__.__module__}.{exc.__class__.__name__}"
177 exc_type = f"{exc.__class__.__module__}.{exc.__class__.__name__}"
177 statsd.incr('rhodecode_exception_total',
178 statsd.incr('rhodecode_exception_total',
178 tags=["exc_source:api", f"type:{exc_type}"])
179 tags=["exc_source:api", f"type:{exc_type}"])
179
180
180 return jsonrpc_error(request, fault_message, rpc_id)
181 return jsonrpc_error(request, fault_message, rpc_id)
181
182
182
183
183 def request_view(request):
184 def request_view(request):
184 """
185 """
185 Main request handling method. It handles all logic to call a specific
186 Main request handling method. It handles all logic to call a specific
186 exposed method
187 exposed method
187 """
188 """
188 # cython compatible inspect
189 # cython compatible inspect
189 from rhodecode.config.patches import inspect_getargspec
190 inspect = inspect_getargspec()
190 inspect = inspect_getargspec()
191
191
192 # check if we can find this session using the api_key via get_by_auth_token;
192 # check if we can find this session using the api_key via get_by_auth_token;
193 # search non-expired tokens only
193 # search non-expired tokens only
194 try:
194 try:
195 if not request.rpc_method.startswith(SERVICE_API_IDENTIFIER):
195 if not request.rpc_method.startswith(SERVICE_API_IDENTIFIER):
196 api_user = User.get_by_auth_token(request.rpc_api_key)
196 api_user = User.get_by_auth_token(request.rpc_api_key)
197
197
198 if api_user is None:
198 if api_user is None:
199 return jsonrpc_error(
199 return jsonrpc_error(
200 request, retid=request.rpc_id, message='Invalid API KEY')
200 request, retid=request.rpc_id, message='Invalid API KEY')
201
201
202 if not api_user.active:
202 if not api_user.active:
203 return jsonrpc_error(
203 return jsonrpc_error(
204 request, retid=request.rpc_id,
204 request, retid=request.rpc_id,
205 message='Request from this user not allowed')
205 message='Request from this user not allowed')
206
206
207 # check if we are allowed to use this IP
207 # check if we are allowed to use this IP
208 auth_u = AuthUser(
208 auth_u = AuthUser(
209 api_user.user_id, request.rpc_api_key, ip_addr=request.rpc_ip_addr)
209 api_user.user_id, request.rpc_api_key, ip_addr=request.rpc_ip_addr)
210 if not auth_u.ip_allowed:
210 if not auth_u.ip_allowed:
211 return jsonrpc_error(
211 return jsonrpc_error(
212 request, retid=request.rpc_id,
212 request, retid=request.rpc_id,
213 message='Request from IP:{} not allowed'.format(
213 message='Request from IP:{} not allowed'.format(
214 request.rpc_ip_addr))
214 request.rpc_ip_addr))
215 else:
215 else:
216 log.info('Access for IP:%s allowed', request.rpc_ip_addr)
216 log.info('Access for IP:%s allowed', request.rpc_ip_addr)
217
217
218 # register our auth-user
218 # register our auth-user
219 request.rpc_user = auth_u
219 request.rpc_user = auth_u
220 request.environ['rc_auth_user_id'] = str(auth_u.user_id)
220 request.environ['rc_auth_user_id'] = str(auth_u.user_id)
221
221
222 # now check if token is valid for API
222 # now check if token is valid for API
223 auth_token = request.rpc_api_key
223 auth_token = request.rpc_api_key
224 token_match = api_user.authenticate_by_token(
224 token_match = api_user.authenticate_by_token(
225 auth_token, roles=[UserApiKeys.ROLE_API])
225 auth_token, roles=[UserApiKeys.ROLE_API])
226 invalid_token = not token_match
226 invalid_token = not token_match
227
227
228 log.debug('Checking if API KEY is valid with proper role')
228 log.debug('Checking if API KEY is valid with proper role')
229 if invalid_token:
229 if invalid_token:
230 return jsonrpc_error(
230 return jsonrpc_error(
231 request, retid=request.rpc_id,
231 request, retid=request.rpc_id,
232 message='API KEY invalid, or has a bad role for an API call')
232 message='API KEY invalid, or has a bad role for an API call')
233 else:
233 else:
234 auth_u = 'service'
234 auth_u = 'service'
235 if request.rpc_api_key != request.registry.settings['app.service_api.token']:
235 if request.rpc_api_key != request.registry.settings['app.service_api.token']:
236 raise Exception("Provided service secret is not recognized!")
236 raise Exception("Provided service secret is not recognized!")
237
237
238 except Exception:
238 except Exception:
239 log.exception('Error on API AUTH')
239 log.exception('Error on API AUTH')
240 return jsonrpc_error(
240 return jsonrpc_error(
241 request, retid=request.rpc_id, message='Invalid API KEY')
241 request, retid=request.rpc_id, message='Invalid API KEY')
242
242
243 method = request.rpc_method
243 method = request.rpc_method
244 func = request.registry.jsonrpc_methods[method]
244 func = request.registry.jsonrpc_methods[method]
245
245
246 # now that we have a method, add request._req_params to the call
246 # now that we have a method, add request._req_params to the call
247 # kwargs and dispatch control to the resolved API method
247 # kwargs and dispatch control to the resolved API method
248
248
249 argspec = inspect.getargspec(func)
249 argspec = inspect.getargspec(func)
250 arglist = argspec[0]
250 arglist = argspec[0]
251 defs = argspec[3] or []
251 defs = argspec[3] or []
252 defaults = [type(a) for a in defs]
252 defaults = [type(a) for a in defs]
253 default_empty = type(NotImplemented)
253 default_empty = type(NotImplemented)
254
254
255 # kw arguments required by this method
255 # kw arguments required by this method
256 func_kwargs = dict(itertools.zip_longest(
256 func_kwargs = dict(itertools.zip_longest(
257 reversed(arglist), reversed(defaults), fillvalue=default_empty))
257 reversed(arglist), reversed(defaults), fillvalue=default_empty))
258
258
259 # This attribute needs to be the first param of a method that uses
259 # This attribute needs to be the first param of a method that uses
260 # api_key; it is translated to an instance of the user under that name
260 # api_key; it is translated to an instance of the user under that name
261 user_var = 'apiuser'
261 user_var = 'apiuser'
262 request_var = 'request'
262 request_var = 'request'
263
263
264 for arg in [user_var, request_var]:
264 for arg in [user_var, request_var]:
265 if arg not in arglist:
265 if arg not in arglist:
266 return jsonrpc_error(
266 return jsonrpc_error(
267 request,
267 request,
268 retid=request.rpc_id,
268 retid=request.rpc_id,
269 message='This method [%s] does not support '
269 message='This method [%s] does not support '
270 'required parameter `%s`' % (func.__name__, arg))
270 'required parameter `%s`' % (func.__name__, arg))
271
271
272 # get our arglist and check if we provided them as args
272 # get our arglist and check if we provided them as args
273 for arg, default in func_kwargs.items():
273 for arg, default in func_kwargs.items():
274 if arg in [user_var, request_var]:
274 if arg in [user_var, request_var]:
275 # user_var and request_var are pre-hardcoded parameters and we
275 # user_var and request_var are pre-hardcoded parameters and we
276 # don't need to do any translation
276 # don't need to do any translation
277 continue
277 continue
278
278
279 # skip the required param check if its default value is
279 # skip the required param check if its default value is
280 # NotImplementedType (default_empty)
280 # NotImplementedType (default_empty)
281 if default == default_empty and arg not in request.rpc_params:
281 if default == default_empty and arg not in request.rpc_params:
282 return jsonrpc_error(
282 return jsonrpc_error(
283 request,
283 request,
284 retid=request.rpc_id,
284 retid=request.rpc_id,
285 message=('Missing non optional `%s` arg in JSON DATA' % arg)
285 message=('Missing non optional `%s` arg in JSON DATA' % arg)
286 )
286 )
287
287
288 # sanitize extra passed arguments
288 # sanitize extra passed arguments
289 for k in list(request.rpc_params.keys()):
289 for k in list(request.rpc_params.keys()):
290 if k not in func_kwargs:
290 if k not in func_kwargs:
291 del request.rpc_params[k]
291 del request.rpc_params[k]
292
292
293 call_params = request.rpc_params
293 call_params = request.rpc_params
294 call_params.update({
294 call_params.update({
295 'request': request,
295 'request': request,
296 'apiuser': auth_u
296 'apiuser': auth_u
297 })
297 })
298
298
299 # register some common functions for usage
299 # register some common functions for usage
300 rpc_user = request.rpc_user.user_id if hasattr(request, 'rpc_user') else None
300 rpc_user = request.rpc_user.user_id if hasattr(request, 'rpc_user') else None
301 attach_context_attributes(TemplateArgs(), request, rpc_user)
301 attach_context_attributes(TemplateArgs(), request, rpc_user)
302
302
303 statsd = request.registry.statsd
303 statsd = request.registry.statsd
304
304
305 try:
305 try:
306 ret_value = func(**call_params)
306 ret_value = func(**call_params)
307 resp = jsonrpc_response(request, ret_value)
307 resp = jsonrpc_response(request, ret_value)
308 if statsd:
308 if statsd:
309 statsd.incr('rhodecode_api_call_success_total')
309 statsd.incr('rhodecode_api_call_success_total')
310 return resp
310 return resp
311 except JSONRPCBaseError:
311 except JSONRPCBaseError:
312 raise
312 raise
313 except Exception:
313 except Exception:
314 log.exception('Unhandled exception occurred on api call: %s', func)
314 log.exception('Unhandled exception occurred on api call: %s', func)
315 exc_info = sys.exc_info()
315 exc_info = sys.exc_info()
316 exc_id, exc_type_name = store_exception(
316 exc_id, exc_type_name = store_exception(
317 id(exc_info), exc_info, prefix='rhodecode-api')
317 id(exc_info), exc_info, prefix='rhodecode-api')
318 error_headers = {
318 error_headers = {
319 'RhodeCode-Exception-Id': str(exc_id),
319 'RhodeCode-Exception-Id': str(exc_id),
320 'RhodeCode-Exception-Type': str(exc_type_name)
320 'RhodeCode-Exception-Type': str(exc_type_name)
321 }
321 }
322 err_resp = jsonrpc_error(
322 err_resp = jsonrpc_error(
323 request, retid=request.rpc_id, message='Internal server error',
323 request, retid=request.rpc_id, message='Internal server error',
324 headers=error_headers)
324 headers=error_headers)
325 if statsd:
325 if statsd:
326 statsd.incr('rhodecode_api_call_fail_total')
326 statsd.incr('rhodecode_api_call_fail_total')
327 return err_resp
327 return err_resp
328
328
329
329
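# Sketch of the argument handling above; the exposed method below is a
# hypothetical example, not part of the registered API. Given
#
#   def comment_something(request, apiuser, repoid, message=''):
#       ...
#
# `repoid` pairs with the NotImplemented sentinel (it has no default), so it
# must be present in the JSON "args"; `message` has a default and may be
# omitted. Extra keys in "args" that the method does not accept are dropped.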
330 def setup_request(request):
330 def setup_request(request):
331 """
331 """
332 Parse a JSON-RPC request body. It's used inside the predicates method
332 Parse a JSON-RPC request body. It's used inside the predicates method
333 to validate and bootstrap requests for usage in rpc calls.
333 to validate and bootstrap requests for usage in rpc calls.
334
334
335 We need to raise JSONRPCError here if we want to return some errors back to
335 We need to raise JSONRPCError here if we want to return some errors back to
336 the user.
336 the user.
337 """
337 """
338
338
339 log.debug('Executing setup request: %r', request)
339 log.debug('Executing setup request: %r', request)
340 request.rpc_ip_addr = get_ip_addr(request.environ)
340 request.rpc_ip_addr = get_ip_addr(request.environ)
341 # TODO(marcink): deprecate GET at some point
341 # TODO(marcink): deprecate GET at some point
342 if request.method not in ['POST', 'GET']:
342 if request.method not in ['POST', 'GET']:
343 log.debug('unsupported request method "%s"', request.method)
343 log.debug('unsupported request method "%s"', request.method)
344 raise JSONRPCError(
344 raise JSONRPCError(
345 'unsupported request method "%s". Please use POST' % request.method)
345 'unsupported request method "%s". Please use POST' % request.method)
346
346
347 if 'CONTENT_LENGTH' not in request.environ:
347 if 'CONTENT_LENGTH' not in request.environ:
348 log.debug("No Content-Length")
348 log.debug("No Content-Length")
349 raise JSONRPCError("Empty body, No Content-Length in request")
349 raise JSONRPCError("Empty body, No Content-Length in request")
350
350
351 else:
351 else:
352 length = request.environ['CONTENT_LENGTH']
352 length = request.environ['CONTENT_LENGTH']
353 log.debug('Content-Length: %s', length)
353 log.debug('Content-Length: %s', length)
354
354
355 if length == 0:
355 if length == 0:
356 log.debug("Content-Length is 0")
356 log.debug("Content-Length is 0")
357 raise JSONRPCError("Content-Length is 0")
357 raise JSONRPCError("Content-Length is 0")
358
358
359 raw_body = request.body
359 raw_body = request.body
360 log.debug("Loading JSON body now")
360 log.debug("Loading JSON body now")
361 try:
361 try:
362 json_body = ext_json.json.loads(raw_body)
362 json_body = ext_json.json.loads(raw_body)
363 except ValueError as e:
363 except ValueError as e:
364 # catch JSON errors Here
364 # catch JSON errors Here
365 raise JSONRPCError(f"JSON parse error ERR:{e} RAW:{raw_body!r}")
365 raise JSONRPCError(f"JSON parse error ERR:{e} RAW:{raw_body!r}")
366
366
367 request.rpc_id = json_body.get('id')
367 request.rpc_id = json_body.get('id')
368 request.rpc_method = json_body.get('method')
368 request.rpc_method = json_body.get('method')
369
369
370 # check required base parameters
370 # check required base parameters
371 try:
371 try:
372 api_key = json_body.get('api_key')
372 api_key = json_body.get('api_key')
373 if not api_key:
373 if not api_key:
374 api_key = json_body.get('auth_token')
374 api_key = json_body.get('auth_token')
375
375
376 if not api_key:
376 if not api_key:
377 raise KeyError('api_key or auth_token')
377 raise KeyError('api_key or auth_token')
378
378
379 # TODO(marcink): support passing in token in request header
379 # TODO(marcink): support passing in token in request header
380
380
381 request.rpc_api_key = api_key
381 request.rpc_api_key = api_key
382 request.rpc_id = json_body['id']
382 request.rpc_id = json_body['id']
383 request.rpc_method = json_body['method']
383 request.rpc_method = json_body['method']
384 request.rpc_params = json_body['args'] \
384 request.rpc_params = json_body['args'] \
385 if isinstance(json_body['args'], dict) else {}
385 if isinstance(json_body['args'], dict) else {}
386
386
387 log.debug('method: %s, params: %.10240r', request.rpc_method, request.rpc_params)
387 log.debug('method: %s, params: %.10240r', request.rpc_method, request.rpc_params)
388 except KeyError as e:
388 except KeyError as e:
389 raise JSONRPCError(f'Incorrect JSON data. Missing {e}')
389 raise JSONRPCError(f'Incorrect JSON data. Missing {e}')
390
390
391 log.debug('setup complete, now handling method:%s rpcid:%s',
391 log.debug('setup complete, now handling method:%s rpcid:%s',
392 request.rpc_method, request.rpc_id, )
392 request.rpc_method, request.rpc_id, )
393
393
394
394
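# Example request body accepted by setup_request(); the token value and
# method name are placeholders:
#
#   POST /_admin/api
#   {
#       "id": 1,
#       "auth_token": "<secret_auth_token>",   # the older "api_key" key also works
#       "method": "get_server_info",
#       "args": {}
#   }
#
# "id", "method" and "args" are required; "args" must be a JSON object.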
395 class RoutePredicate(object):
395 class RoutePredicate(object):
396 def __init__(self, val, config):
396 def __init__(self, val, config):
397 self.val = val
397 self.val = val
398
398
399 def text(self):
399 def text(self):
400 return f'jsonrpc route = {self.val}'
400 return f'jsonrpc route = {self.val}'
401
401
402 phash = text
402 phash = text
403
403
404 def __call__(self, info, request):
404 def __call__(self, info, request):
405 if self.val:
405 if self.val:
406 # potentially setup and bootstrap our call
406 # potentially setup and bootstrap our call
407 setup_request(request)
407 setup_request(request)
408
408
409 # Always return True so that even if it isn't a valid RPC it
409 # Always return True so that even if it isn't a valid RPC it
410 # will fall through to the underlying handlers like notfound_view
410 # will fall through to the underlying handlers like notfound_view
411 return True
411 return True
412
412
413
413
414 class NotFoundPredicate(object):
414 class NotFoundPredicate(object):
415 def __init__(self, val, config):
415 def __init__(self, val, config):
416 self.val = val
416 self.val = val
417 self.methods = config.registry.jsonrpc_methods
417 self.methods = config.registry.jsonrpc_methods
418
418
419 def text(self):
419 def text(self):
420 return f'jsonrpc method not found = {self.val}'
420 return f'jsonrpc method not found = {self.val}'
421
421
422 phash = text
422 phash = text
423
423
424 def __call__(self, info, request):
424 def __call__(self, info, request):
425 return hasattr(request, 'rpc_method')
425 return hasattr(request, 'rpc_method')
426
426
427
427
428 class MethodPredicate(object):
428 class MethodPredicate(object):
429 def __init__(self, val, config):
429 def __init__(self, val, config):
430 self.method = val
430 self.method = val
431
431
432 def text(self):
432 def text(self):
433 return f'jsonrpc method = {self.method}'
433 return f'jsonrpc method = {self.method}'
434
434
435 phash = text
435 phash = text
436
436
437 def __call__(self, context, request):
437 def __call__(self, context, request):
438 # we need to explicitly return False here, so pyramid doesn't try to
438 # we need to explicitly return False here, so pyramid doesn't try to
439 # execute our view directly. We need our main handler to execute things
439 # execute our view directly. We need our main handler to execute things
440 return getattr(request, 'rpc_method') == self.method
440 return getattr(request, 'rpc_method') == self.method
441
441
442
442
443 def add_jsonrpc_method(config, view, **kwargs):
443 def add_jsonrpc_method(config, view, **kwargs):
444 # pop the method name
444 # pop the method name
445 method = kwargs.pop('method', None)
445 method = kwargs.pop('method', None)
446
446
447 if method is None:
447 if method is None:
448 raise ConfigurationError(
448 raise ConfigurationError(
449 'Cannot register a JSON-RPC method without specifying the "method"')
449 'Cannot register a JSON-RPC method without specifying the "method"')
450
450
451 # we define a custom predicate to enable detection of conflicting methods;
451 # we define a custom predicate to enable detection of conflicting methods;
452 # these predicates are a kind of "translation" from the decorator variables
452 # these predicates are a kind of "translation" from the decorator variables
453 # to internal predicate names
453 # to internal predicate names
454
454
455 kwargs['jsonrpc_method'] = method
455 kwargs['jsonrpc_method'] = method
456
456
457 # register our view into global view store for validation
457 # register our view into global view store for validation
458 config.registry.jsonrpc_methods[method] = view
458 config.registry.jsonrpc_methods[method] = view
459
459
460 # we're using our main request_view handler, here, so each method
460 # we're using our main request_view handler, here, so each method
461 # has a unified handler for itself
461 # has a unified handler for itself
462 config.add_view(request_view, route_name='apiv2', **kwargs)
462 config.add_view(request_view, route_name='apiv2', **kwargs)
463
463
464
464
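# Usage sketch for the directive registered in includeme() below; the view
# and method name are hypothetical:
#
#   config.add_jsonrpc_method(view=my_view, method='my_method')
#
# This stores the view in config.registry.jsonrpc_methods and wires it to the
# shared request_view handler on the 'apiv2' route.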
465 class jsonrpc_method(object):
465 class jsonrpc_method(object):
466 """
466 """
467 decorator that works similarly to the Pyramid @view_config decorator,
467 decorator that works similarly to the Pyramid @view_config decorator,
468 but tailored for our JSON-RPC methods
468 but tailored for our JSON-RPC methods
469 """
469 """
470
470
471 venusian = venusian # for testing injection
471 venusian = venusian # for testing injection
472
472
473 def __init__(self, method=None, **kwargs):
473 def __init__(self, method=None, **kwargs):
474 self.method = method
474 self.method = method
475 self.kwargs = kwargs
475 self.kwargs = kwargs
476
476
477 def __call__(self, wrapped):
477 def __call__(self, wrapped):
478 kwargs = self.kwargs.copy()
478 kwargs = self.kwargs.copy()
479 kwargs['method'] = self.method or wrapped.__name__
479 kwargs['method'] = self.method or wrapped.__name__
480 depth = kwargs.pop('_depth', 0)
480 depth = kwargs.pop('_depth', 0)
481
481
482 def callback(context, name, ob):
482 def callback(context, name, ob):
483 config = context.config.with_package(info.module)
483 config = context.config.with_package(info.module)
484 config.add_jsonrpc_method(view=ob, **kwargs)
484 config.add_jsonrpc_method(view=ob, **kwargs)
485
485
486 info = venusian.attach(wrapped, callback, category='pyramid',
486 info = venusian.attach(wrapped, callback, category='pyramid',
487 depth=depth + 1)
487 depth=depth + 1)
488 if info.scope == 'class':
488 if info.scope == 'class':
489 # ensure that attr is set if decorating a class method
489 # ensure that attr is set if decorating a class method
490 kwargs.setdefault('attr', wrapped.__name__)
490 kwargs.setdefault('attr', wrapped.__name__)
491
491
492 kwargs['_info'] = info.codeinfo # fbo action_method
492 kwargs['_info'] = info.codeinfo # fbo action_method
493 return wrapped
493 return wrapped
494
494
495
495
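# Usage sketch; the function below is hypothetical. The decorator registers
# the function under its own name unless `method=` is passed explicitly:
#
#   @jsonrpc_method()
#   def get_something(request, apiuser, repoid):
#       return {'repoid': repoid}
#
# `request` and `apiuser` are mandatory parameters enforced by request_view.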
496 class jsonrpc_deprecated_method(object):
496 class jsonrpc_deprecated_method(object):
497 """
497 """
498 Marks a method as deprecated: adds a log.warning and injects a special key
498 Marks a method as deprecated: adds a log.warning and injects a special key
499 into the request to mark the call as deprecated.
499 into the request to mark the call as deprecated.
500 Also injects a special docstring that extract_docs will catch to mark
500 Also injects a special docstring that extract_docs will catch to mark
501 the method as deprecated in the generated API documentation.
501 the method as deprecated in the generated API documentation.
502
502
503 :param use_method: specify which method should be used instead of
503 :param use_method: specify which method should be used instead of
504 the decorated one
504 the decorated one
505
505
506 Use like::
506 Use like::
507
507
508 @jsonrpc_method()
508 @jsonrpc_method()
509 @jsonrpc_deprecated_method(use_method='new_func', deprecated_at_version='3.0.0')
509 @jsonrpc_deprecated_method(use_method='new_func', deprecated_at_version='3.0.0')
510 def old_func(request, apiuser, arg1, arg2):
510 def old_func(request, apiuser, arg1, arg2):
511 ...
511 ...
512 """
512 """
513
513
514 def __init__(self, use_method, deprecated_at_version):
514 def __init__(self, use_method, deprecated_at_version):
515 self.use_method = use_method
515 self.use_method = use_method
516 self.deprecated_at_version = deprecated_at_version
516 self.deprecated_at_version = deprecated_at_version
517 self.deprecated_msg = ''
517 self.deprecated_msg = ''
518
518
519 def __call__(self, func):
519 def __call__(self, func):
520 self.deprecated_msg = 'Please use method `{method}` instead.'.format(
520 self.deprecated_msg = 'Please use method `{method}` instead.'.format(
521 method=self.use_method)
521 method=self.use_method)
522
522
523 docstring = """\n
523 docstring = """\n
524 .. deprecated:: {version}
524 .. deprecated:: {version}
525
525
526 {deprecation_message}
526 {deprecation_message}
527
527
528 {original_docstring}
528 {original_docstring}
529 """
529 """
530 func.__doc__ = docstring.format(
530 func.__doc__ = docstring.format(
531 version=self.deprecated_at_version,
531 version=self.deprecated_at_version,
532 deprecation_message=self.deprecated_msg,
532 deprecation_message=self.deprecated_msg,
533 original_docstring=func.__doc__)
533 original_docstring=func.__doc__)
534 return decorator.decorator(self.__wrapper, func)
534 return decorator.decorator(self.__wrapper, func)
535
535
536 def __wrapper(self, func, *fargs, **fkwargs):
536 def __wrapper(self, func, *fargs, **fkwargs):
537 log.warning('DEPRECATED API CALL on function %s, please '
537 log.warning('DEPRECATED API CALL on function %s, please '
538 'use `%s` instead', func, self.use_method)
538 'use `%s` instead', func, self.use_method)
539 # alter function docstring to mark as deprecated, this is picked up
539 # alter function docstring to mark as deprecated, this is picked up
540 # via fabric file that generates API DOC.
540 # via fabric file that generates API DOC.
541 result = func(*fargs, **fkwargs)
541 result = func(*fargs, **fkwargs)
542
542
543 request = fargs[0]
543 request = fargs[0]
544 request.rpc_deprecation = 'DEPRECATED METHOD ' + self.deprecated_msg
544 request.rpc_deprecation = 'DEPRECATED METHOD ' + self.deprecated_msg
545 return result
545 return result
546
546
547
547
548 def add_api_methods(config):
548 def add_api_methods(config):
549 from rhodecode.api.views import (
549 from rhodecode.api.views import (
550 deprecated_api, gist_api, pull_request_api, repo_api, repo_group_api,
550 deprecated_api, gist_api, pull_request_api, repo_api, repo_group_api,
551 server_api, search_api, testing_api, user_api, user_group_api)
551 server_api, search_api, testing_api, user_api, user_group_api)
552
552
553 config.scan('rhodecode.api.views')
553 config.scan('rhodecode.api.views')
554
554
555
555
556 def includeme(config):
556 def includeme(config):
557 plugin_module = 'rhodecode.api'
557 plugin_module = 'rhodecode.api'
558 plugin_settings = get_plugin_settings(
558 plugin_settings = get_plugin_settings(
559 plugin_module, config.registry.settings)
559 plugin_module, config.registry.settings)
560
560
561 if not hasattr(config.registry, 'jsonrpc_methods'):
561 if not hasattr(config.registry, 'jsonrpc_methods'):
562 config.registry.jsonrpc_methods = OrderedDict()
562 config.registry.jsonrpc_methods = OrderedDict()
563
563
564 # match filter by given method only
564 # match filter by given method only
565 config.add_view_predicate('jsonrpc_method', MethodPredicate)
565 config.add_view_predicate('jsonrpc_method', MethodPredicate)
566 config.add_view_predicate('jsonrpc_method_not_found', NotFoundPredicate)
566 config.add_view_predicate('jsonrpc_method_not_found', NotFoundPredicate)
567
567
568 config.add_renderer(DEFAULT_RENDERER, ExtJsonRenderer())
568 config.add_renderer(DEFAULT_RENDERER, ExtJsonRenderer())
569 config.add_directive('add_jsonrpc_method', add_jsonrpc_method)
569 config.add_directive('add_jsonrpc_method', add_jsonrpc_method)
570
570
571 config.add_route_predicate(
571 config.add_route_predicate(
572 'jsonrpc_call', RoutePredicate)
572 'jsonrpc_call', RoutePredicate)
573
573
574 config.add_route(
574 config.add_route(
575 'apiv2', plugin_settings.get('url', DEFAULT_URL), jsonrpc_call=True)
575 'apiv2', plugin_settings.get('url', DEFAULT_URL), jsonrpc_call=True)
576
576
577 # register some exception handling view
577 # register some exception handling view
578 config.add_view(exception_view, context=JSONRPCBaseError)
578 config.add_view(exception_view, context=JSONRPCBaseError)
579 config.add_notfound_view(exception_view, jsonrpc_method_not_found=True)
579 config.add_notfound_view(exception_view, jsonrpc_method_not_found=True)
580
580
581 add_api_methods(config)
581 add_api_methods(config)
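# Note: the API is mounted on the 'apiv2' route at DEFAULT_URL ('/_admin/api')
# unless the 'rhodecode.api' plugin settings provide a 'url' override; the
# exact .ini key depends on how get_plugin_settings() maps plugin settings and
# is not shown here.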
@@ -1,423 +1,423 b''
1 # Copyright (C) 2011-2023 RhodeCode GmbH
1 # Copyright (C) 2011-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import logging
19 import logging
20 import itertools
20 import itertools
21 import base64
21 import base64
22
22
23 from rhodecode.api import (
23 from rhodecode.api import (
24 jsonrpc_method, JSONRPCError, JSONRPCForbidden, find_methods)
24 jsonrpc_method, JSONRPCError, JSONRPCForbidden, find_methods)
25
25
26 from rhodecode.api.utils import (
26 from rhodecode.api.utils import (
27 Optional, OAttr, has_superadmin_permission, get_user_or_error)
27 Optional, OAttr, has_superadmin_permission, get_user_or_error)
28 from rhodecode.lib.utils import repo2db_mapper, get_rhodecode_repo_store_path
28 from rhodecode.lib.utils import repo2db_mapper, get_rhodecode_repo_store_path
29 from rhodecode.lib import system_info
29 from rhodecode.lib import system_info
30 from rhodecode.lib import user_sessions
30 from rhodecode.lib import user_sessions
31 from rhodecode.lib import exc_tracking
31 from rhodecode.lib import exc_tracking
32 from rhodecode.lib.ext_json import json
32 from rhodecode.lib.ext_json import json
33 from rhodecode.lib.utils2 import safe_int
33 from rhodecode.lib.utils2 import safe_int
34 from rhodecode.model.db import UserIpMap
34 from rhodecode.model.db import UserIpMap
35 from rhodecode.model.scm import ScmModel
35 from rhodecode.model.scm import ScmModel
36 from rhodecode.apps.file_store import utils
36 from rhodecode.apps.file_store import utils as store_utils
37 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, \
37 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, \
38 FileOverSizeException
38 FileOverSizeException
39
39
40 log = logging.getLogger(__name__)
40 log = logging.getLogger(__name__)
41
41
42
42
43 @jsonrpc_method()
43 @jsonrpc_method()
44 def get_server_info(request, apiuser):
44 def get_server_info(request, apiuser):
45 """
45 """
46 Returns the |RCE| server information.
46 Returns the |RCE| server information.
47
47
48 This includes the running version of |RCE| and all installed
48 This includes the running version of |RCE| and all installed
49 packages. This command takes the following options:
49 packages. This command takes the following options:
50
50
51 :param apiuser: This is filled automatically from the |authtoken|.
51 :param apiuser: This is filled automatically from the |authtoken|.
52 :type apiuser: AuthUser
52 :type apiuser: AuthUser
53
53
54 Example output:
54 Example output:
55
55
56 .. code-block:: bash
56 .. code-block:: bash
57
57
58 id : <id_given_in_input>
58 id : <id_given_in_input>
59 result : {
59 result : {
60 'modules': [<module name>,...]
60 'modules': [<module name>,...]
61 'py_version': <python version>,
61 'py_version': <python version>,
62 'platform': <platform type>,
62 'platform': <platform type>,
63 'rhodecode_version': <rhodecode version>
63 'rhodecode_version': <rhodecode version>
64 }
64 }
65 error : null
65 error : null
66 """
66 """
67
67
68 if not has_superadmin_permission(apiuser):
68 if not has_superadmin_permission(apiuser):
69 raise JSONRPCForbidden()
69 raise JSONRPCForbidden()
70
70
71 server_info = ScmModel().get_server_info(request.environ)
71 server_info = ScmModel().get_server_info(request.environ)
72 # rhodecode-index requires those
72 # rhodecode-index requires those
73
73
74 server_info['index_storage'] = server_info['search']['value']['location']
74 server_info['index_storage'] = server_info['search']['value']['location']
75 server_info['storage'] = server_info['storage']['value']['path']
75 server_info['storage'] = server_info['storage']['value']['path']
76
76
77 return server_info
77 return server_info
78
78
79
79
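# Invocation sketch for this method through the JSON-RPC endpoint; the host
# and token are placeholders:
#
#   curl -X POST https://rhodecode.example.com/_admin/api \
#        -H 'Content-Type: application/json' \
#        -d '{"id": 1, "auth_token": "<token>", "method": "get_server_info", "args": {}}'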
80 @jsonrpc_method()
80 @jsonrpc_method()
81 def get_repo_store(request, apiuser):
81 def get_repo_store(request, apiuser):
82 """
82 """
83 Returns the |RCE| repository storage information.
83 Returns the |RCE| repository storage information.
84
84
85 :param apiuser: This is filled automatically from the |authtoken|.
85 :param apiuser: This is filled automatically from the |authtoken|.
86 :type apiuser: AuthUser
86 :type apiuser: AuthUser
87
87
88 Example output:
88 Example output:
89
89
90 .. code-block:: bash
90 .. code-block:: bash
91
91
92 id : <id_given_in_input>
92 id : <id_given_in_input>
93 result : {
93 result : {
94 'path': <path_to_repository_store>
94 'path': <path_to_repository_store>
98 }
98 }
99 error : null
99 error : null
100 """
100 """
101
101
102 if not has_superadmin_permission(apiuser):
102 if not has_superadmin_permission(apiuser):
103 raise JSONRPCForbidden()
103 raise JSONRPCForbidden()
104
104
105 path = get_rhodecode_repo_store_path()
105 path = get_rhodecode_repo_store_path()
106 return {"path": path}
106 return {"path": path}
107
107
108
108
109 @jsonrpc_method()
109 @jsonrpc_method()
110 def get_ip(request, apiuser, userid=Optional(OAttr('apiuser'))):
110 def get_ip(request, apiuser, userid=Optional(OAttr('apiuser'))):
111 """
111 """
112 Displays the IP Address as seen from the |RCE| server.
112 Displays the IP Address as seen from the |RCE| server.
113
113
114 * This command displays the IP Address, as well as all the defined IP
114 * This command displays the IP Address, as well as all the defined IP
115 addresses for the specified user. If the ``userid`` is not set, the
115 addresses for the specified user. If the ``userid`` is not set, the
116 data returned is for the user calling the method.
116 data returned is for the user calling the method.
117
117
118 This command can only be run using an |authtoken| with admin rights to
118 This command can only be run using an |authtoken| with admin rights to
119 the specified repository.
119 the specified repository.
120
120
121 This command takes the following options:
121 This command takes the following options:
122
122
123 :param apiuser: This is filled automatically from |authtoken|.
123 :param apiuser: This is filled automatically from |authtoken|.
124 :type apiuser: AuthUser
124 :type apiuser: AuthUser
125 :param userid: Sets the userid for which associated IP Address data
125 :param userid: Sets the userid for which associated IP Address data
126 is returned.
126 is returned.
127 :type userid: Optional(str or int)
127 :type userid: Optional(str or int)
128
128
129 Example output:
129 Example output:
130
130
131 .. code-block:: bash
131 .. code-block:: bash
132
132
133 id : <id_given_in_input>
133 id : <id_given_in_input>
134 result : {
134 result : {
135 "server_ip_addr": "<ip_from_clien>",
135 "server_ip_addr": "<ip_from_clien>",
136 "user_ips": [
136 "user_ips": [
137 {
137 {
138 "ip_addr": "<ip_with_mask>",
138 "ip_addr": "<ip_with_mask>",
139 "ip_range": ["<start_ip>", "<end_ip>"],
139 "ip_range": ["<start_ip>", "<end_ip>"],
140 },
140 },
141 ...
141 ...
142 ]
142 ]
143 }
143 }
144
144
145 """
145 """
146 if not has_superadmin_permission(apiuser):
146 if not has_superadmin_permission(apiuser):
147 raise JSONRPCForbidden()
147 raise JSONRPCForbidden()
148
148
149 userid = Optional.extract(userid, evaluate_locals=locals())
149 userid = Optional.extract(userid, evaluate_locals=locals())
150 userid = getattr(userid, 'user_id', userid)
150 userid = getattr(userid, 'user_id', userid)
151
151
152 user = get_user_or_error(userid)
152 user = get_user_or_error(userid)
153 ips = UserIpMap.query().filter(UserIpMap.user == user).all()
153 ips = UserIpMap.query().filter(UserIpMap.user == user).all()
154 return {
154 return {
155 'server_ip_addr': request.rpc_ip_addr,
155 'server_ip_addr': request.rpc_ip_addr,
156 'user_ips': ips
156 'user_ips': ips
157 }
157 }
158
158
159
159
160 @jsonrpc_method()
160 @jsonrpc_method()
161 def rescan_repos(request, apiuser, remove_obsolete=Optional(False)):
161 def rescan_repos(request, apiuser, remove_obsolete=Optional(False)):
162 """
162 """
163 Triggers a rescan of the specified repositories.
163 Triggers a rescan of the specified repositories.
164
164
165 * If the ``remove_obsolete`` option is set, it also deletes repositories
165 * If the ``remove_obsolete`` option is set, it also deletes repositories
166 that are found in the database but not on the file system, so-called
166 that are found in the database but not on the file system, so-called
167 "clean zombies".
167 "clean zombies".
168
168
169 This command can only be run using an |authtoken| with admin rights to
169 This command can only be run using an |authtoken| with admin rights to
170 the specified repository.
170 the specified repository.
171
171
172 This command takes the following options:
172 This command takes the following options:
173
173
174 :param apiuser: This is filled automatically from the |authtoken|.
174 :param apiuser: This is filled automatically from the |authtoken|.
175 :type apiuser: AuthUser
175 :type apiuser: AuthUser
176 :param remove_obsolete: Deletes repositories from the database that
176 :param remove_obsolete: Deletes repositories from the database that
177 are not found on the filesystem.
177 are not found on the filesystem.
178 :type remove_obsolete: Optional(``True`` | ``False``)
178 :type remove_obsolete: Optional(``True`` | ``False``)
179
179
180 Example output:
180 Example output:
181
181
182 .. code-block:: bash
182 .. code-block:: bash
183
183
184 id : <id_given_in_input>
184 id : <id_given_in_input>
185 result : {
185 result : {
186 'added': [<added repository name>,...]
186 'added': [<added repository name>,...]
187 'removed': [<removed repository name>,...]
187 'removed': [<removed repository name>,...]
188 }
188 }
189 error : null
189 error : null
190
190
191 Example error output:
191 Example error output:
192
192
193 .. code-block:: bash
193 .. code-block:: bash
194
194
195 id : <id_given_in_input>
195 id : <id_given_in_input>
196 result : null
196 result : null
197 error : {
197 error : {
198 'Error occurred during rescan repositories action'
198 'Error occurred during rescan repositories action'
199 }
199 }
200
200
201 """
201 """
202 if not has_superadmin_permission(apiuser):
202 if not has_superadmin_permission(apiuser):
203 raise JSONRPCForbidden()
203 raise JSONRPCForbidden()
204
204
205 try:
205 try:
206 rm_obsolete = Optional.extract(remove_obsolete)
206 rm_obsolete = Optional.extract(remove_obsolete)
207 added, removed = repo2db_mapper(ScmModel().repo_scan(),
207 added, removed = repo2db_mapper(ScmModel().repo_scan(),
208 remove_obsolete=rm_obsolete, force_hooks_rebuild=True)
208 remove_obsolete=rm_obsolete, force_hooks_rebuild=True)
209 return {'added': added, 'removed': removed}
209 return {'added': added, 'removed': removed}
210 except Exception:
210 except Exception:
211 log.exception('Failed to run repo rescan')
211 log.exception('Failed to run repo rescan')
212 raise JSONRPCError(
212 raise JSONRPCError(
213 'Error occurred during rescan repositories action'
213 'Error occurred during rescan repositories action'
214 )
214 )
215
215
216
216
217 @jsonrpc_method()
217 @jsonrpc_method()
218 def cleanup_sessions(request, apiuser, older_then=Optional(60)):
218 def cleanup_sessions(request, apiuser, older_then=Optional(60)):
219 """
219 """
220 Triggers a session cleanup action.
220 Triggers a session cleanup action.
221
221
222 If the ``older_then`` option is set, only sessions that haven't been
222 If the ``older_then`` option is set, only sessions that haven't been
223 accessed in the given number of days will be removed.
223 accessed in the given number of days will be removed.
224
224
225 This command can only be run using an |authtoken| with admin rights to
225 This command can only be run using an |authtoken| with admin rights to
226 the specified repository.
226 the specified repository.
227
227
228 This command takes the following options:
228 This command takes the following options:
229
229
230 :param apiuser: This is filled automatically from the |authtoken|.
230 :param apiuser: This is filled automatically from the |authtoken|.
231 :type apiuser: AuthUser
231 :type apiuser: AuthUser
232 :param older_then: Deletes sessions that haven't been accessed
232 :param older_then: Deletes sessions that haven't been accessed
233 in the given number of days.
233 in the given number of days.
234 :type older_then: Optional(int)
234 :type older_then: Optional(int)
235
235
236 Example output:
236 Example output:
237
237
238 .. code-block:: bash
238 .. code-block:: bash
239
239
240 id : <id_given_in_input>
240 id : <id_given_in_input>
241 result: {
241 result: {
242 "backend": "<type of backend>",
242 "backend": "<type of backend>",
243 "sessions_removed": <number_of_removed_sessions>
243 "sessions_removed": <number_of_removed_sessions>
244 }
244 }
245 error : null
245 error : null
246
246
247 Example error output:
247 Example error output:
248
248
249 .. code-block:: bash
249 .. code-block:: bash
250
250
251 id : <id_given_in_input>
251 id : <id_given_in_input>
252 result : null
252 result : null
253 error : {
253 error : {
254 'Error occurred during session cleanup'
254 'Error occurred during session cleanup'
255 }
255 }
256
256
257 """
257 """
258 if not has_superadmin_permission(apiuser):
258 if not has_superadmin_permission(apiuser):
259 raise JSONRPCForbidden()
259 raise JSONRPCForbidden()
260
260
261 older_then = safe_int(Optional.extract(older_then)) or 60
261 older_then = safe_int(Optional.extract(older_then)) or 60
262 older_than_seconds = 60 * 60 * 24 * older_then
262 older_than_seconds = 60 * 60 * 24 * older_then
263
263
264 config = system_info.rhodecode_config().get_value()['value']['config']
264 config = system_info.rhodecode_config().get_value()['value']['config']
265 session_model = user_sessions.get_session_handler(
265 session_model = user_sessions.get_session_handler(
266 config.get('beaker.session.type', 'memory'))(config)
266 config.get('beaker.session.type', 'memory'))(config)
267
267
268 backend = session_model.SESSION_TYPE
268 backend = session_model.SESSION_TYPE
269 try:
269 try:
270 cleaned = session_model.clean_sessions(
270 cleaned = session_model.clean_sessions(
271 older_than_seconds=older_than_seconds)
271 older_than_seconds=older_than_seconds)
272 return {'sessions_removed': cleaned, 'backend': backend}
272 return {'sessions_removed': cleaned, 'backend': backend}
273 except user_sessions.CleanupCommand as msg:
273 except user_sessions.CleanupCommand as msg:
274 return {'cleanup_command': str(msg), 'backend': backend}
274 return {'cleanup_command': str(msg), 'backend': backend}
275 except Exception as e:
275 except Exception as e:
276 log.exception('Failed session cleanup')
276 log.exception('Failed session cleanup')
277 raise JSONRPCError(
277 raise JSONRPCError(
278 'Error occurred during session cleanup'
278 'Error occurred during session cleanup'
279 )
279 )
280
280
281
281
282 @jsonrpc_method()
282 @jsonrpc_method()
283 def get_method(request, apiuser, pattern=Optional('*')):
283 def get_method(request, apiuser, pattern=Optional('*')):
284 """
284 """
285 Returns a list of all available API methods. By default the match pattern
285 Returns a list of all available API methods. By default the match pattern
286 is "*", but any other pattern can be specified, e.g. *comment* will return
286 is "*", but any other pattern can be specified, e.g. *comment* will return
287 all methods with "comment" in their name. If just a single method is matched,
287 all methods with "comment" in their name. If just a single method is matched,
288 the returned data will also include the method specification.
288 the returned data will also include the method specification.
289
289
290 This command can only be run using an |authtoken| with admin rights to
290 This command can only be run using an |authtoken| with admin rights to
291 the specified repository.
291 the specified repository.
292
292
293 This command takes the following options:
293 This command takes the following options:
294
294
295 :param apiuser: This is filled automatically from the |authtoken|.
295 :param apiuser: This is filled automatically from the |authtoken|.
296 :type apiuser: AuthUser
296 :type apiuser: AuthUser
297 :param pattern: pattern to match method names against
297 :param pattern: pattern to match method names against
298 :type pattern: Optional("*")
298 :type pattern: Optional("*")
299
299
300 Example output:
300 Example output:
301
301
302 .. code-block:: bash
302 .. code-block:: bash
303
303
304 id : <id_given_in_input>
304 id : <id_given_in_input>
305 "result": [
305 "result": [
306 "changeset_comment",
306 "changeset_comment",
307 "comment_pull_request",
307 "comment_pull_request",
308 "comment_commit"
308 "comment_commit"
309 ]
309 ]
310 error : null
310 error : null
311
311
312 .. code-block:: bash
312 .. code-block:: bash
313
313
314 id : <id_given_in_input>
314 id : <id_given_in_input>
315 "result": [
315 "result": [
316 "comment_commit",
316 "comment_commit",
317 {
317 {
318 "apiuser": "<RequiredType>",
318 "apiuser": "<RequiredType>",
319 "comment_type": "<Optional:u'note'>",
319 "comment_type": "<Optional:u'note'>",
320 "commit_id": "<RequiredType>",
320 "commit_id": "<RequiredType>",
321 "message": "<RequiredType>",
321 "message": "<RequiredType>",
322 "repoid": "<RequiredType>",
322 "repoid": "<RequiredType>",
323 "request": "<RequiredType>",
323 "request": "<RequiredType>",
324 "resolves_comment_id": "<Optional:None>",
324 "resolves_comment_id": "<Optional:None>",
325 "status": "<Optional:None>",
325 "status": "<Optional:None>",
326 "userid": "<Optional:<OptionalAttr:apiuser>>"
326 "userid": "<Optional:<OptionalAttr:apiuser>>"
327 }
327 }
328 ]
328 ]
329 error : null
329 error : null
330 """
330 """
331 from rhodecode.config.patches import inspect_getargspec
331 from rhodecode.config import patches
332 inspect = inspect_getargspec()
332 inspect = patches.inspect_getargspec()
333
333
334 if not has_superadmin_permission(apiuser):
334 if not has_superadmin_permission(apiuser):
335 raise JSONRPCForbidden()
335 raise JSONRPCForbidden()
336
336
337 pattern = Optional.extract(pattern)
337 pattern = Optional.extract(pattern)
338
338
339 matches = find_methods(request.registry.jsonrpc_methods, pattern)
339 matches = find_methods(request.registry.jsonrpc_methods, pattern)
340
340
341 args_desc = []
341 args_desc = []
342 matches_keys = list(matches.keys())
342 matches_keys = list(matches.keys())
343 if len(matches_keys) == 1:
343 if len(matches_keys) == 1:
344 func = matches[matches_keys[0]]
344 func = matches[matches_keys[0]]
345
345
346 argspec = inspect.getargspec(func)
346 argspec = inspect.getargspec(func)
347 arglist = argspec[0]
347 arglist = argspec[0]
348 defaults = list(map(repr, argspec[3] or []))
348 defaults = list(map(repr, argspec[3] or []))
349
349
350 default_empty = '<RequiredType>'
350 default_empty = '<RequiredType>'
351
351
352 # kw arguments required by this method
352 # kw arguments required by this method
353 func_kwargs = dict(itertools.zip_longest(
353 func_kwargs = dict(itertools.zip_longest(
354 reversed(arglist), reversed(defaults), fillvalue=default_empty))
354 reversed(arglist), reversed(defaults), fillvalue=default_empty))
355 args_desc.append(func_kwargs)
355 args_desc.append(func_kwargs)
356
356
357 return matches_keys + args_desc
357 return matches_keys + args_desc
358
358
359
359
360 @jsonrpc_method()
361 def store_exception(request, apiuser, exc_data_json, prefix=Optional('rhodecode')):
362     """
363     Stores sent exception inside the built-in exception tracker in |RCE| server.
364
365     This command can only be run using an |authtoken| with admin rights to
366     the specified repository.
367
368     This command takes the following options:
369
370     :param apiuser: This is filled automatically from the |authtoken|.
371     :type apiuser: AuthUser
372
373     :param exc_data_json: JSON data with exception e.g
374         {"exc_traceback": "Value `1` is not allowed", "exc_type_name": "ValueError"}
375     :type exc_data_json: JSON data
376
377     :param prefix: prefix for error type, e.g 'rhodecode', 'vcsserver', 'rhodecode-tools'
378     :type prefix: Optional("rhodecode")
379
380     Example output:
381
382     .. code-block:: bash
383
384       id : <id_given_in_input>
385       "result": {
386         "exc_id": 139718459226384,
387         "exc_url": "http://localhost:8080/_admin/settings/exceptions/139718459226384"
388       }
389       error : null
390     """
391     if not has_superadmin_permission(apiuser):
392         raise JSONRPCForbidden()
393
394     prefix = Optional.extract(prefix)
395     exc_id = exc_tracking.generate_id()
396
397     try:
398         exc_data = json.loads(exc_data_json)
399     except Exception:
400         log.error('Failed to parse JSON: %r', exc_data_json)
401         raise JSONRPCError('Failed to parse JSON data from exc_data_json field. '
402                            'Please make sure it contains a valid JSON.')
403
404     try:
405         exc_traceback = exc_data['exc_traceback']
406         exc_type_name = exc_data['exc_type_name']
407         exc_value = ''
408     except KeyError as err:
409         raise JSONRPCError(
410             f'Missing exc_traceback, or exc_type_name '
411             f'in exc_data_json field. Missing: {err}')
412
413     class ExcType:
414         __name__ = exc_type_name
415
416     exc_info = (ExcType(), exc_value, exc_traceback)
417
418     exc_tracking._store_exception(
419         exc_id=exc_id, exc_info=exc_info, prefix=prefix)
420
421     exc_url = request.route_url(
422         'admin_settings_exception_tracker_show', exception_id=exc_id)
423     return {'exc_id': exc_id, 'exc_url': exc_url}
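# A hedged sketch of calling store_exception remotely, assuming the usual |RCE|
# JSON-RPC envelope and a default /_admin/api endpoint (URL, token and values are
# placeholders; check your own instance's API docs before relying on them):

import json
import requests  # third-party; pip install requests

payload = {
    'id': 1,
    'auth_token': 'SECRET_ADMIN_TOKEN',   # must carry admin rights, per the docstring above
    'method': 'store_exception',
    'args': {
        'exc_data_json': json.dumps({
            'exc_traceback': 'Value `1` is not allowed',
            'exc_type_name': 'ValueError',
        }),
        'prefix': 'rhodecode',            # optional, defaults to 'rhodecode'
    },
}
response = requests.post('https://code.example.com/_admin/api', json=payload)
# Expected answer shape, per the docstring above:
# {"id": 1, "result": {"exc_id": ..., "exc_url": "..."}, "error": null}
print(response.json())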
@@ -1,1093 +1,1124 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19
20 from rhodecode.apps._base import ADMIN_PREFIX
21 from rhodecode.apps._base.navigation import includeme as nav_includeme
22 from rhodecode.apps.admin.views.main_views import AdminMainView
23
24
25 def admin_routes(config):
26     """
27     Admin prefixed routes
28     """
29     from rhodecode.apps.admin.views.audit_logs import AdminAuditLogsView
30     from rhodecode.apps.admin.views.artifacts import AdminArtifactsView
31     from rhodecode.apps.admin.views.automation import AdminAutomationView
32     from rhodecode.apps.admin.views.scheduler import AdminSchedulerView
33     from rhodecode.apps.admin.views.defaults import AdminDefaultSettingsView
34     from rhodecode.apps.admin.views.exception_tracker import ExceptionsTrackerView
35     from rhodecode.apps.admin.views.open_source_licenses import OpenSourceLicensesAdminSettingsView
36     from rhodecode.apps.admin.views.permissions import AdminPermissionsView
37     from rhodecode.apps.admin.views.process_management import AdminProcessManagementView
38     from rhodecode.apps.admin.views.repo_groups import AdminRepoGroupsView
39     from rhodecode.apps.admin.views.repositories import AdminReposView
40     from rhodecode.apps.admin.views.sessions import AdminSessionSettingsView
41     from rhodecode.apps.admin.views.settings import AdminSettingsView
42     from rhodecode.apps.admin.views.svn_config import AdminSvnConfigView
43     from rhodecode.apps.admin.views.system_info import AdminSystemInfoSettingsView
44     from rhodecode.apps.admin.views.user_groups import AdminUserGroupsView
45     from rhodecode.apps.admin.views.users import AdminUsersView, UsersView
46     from rhodecode.apps.admin.views.security import AdminSecurityView
47
48 # Security EE feature
49
50 config.add_route(
51 'admin_security',
52 pattern='/security')
53 config.add_view(
54 AdminSecurityView,
55 attr='security',
56 route_name='admin_security', request_method='GET',
57 renderer='rhodecode:templates/admin/security/security.mako')
58
59 config.add_route(
60 name='admin_security_update',
61 pattern='/security/update')
62 config.add_view(
63 AdminSecurityView,
64 attr='security_update',
65 route_name='admin_security_update', request_method='POST',
66 renderer='rhodecode:templates/admin/security/security.mako')
67
68 config.add_route(
69 name='admin_security_modify_allowed_vcs_client_versions',
70 pattern=ADMIN_PREFIX + '/security/modify/allowed_vcs_client_versions')
71 config.add_view(
72 AdminSecurityView,
73 attr='vcs_whitelisted_client_versions_edit',
74 route_name='admin_security_modify_allowed_vcs_client_versions', request_method=('GET', 'POST'),
75 renderer='rhodecode:templates/admin/security/edit_allowed_vcs_client_versions.mako')
76
77
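# The new security block above follows the same two-step Pyramid idiom used
# throughout this module: add_route() names a URL pattern, add_view() binds one
# method (attr=) of a view class to that route for a given request_method.
# A standalone sketch with a made-up view class and the built-in 'json' renderer
# (the real module wires EE-only classes such as AdminSecurityView to .mako
# templates, and mounts admin_routes() under a prefix; '/_admin' is assumed here,
# mirroring the exception URLs shown earlier in this changeset):

from pyramid.config import Configurator
from wsgiref.simple_server import make_server

class DemoSecurityView:
    def __init__(self, request):
        self.request = request

    def security(self):                      # bound below via attr='security'
        return {'page': 'security'}

def demo_routes(config):
    config.add_route('admin_security', pattern='/security')
    config.add_view(
        DemoSecurityView,
        attr='security',
        route_name='admin_security', request_method='GET',
        renderer='json')

if __name__ == '__main__':
    with Configurator() as config:
        config.include(demo_routes, route_prefix='/_admin')  # GET /_admin/security
        app = config.make_wsgi_app()
    make_server('0.0.0.0', 6543, app).serve_forever()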
47 config.add_route(
78 config.add_route(
48 name='admin_audit_logs',
79 name='admin_audit_logs',
49 pattern='/audit_logs')
80 pattern='/audit_logs')
50 config.add_view(
81 config.add_view(
51 AdminAuditLogsView,
82 AdminAuditLogsView,
52 attr='admin_audit_logs',
83 attr='admin_audit_logs',
53 route_name='admin_audit_logs', request_method='GET',
84 route_name='admin_audit_logs', request_method='GET',
54 renderer='rhodecode:templates/admin/admin_audit_logs.mako')
85 renderer='rhodecode:templates/admin/admin_audit_logs.mako')
55
86
56 config.add_route(
87 config.add_route(
57 name='admin_audit_log_entry',
88 name='admin_audit_log_entry',
58 pattern='/audit_logs/{audit_log_id}')
89 pattern='/audit_logs/{audit_log_id}')
59 config.add_view(
90 config.add_view(
60 AdminAuditLogsView,
91 AdminAuditLogsView,
61 attr='admin_audit_log_entry',
92 attr='admin_audit_log_entry',
62 route_name='admin_audit_log_entry', request_method='GET',
93 route_name='admin_audit_log_entry', request_method='GET',
63 renderer='rhodecode:templates/admin/admin_audit_log_entry.mako')
94 renderer='rhodecode:templates/admin/admin_audit_log_entry.mako')
64
95
65 # Artifacts EE feature
96 # Artifacts EE feature
66 config.add_route(
97 config.add_route(
67 'admin_artifacts',
98 'admin_artifacts',
68 pattern=ADMIN_PREFIX + '/artifacts')
99 pattern=ADMIN_PREFIX + '/artifacts')
69 config.add_route(
100 config.add_route(
70 'admin_artifacts_show_all',
101 'admin_artifacts_show_all',
71 pattern=ADMIN_PREFIX + '/artifacts')
102 pattern=ADMIN_PREFIX + '/artifacts')
72 config.add_view(
103 config.add_view(
73 AdminArtifactsView,
104 AdminArtifactsView,
74 attr='artifacts',
105 attr='artifacts',
75 route_name='admin_artifacts', request_method='GET',
106 route_name='admin_artifacts', request_method='GET',
76 renderer='rhodecode:templates/admin/artifacts/artifacts.mako')
107 renderer='rhodecode:templates/admin/artifacts/artifacts.mako')
77 config.add_view(
108 config.add_view(
78 AdminArtifactsView,
109 AdminArtifactsView,
79 attr='artifacts',
110 attr='artifacts',
80 route_name='admin_artifacts_show_all', request_method='GET',
111 route_name='admin_artifacts_show_all', request_method='GET',
81 renderer='rhodecode:templates/admin/artifacts/artifacts.mako')
112 renderer='rhodecode:templates/admin/artifacts/artifacts.mako')
82
113
83 # EE views
114 # EE views
84 config.add_route(
115 config.add_route(
85 name='admin_artifacts_show_info',
116 name='admin_artifacts_show_info',
86 pattern=ADMIN_PREFIX + '/artifacts/{uid}')
117 pattern=ADMIN_PREFIX + '/artifacts/{uid}')
87 config.add_route(
118 config.add_route(
88 name='admin_artifacts_delete',
119 name='admin_artifacts_delete',
89 pattern=ADMIN_PREFIX + '/artifacts/{uid}/delete')
120 pattern=ADMIN_PREFIX + '/artifacts/{uid}/delete')
90 config.add_route(
121 config.add_route(
91 name='admin_artifacts_update',
122 name='admin_artifacts_update',
92 pattern=ADMIN_PREFIX + '/artifacts/{uid}/update')
123 pattern=ADMIN_PREFIX + '/artifacts/{uid}/update')
93
124
94 # Automation EE feature
125 # Automation EE feature
95 config.add_route(
126 config.add_route(
96 'admin_automation',
127 'admin_automation',
97 pattern=ADMIN_PREFIX + '/automation')
128 pattern=ADMIN_PREFIX + '/automation')
98 config.add_view(
129 config.add_view(
99 AdminAutomationView,
130 AdminAutomationView,
100 attr='automation',
131 attr='automation',
101 route_name='admin_automation', request_method='GET',
132 route_name='admin_automation', request_method='GET',
102 renderer='rhodecode:templates/admin/automation/automation.mako')
133 renderer='rhodecode:templates/admin/automation/automation.mako')
103
134
104 # Scheduler EE feature
135 # Scheduler EE feature
105 config.add_route(
136 config.add_route(
106 'admin_scheduler',
137 'admin_scheduler',
107 pattern=ADMIN_PREFIX + '/scheduler')
138 pattern=ADMIN_PREFIX + '/scheduler')
108 config.add_view(
139 config.add_view(
109 AdminSchedulerView,
140 AdminSchedulerView,
110 attr='scheduler',
141 attr='scheduler',
111 route_name='admin_scheduler', request_method='GET',
142 route_name='admin_scheduler', request_method='GET',
112 renderer='rhodecode:templates/admin/scheduler/scheduler.mako')
143 renderer='rhodecode:templates/admin/scheduler/scheduler.mako')
113
144
114 config.add_route(
145 config.add_route(
115 name='admin_settings_open_source',
146 name='admin_settings_open_source',
116 pattern='/settings/open_source')
147 pattern='/settings/open_source')
117 config.add_view(
148 config.add_view(
118 OpenSourceLicensesAdminSettingsView,
149 OpenSourceLicensesAdminSettingsView,
119 attr='open_source_licenses',
150 attr='open_source_licenses',
120 route_name='admin_settings_open_source', request_method='GET',
151 route_name='admin_settings_open_source', request_method='GET',
121 renderer='rhodecode:templates/admin/settings/settings.mako')
152 renderer='rhodecode:templates/admin/settings/settings.mako')
122
153
123 config.add_route(
154 config.add_route(
124 name='admin_settings_vcs_svn_generate_cfg',
155 name='admin_settings_vcs_svn_generate_cfg',
125 pattern='/settings/vcs/svn_generate_cfg')
156 pattern='/settings/vcs/svn_generate_cfg')
126 config.add_view(
157 config.add_view(
127 AdminSvnConfigView,
158 AdminSvnConfigView,
128 attr='vcs_svn_generate_config',
159 attr='vcs_svn_generate_config',
129 route_name='admin_settings_vcs_svn_generate_cfg',
160 route_name='admin_settings_vcs_svn_generate_cfg',
130 request_method='POST', renderer='json')
161 request_method='POST', renderer='json')
131
162
132 config.add_route(
163 config.add_route(
133 name='admin_settings_system',
164 name='admin_settings_system',
134 pattern='/settings/system')
165 pattern='/settings/system')
135 config.add_view(
166 config.add_view(
136 AdminSystemInfoSettingsView,
167 AdminSystemInfoSettingsView,
137 attr='settings_system_info',
168 attr='settings_system_info',
138 route_name='admin_settings_system', request_method='GET',
169 route_name='admin_settings_system', request_method='GET',
139 renderer='rhodecode:templates/admin/settings/settings.mako')
170 renderer='rhodecode:templates/admin/settings/settings.mako')
140
171
141 config.add_route(
172 config.add_route(
142 name='admin_settings_system_update',
173 name='admin_settings_system_update',
143 pattern='/settings/system/updates')
174 pattern='/settings/system/updates')
144 config.add_view(
175 config.add_view(
145 AdminSystemInfoSettingsView,
176 AdminSystemInfoSettingsView,
146 attr='settings_system_info_check_update',
177 attr='settings_system_info_check_update',
147 route_name='admin_settings_system_update', request_method='GET',
178 route_name='admin_settings_system_update', request_method='GET',
148 renderer='rhodecode:templates/admin/settings/settings_system_update.mako')
179 renderer='rhodecode:templates/admin/settings/settings_system_update.mako')
149
180
150 config.add_route(
181 config.add_route(
151 name='admin_settings_exception_tracker',
182 name='admin_settings_exception_tracker',
152 pattern='/settings/exceptions')
183 pattern='/settings/exceptions')
153 config.add_view(
184 config.add_view(
154 ExceptionsTrackerView,
185 ExceptionsTrackerView,
155 attr='browse_exceptions',
186 attr='browse_exceptions',
156 route_name='admin_settings_exception_tracker', request_method='GET',
187 route_name='admin_settings_exception_tracker', request_method='GET',
157 renderer='rhodecode:templates/admin/settings/settings.mako')
188 renderer='rhodecode:templates/admin/settings/settings.mako')
158
189
159 config.add_route(
190 config.add_route(
160 name='admin_settings_exception_tracker_delete_all',
191 name='admin_settings_exception_tracker_delete_all',
161 pattern='/settings/exceptions_delete_all')
192 pattern='/settings/exceptions_delete_all')
162 config.add_view(
193 config.add_view(
163 ExceptionsTrackerView,
194 ExceptionsTrackerView,
164 attr='exception_delete_all',
195 attr='exception_delete_all',
165 route_name='admin_settings_exception_tracker_delete_all', request_method='POST',
196 route_name='admin_settings_exception_tracker_delete_all', request_method='POST',
166 renderer='rhodecode:templates/admin/settings/settings.mako')
197 renderer='rhodecode:templates/admin/settings/settings.mako')
167
198
168 config.add_route(
199 config.add_route(
169 name='admin_settings_exception_tracker_show',
200 name='admin_settings_exception_tracker_show',
170 pattern='/settings/exceptions/{exception_id}')
201 pattern='/settings/exceptions/{exception_id}')
171 config.add_view(
202 config.add_view(
172 ExceptionsTrackerView,
203 ExceptionsTrackerView,
173 attr='exception_show',
204 attr='exception_show',
174 route_name='admin_settings_exception_tracker_show', request_method='GET',
205 route_name='admin_settings_exception_tracker_show', request_method='GET',
175 renderer='rhodecode:templates/admin/settings/settings.mako')
206 renderer='rhodecode:templates/admin/settings/settings.mako')
176
207
177 config.add_route(
208 config.add_route(
178 name='admin_settings_exception_tracker_delete',
209 name='admin_settings_exception_tracker_delete',
179 pattern='/settings/exceptions/{exception_id}/delete')
210 pattern='/settings/exceptions/{exception_id}/delete')
180 config.add_view(
211 config.add_view(
181 ExceptionsTrackerView,
212 ExceptionsTrackerView,
182 attr='exception_delete',
213 attr='exception_delete',
183 route_name='admin_settings_exception_tracker_delete', request_method='POST',
214 route_name='admin_settings_exception_tracker_delete', request_method='POST',
184 renderer='rhodecode:templates/admin/settings/settings.mako')
215 renderer='rhodecode:templates/admin/settings/settings.mako')
185
216
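# These exception-tracker patterns are what store_exception's
# request.route_url('admin_settings_exception_tracker_show', exception_id=...) call,
# shown earlier in this changeset, expands into links such as
# http://localhost:8080/_admin/settings/exceptions/139718459226384.
# A tiny illustration of the substitution only (ignoring the scheme/host that
# route_url adds, and assuming the '/_admin' prefix):

def expand(pattern, prefix='/_admin', **parts):
    path = pattern
    for key, value in parts.items():
        path = path.replace('{' + key + '}', str(value))
    return prefix + path

# expand('/settings/exceptions/{exception_id}', exception_id=139718459226384)
# -> '/_admin/settings/exceptions/139718459226384'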
186 config.add_route(
217 config.add_route(
187 name='admin_settings_sessions',
218 name='admin_settings_sessions',
188 pattern='/settings/sessions')
219 pattern='/settings/sessions')
189 config.add_view(
220 config.add_view(
190 AdminSessionSettingsView,
221 AdminSessionSettingsView,
191 attr='settings_sessions',
222 attr='settings_sessions',
192 route_name='admin_settings_sessions', request_method='GET',
223 route_name='admin_settings_sessions', request_method='GET',
193 renderer='rhodecode:templates/admin/settings/settings.mako')
224 renderer='rhodecode:templates/admin/settings/settings.mako')
194
225
195 config.add_route(
226 config.add_route(
196 name='admin_settings_sessions_cleanup',
227 name='admin_settings_sessions_cleanup',
197 pattern='/settings/sessions/cleanup')
228 pattern='/settings/sessions/cleanup')
198 config.add_view(
229 config.add_view(
199 AdminSessionSettingsView,
230 AdminSessionSettingsView,
200 attr='settings_sessions_cleanup',
231 attr='settings_sessions_cleanup',
201 route_name='admin_settings_sessions_cleanup', request_method='POST')
232 route_name='admin_settings_sessions_cleanup', request_method='POST')
202
233
203 config.add_route(
234 config.add_route(
204 name='admin_settings_process_management',
235 name='admin_settings_process_management',
205 pattern='/settings/process_management')
236 pattern='/settings/process_management')
206 config.add_view(
237 config.add_view(
207 AdminProcessManagementView,
238 AdminProcessManagementView,
208 attr='process_management',
239 attr='process_management',
209 route_name='admin_settings_process_management', request_method='GET',
240 route_name='admin_settings_process_management', request_method='GET',
210 renderer='rhodecode:templates/admin/settings/settings.mako')
241 renderer='rhodecode:templates/admin/settings/settings.mako')
211
242
212 config.add_route(
243 config.add_route(
213 name='admin_settings_process_management_data',
244 name='admin_settings_process_management_data',
214 pattern='/settings/process_management/data')
245 pattern='/settings/process_management/data')
215 config.add_view(
246 config.add_view(
216 AdminProcessManagementView,
247 AdminProcessManagementView,
217 attr='process_management_data',
248 attr='process_management_data',
218 route_name='admin_settings_process_management_data', request_method='GET',
249 route_name='admin_settings_process_management_data', request_method='GET',
219 renderer='rhodecode:templates/admin/settings/settings_process_management_data.mako')
250 renderer='rhodecode:templates/admin/settings/settings_process_management_data.mako')
220
251
221 config.add_route(
252 config.add_route(
222 name='admin_settings_process_management_signal',
253 name='admin_settings_process_management_signal',
223 pattern='/settings/process_management/signal')
254 pattern='/settings/process_management/signal')
224 config.add_view(
255 config.add_view(
225 AdminProcessManagementView,
256 AdminProcessManagementView,
226 attr='process_management_signal',
257 attr='process_management_signal',
227 route_name='admin_settings_process_management_signal',
258 route_name='admin_settings_process_management_signal',
228 request_method='POST', renderer='json_ext')
259 request_method='POST', renderer='json_ext')
229
260
230 config.add_route(
261 config.add_route(
231 name='admin_settings_process_management_master_signal',
262 name='admin_settings_process_management_master_signal',
232 pattern='/settings/process_management/master_signal')
263 pattern='/settings/process_management/master_signal')
233 config.add_view(
264 config.add_view(
234 AdminProcessManagementView,
265 AdminProcessManagementView,
235 attr='process_management_master_signal',
266 attr='process_management_master_signal',
236 route_name='admin_settings_process_management_master_signal',
267 route_name='admin_settings_process_management_master_signal',
237 request_method='POST', renderer='json_ext')
268 request_method='POST', renderer='json_ext')
238
269
239 # default settings
270 # default settings
240 config.add_route(
271 config.add_route(
241 name='admin_defaults_repositories',
272 name='admin_defaults_repositories',
242 pattern='/defaults/repositories')
273 pattern='/defaults/repositories')
243 config.add_view(
274 config.add_view(
244 AdminDefaultSettingsView,
275 AdminDefaultSettingsView,
245 attr='defaults_repository_show',
276 attr='defaults_repository_show',
246 route_name='admin_defaults_repositories', request_method='GET',
277 route_name='admin_defaults_repositories', request_method='GET',
247 renderer='rhodecode:templates/admin/defaults/defaults.mako')
278 renderer='rhodecode:templates/admin/defaults/defaults.mako')
248
279
249 config.add_route(
280 config.add_route(
250 name='admin_defaults_repositories_update',
281 name='admin_defaults_repositories_update',
251 pattern='/defaults/repositories/update')
282 pattern='/defaults/repositories/update')
252 config.add_view(
283 config.add_view(
253 AdminDefaultSettingsView,
284 AdminDefaultSettingsView,
254 attr='defaults_repository_update',
285 attr='defaults_repository_update',
255 route_name='admin_defaults_repositories_update', request_method='POST',
286 route_name='admin_defaults_repositories_update', request_method='POST',
256 renderer='rhodecode:templates/admin/defaults/defaults.mako')
287 renderer='rhodecode:templates/admin/defaults/defaults.mako')
257
288
258 # admin settings
289 # admin settings
259
290
260 config.add_route(
291 config.add_route(
261 name='admin_settings',
292 name='admin_settings',
262 pattern='/settings')
293 pattern='/settings')
263 config.add_view(
294 config.add_view(
264 AdminSettingsView,
295 AdminSettingsView,
265 attr='settings_global',
296 attr='settings_global',
266 route_name='admin_settings', request_method='GET',
297 route_name='admin_settings', request_method='GET',
267 renderer='rhodecode:templates/admin/settings/settings.mako')
298 renderer='rhodecode:templates/admin/settings/settings.mako')
268
299
269 config.add_route(
300 config.add_route(
270 name='admin_settings_update',
301 name='admin_settings_update',
271 pattern='/settings/update')
302 pattern='/settings/update')
272 config.add_view(
303 config.add_view(
273 AdminSettingsView,
304 AdminSettingsView,
274 attr='settings_global_update',
305 attr='settings_global_update',
275 route_name='admin_settings_update', request_method='POST',
306 route_name='admin_settings_update', request_method='POST',
276 renderer='rhodecode:templates/admin/settings/settings.mako')
307 renderer='rhodecode:templates/admin/settings/settings.mako')
277
308
278 config.add_route(
309 config.add_route(
279 name='admin_settings_global',
310 name='admin_settings_global',
280 pattern='/settings/global')
311 pattern='/settings/global')
281 config.add_view(
312 config.add_view(
282 AdminSettingsView,
313 AdminSettingsView,
283 attr='settings_global',
314 attr='settings_global',
284 route_name='admin_settings_global', request_method='GET',
315 route_name='admin_settings_global', request_method='GET',
285 renderer='rhodecode:templates/admin/settings/settings.mako')
316 renderer='rhodecode:templates/admin/settings/settings.mako')
286
317
287 config.add_route(
318 config.add_route(
288 name='admin_settings_global_update',
319 name='admin_settings_global_update',
289 pattern='/settings/global/update')
320 pattern='/settings/global/update')
290 config.add_view(
321 config.add_view(
291 AdminSettingsView,
322 AdminSettingsView,
292 attr='settings_global_update',
323 attr='settings_global_update',
293 route_name='admin_settings_global_update', request_method='POST',
324 route_name='admin_settings_global_update', request_method='POST',
294 renderer='rhodecode:templates/admin/settings/settings.mako')
325 renderer='rhodecode:templates/admin/settings/settings.mako')
295
326
296 config.add_route(
327 config.add_route(
297 name='admin_settings_vcs',
328 name='admin_settings_vcs',
298 pattern='/settings/vcs')
329 pattern='/settings/vcs')
299 config.add_view(
330 config.add_view(
300 AdminSettingsView,
331 AdminSettingsView,
301 attr='settings_vcs',
332 attr='settings_vcs',
302 route_name='admin_settings_vcs', request_method='GET',
333 route_name='admin_settings_vcs', request_method='GET',
303 renderer='rhodecode:templates/admin/settings/settings.mako')
334 renderer='rhodecode:templates/admin/settings/settings.mako')
304
335
305 config.add_route(
336 config.add_route(
306 name='admin_settings_vcs_update',
337 name='admin_settings_vcs_update',
307 pattern='/settings/vcs/update')
338 pattern='/settings/vcs/update')
308 config.add_view(
339 config.add_view(
309 AdminSettingsView,
340 AdminSettingsView,
310 attr='settings_vcs_update',
341 attr='settings_vcs_update',
311 route_name='admin_settings_vcs_update', request_method='POST',
342 route_name='admin_settings_vcs_update', request_method='POST',
312 renderer='rhodecode:templates/admin/settings/settings.mako')
343 renderer='rhodecode:templates/admin/settings/settings.mako')
313
344
314 config.add_route(
345 config.add_route(
315 name='admin_settings_vcs_svn_pattern_delete',
346 name='admin_settings_vcs_svn_pattern_delete',
316 pattern='/settings/vcs/svn_pattern_delete')
347 pattern='/settings/vcs/svn_pattern_delete')
317 config.add_view(
348 config.add_view(
318 AdminSettingsView,
349 AdminSettingsView,
319 attr='settings_vcs_delete_svn_pattern',
350 attr='settings_vcs_delete_svn_pattern',
320 route_name='admin_settings_vcs_svn_pattern_delete', request_method='POST',
351 route_name='admin_settings_vcs_svn_pattern_delete', request_method='POST',
321 renderer='json_ext', xhr=True)
352 renderer='json_ext', xhr=True)
322
353
323 config.add_route(
354 config.add_route(
324 name='admin_settings_mapping',
355 name='admin_settings_mapping',
325 pattern='/settings/mapping')
356 pattern='/settings/mapping')
326 config.add_view(
357 config.add_view(
327 AdminSettingsView,
358 AdminSettingsView,
328 attr='settings_mapping',
359 attr='settings_mapping',
329 route_name='admin_settings_mapping', request_method='GET',
360 route_name='admin_settings_mapping', request_method='GET',
330 renderer='rhodecode:templates/admin/settings/settings.mako')
361 renderer='rhodecode:templates/admin/settings/settings.mako')
331
362
332 config.add_route(
363 config.add_route(
333 name='admin_settings_mapping_update',
364 name='admin_settings_mapping_update',
334 pattern='/settings/mapping/update')
365 pattern='/settings/mapping/update')
335 config.add_view(
366 config.add_view(
336 AdminSettingsView,
367 AdminSettingsView,
337 attr='settings_mapping_update',
368 attr='settings_mapping_update',
338 route_name='admin_settings_mapping_update', request_method='POST',
369 route_name='admin_settings_mapping_update', request_method='POST',
339 renderer='rhodecode:templates/admin/settings/settings.mako')
370 renderer='rhodecode:templates/admin/settings/settings.mako')
340
371
341 config.add_route(
372 config.add_route(
342 name='admin_settings_visual',
373 name='admin_settings_visual',
343 pattern='/settings/visual')
374 pattern='/settings/visual')
344 config.add_view(
375 config.add_view(
345 AdminSettingsView,
376 AdminSettingsView,
346 attr='settings_visual',
377 attr='settings_visual',
347 route_name='admin_settings_visual', request_method='GET',
378 route_name='admin_settings_visual', request_method='GET',
348 renderer='rhodecode:templates/admin/settings/settings.mako')
379 renderer='rhodecode:templates/admin/settings/settings.mako')
349
380
350 config.add_route(
381 config.add_route(
351 name='admin_settings_visual_update',
382 name='admin_settings_visual_update',
352 pattern='/settings/visual/update')
383 pattern='/settings/visual/update')
353 config.add_view(
384 config.add_view(
354 AdminSettingsView,
385 AdminSettingsView,
355 attr='settings_visual_update',
386 attr='settings_visual_update',
356 route_name='admin_settings_visual_update', request_method='POST',
387 route_name='admin_settings_visual_update', request_method='POST',
357 renderer='rhodecode:templates/admin/settings/settings.mako')
388 renderer='rhodecode:templates/admin/settings/settings.mako')
358
389
359 config.add_route(
390 config.add_route(
360 name='admin_settings_issuetracker',
391 name='admin_settings_issuetracker',
361 pattern='/settings/issue-tracker')
392 pattern='/settings/issue-tracker')
362 config.add_view(
393 config.add_view(
363 AdminSettingsView,
394 AdminSettingsView,
364 attr='settings_issuetracker',
395 attr='settings_issuetracker',
365 route_name='admin_settings_issuetracker', request_method='GET',
396 route_name='admin_settings_issuetracker', request_method='GET',
366 renderer='rhodecode:templates/admin/settings/settings.mako')
397 renderer='rhodecode:templates/admin/settings/settings.mako')
367
398
368 config.add_route(
399 config.add_route(
369 name='admin_settings_issuetracker_update',
400 name='admin_settings_issuetracker_update',
370 pattern='/settings/issue-tracker/update')
401 pattern='/settings/issue-tracker/update')
371 config.add_view(
402 config.add_view(
372 AdminSettingsView,
403 AdminSettingsView,
373 attr='settings_issuetracker_update',
404 attr='settings_issuetracker_update',
374 route_name='admin_settings_issuetracker_update', request_method='POST',
405 route_name='admin_settings_issuetracker_update', request_method='POST',
375 renderer='rhodecode:templates/admin/settings/settings.mako')
406 renderer='rhodecode:templates/admin/settings/settings.mako')
376
407
377 config.add_route(
408 config.add_route(
378 name='admin_settings_issuetracker_test',
409 name='admin_settings_issuetracker_test',
379 pattern='/settings/issue-tracker/test')
410 pattern='/settings/issue-tracker/test')
380 config.add_view(
411 config.add_view(
381 AdminSettingsView,
412 AdminSettingsView,
382 attr='settings_issuetracker_test',
413 attr='settings_issuetracker_test',
383 route_name='admin_settings_issuetracker_test', request_method='POST',
414 route_name='admin_settings_issuetracker_test', request_method='POST',
384 renderer='string', xhr=True)
415 renderer='string', xhr=True)
385
416
386 config.add_route(
417 config.add_route(
387 name='admin_settings_issuetracker_delete',
418 name='admin_settings_issuetracker_delete',
388 pattern='/settings/issue-tracker/delete')
419 pattern='/settings/issue-tracker/delete')
389 config.add_view(
420 config.add_view(
390 AdminSettingsView,
421 AdminSettingsView,
391 attr='settings_issuetracker_delete',
422 attr='settings_issuetracker_delete',
392 route_name='admin_settings_issuetracker_delete', request_method='POST',
423 route_name='admin_settings_issuetracker_delete', request_method='POST',
393 renderer='json_ext', xhr=True)
424 renderer='json_ext', xhr=True)
394
425
395 config.add_route(
426 config.add_route(
396 name='admin_settings_email',
427 name='admin_settings_email',
397 pattern='/settings/email')
428 pattern='/settings/email')
398 config.add_view(
429 config.add_view(
399 AdminSettingsView,
430 AdminSettingsView,
400 attr='settings_email',
431 attr='settings_email',
401 route_name='admin_settings_email', request_method='GET',
432 route_name='admin_settings_email', request_method='GET',
402 renderer='rhodecode:templates/admin/settings/settings.mako')
433 renderer='rhodecode:templates/admin/settings/settings.mako')
403
434
404 config.add_route(
435 config.add_route(
405 name='admin_settings_email_update',
436 name='admin_settings_email_update',
406 pattern='/settings/email/update')
437 pattern='/settings/email/update')
407 config.add_view(
438 config.add_view(
408 AdminSettingsView,
439 AdminSettingsView,
409 attr='settings_email_update',
440 attr='settings_email_update',
410 route_name='admin_settings_email_update', request_method='POST',
441 route_name='admin_settings_email_update', request_method='POST',
411 renderer='rhodecode:templates/admin/settings/settings.mako')
442 renderer='rhodecode:templates/admin/settings/settings.mako')
412
443
413 config.add_route(
444 config.add_route(
414 name='admin_settings_hooks',
445 name='admin_settings_hooks',
415 pattern='/settings/hooks')
446 pattern='/settings/hooks')
416 config.add_view(
447 config.add_view(
417 AdminSettingsView,
448 AdminSettingsView,
418 attr='settings_hooks',
449 attr='settings_hooks',
419 route_name='admin_settings_hooks', request_method='GET',
450 route_name='admin_settings_hooks', request_method='GET',
420 renderer='rhodecode:templates/admin/settings/settings.mako')
451 renderer='rhodecode:templates/admin/settings/settings.mako')
421
452
422 config.add_route(
453 config.add_route(
423 name='admin_settings_hooks_update',
454 name='admin_settings_hooks_update',
424 pattern='/settings/hooks/update')
455 pattern='/settings/hooks/update')
425 config.add_view(
456 config.add_view(
426 AdminSettingsView,
457 AdminSettingsView,
427 attr='settings_hooks_update',
458 attr='settings_hooks_update',
428 route_name='admin_settings_hooks_update', request_method='POST',
459 route_name='admin_settings_hooks_update', request_method='POST',
429 renderer='rhodecode:templates/admin/settings/settings.mako')
460 renderer='rhodecode:templates/admin/settings/settings.mako')
430
461
431 config.add_route(
462 config.add_route(
432 name='admin_settings_hooks_delete',
463 name='admin_settings_hooks_delete',
433 pattern='/settings/hooks/delete')
464 pattern='/settings/hooks/delete')
434 config.add_view(
465 config.add_view(
435 AdminSettingsView,
466 AdminSettingsView,
436 attr='settings_hooks_update',
467 attr='settings_hooks_update',
437 route_name='admin_settings_hooks_delete', request_method='POST',
468 route_name='admin_settings_hooks_delete', request_method='POST',
438 renderer='rhodecode:templates/admin/settings/settings.mako')
469 renderer='rhodecode:templates/admin/settings/settings.mako')
439
470
440 config.add_route(
471 config.add_route(
441 name='admin_settings_search',
472 name='admin_settings_search',
442 pattern='/settings/search')
473 pattern='/settings/search')
443 config.add_view(
474 config.add_view(
444 AdminSettingsView,
475 AdminSettingsView,
445 attr='settings_search',
476 attr='settings_search',
446 route_name='admin_settings_search', request_method='GET',
477 route_name='admin_settings_search', request_method='GET',
447 renderer='rhodecode:templates/admin/settings/settings.mako')
478 renderer='rhodecode:templates/admin/settings/settings.mako')
448
479
449 config.add_route(
480 config.add_route(
450 name='admin_settings_labs',
481 name='admin_settings_labs',
451 pattern='/settings/labs')
482 pattern='/settings/labs')
452 config.add_view(
483 config.add_view(
453 AdminSettingsView,
484 AdminSettingsView,
454 attr='settings_labs',
485 attr='settings_labs',
455 route_name='admin_settings_labs', request_method='GET',
486 route_name='admin_settings_labs', request_method='GET',
456 renderer='rhodecode:templates/admin/settings/settings.mako')
487 renderer='rhodecode:templates/admin/settings/settings.mako')
457
488
458 config.add_route(
489 config.add_route(
459 name='admin_settings_labs_update',
490 name='admin_settings_labs_update',
460 pattern='/settings/labs/update')
491 pattern='/settings/labs/update')
461 config.add_view(
492 config.add_view(
462 AdminSettingsView,
493 AdminSettingsView,
463 attr='settings_labs_update',
494 attr='settings_labs_update',
464 route_name='admin_settings_labs_update', request_method='POST',
495 route_name='admin_settings_labs_update', request_method='POST',
465 renderer='rhodecode:templates/admin/settings/settings.mako')
496 renderer='rhodecode:templates/admin/settings/settings.mako')
466
497
467 # global permissions
498 # global permissions
468
499
469 config.add_route(
500 config.add_route(
470 name='admin_permissions_application',
501 name='admin_permissions_application',
471 pattern='/permissions/application')
502 pattern='/permissions/application')
472 config.add_view(
503 config.add_view(
473 AdminPermissionsView,
504 AdminPermissionsView,
474 attr='permissions_application',
505 attr='permissions_application',
475 route_name='admin_permissions_application', request_method='GET',
506 route_name='admin_permissions_application', request_method='GET',
476 renderer='rhodecode:templates/admin/permissions/permissions.mako')
507 renderer='rhodecode:templates/admin/permissions/permissions.mako')
477
508
478 config.add_route(
509 config.add_route(
479 name='admin_permissions_application_update',
510 name='admin_permissions_application_update',
480 pattern='/permissions/application/update')
511 pattern='/permissions/application/update')
481 config.add_view(
512 config.add_view(
482 AdminPermissionsView,
513 AdminPermissionsView,
483 attr='permissions_application_update',
514 attr='permissions_application_update',
484 route_name='admin_permissions_application_update', request_method='POST',
515 route_name='admin_permissions_application_update', request_method='POST',
485 renderer='rhodecode:templates/admin/permissions/permissions.mako')
516 renderer='rhodecode:templates/admin/permissions/permissions.mako')
486
517
487 config.add_route(
518 config.add_route(
488 name='admin_permissions_global',
519 name='admin_permissions_global',
489 pattern='/permissions/global')
520 pattern='/permissions/global')
490 config.add_view(
521 config.add_view(
491 AdminPermissionsView,
522 AdminPermissionsView,
492 attr='permissions_global',
523 attr='permissions_global',
493 route_name='admin_permissions_global', request_method='GET',
524 route_name='admin_permissions_global', request_method='GET',
494 renderer='rhodecode:templates/admin/permissions/permissions.mako')
525 renderer='rhodecode:templates/admin/permissions/permissions.mako')
495
526
496 config.add_route(
527 config.add_route(
497 name='admin_permissions_global_update',
528 name='admin_permissions_global_update',
498 pattern='/permissions/global/update')
529 pattern='/permissions/global/update')
499 config.add_view(
530 config.add_view(
500 AdminPermissionsView,
531 AdminPermissionsView,
501 attr='permissions_global_update',
532 attr='permissions_global_update',
502 route_name='admin_permissions_global_update', request_method='POST',
533 route_name='admin_permissions_global_update', request_method='POST',
503 renderer='rhodecode:templates/admin/permissions/permissions.mako')
534 renderer='rhodecode:templates/admin/permissions/permissions.mako')
504
535
505 config.add_route(
536 config.add_route(
506 name='admin_permissions_object',
537 name='admin_permissions_object',
507 pattern='/permissions/object')
538 pattern='/permissions/object')
508 config.add_view(
539 config.add_view(
509 AdminPermissionsView,
540 AdminPermissionsView,
510 attr='permissions_objects',
541 attr='permissions_objects',
511 route_name='admin_permissions_object', request_method='GET',
542 route_name='admin_permissions_object', request_method='GET',
512 renderer='rhodecode:templates/admin/permissions/permissions.mako')
543 renderer='rhodecode:templates/admin/permissions/permissions.mako')
513
544
514 config.add_route(
545 config.add_route(
515 name='admin_permissions_object_update',
546 name='admin_permissions_object_update',
516 pattern='/permissions/object/update')
547 pattern='/permissions/object/update')
517 config.add_view(
548 config.add_view(
518 AdminPermissionsView,
549 AdminPermissionsView,
519 attr='permissions_objects_update',
550 attr='permissions_objects_update',
520 route_name='admin_permissions_object_update', request_method='POST',
551 route_name='admin_permissions_object_update', request_method='POST',
521 renderer='rhodecode:templates/admin/permissions/permissions.mako')
552 renderer='rhodecode:templates/admin/permissions/permissions.mako')
522
553
523 # Branch perms EE feature
554 # Branch perms EE feature
524 config.add_route(
555 config.add_route(
525 name='admin_permissions_branch',
556 name='admin_permissions_branch',
526 pattern='/permissions/branch')
557 pattern='/permissions/branch')
527 config.add_view(
558 config.add_view(
528 AdminPermissionsView,
559 AdminPermissionsView,
529 attr='permissions_branch',
560 attr='permissions_branch',
530 route_name='admin_permissions_branch', request_method='GET',
561 route_name='admin_permissions_branch', request_method='GET',
531 renderer='rhodecode:templates/admin/permissions/permissions.mako')
562 renderer='rhodecode:templates/admin/permissions/permissions.mako')
532
563
533 config.add_route(
564 config.add_route(
534 name='admin_permissions_ips',
565 name='admin_permissions_ips',
535 pattern='/permissions/ips')
566 pattern='/permissions/ips')
536 config.add_view(
567 config.add_view(
537 AdminPermissionsView,
568 AdminPermissionsView,
538 attr='permissions_ips',
569 attr='permissions_ips',
539 route_name='admin_permissions_ips', request_method='GET',
570 route_name='admin_permissions_ips', request_method='GET',
540 renderer='rhodecode:templates/admin/permissions/permissions.mako')
571 renderer='rhodecode:templates/admin/permissions/permissions.mako')
541
572
542 config.add_route(
573 config.add_route(
543 name='admin_permissions_overview',
574 name='admin_permissions_overview',
544 pattern='/permissions/overview')
575 pattern='/permissions/overview')
545 config.add_view(
576 config.add_view(
546 AdminPermissionsView,
577 AdminPermissionsView,
547 attr='permissions_overview',
578 attr='permissions_overview',
548 route_name='admin_permissions_overview', request_method='GET',
579 route_name='admin_permissions_overview', request_method='GET',
549 renderer='rhodecode:templates/admin/permissions/permissions.mako')
580 renderer='rhodecode:templates/admin/permissions/permissions.mako')
550
581
551 config.add_route(
582 config.add_route(
552 name='admin_permissions_auth_token_access',
583 name='admin_permissions_auth_token_access',
553 pattern='/permissions/auth_token_access')
584 pattern='/permissions/auth_token_access')
554 config.add_view(
585 config.add_view(
555 AdminPermissionsView,
586 AdminPermissionsView,
556 attr='auth_token_access',
587 attr='auth_token_access',
557 route_name='admin_permissions_auth_token_access', request_method='GET',
588 route_name='admin_permissions_auth_token_access', request_method='GET',
558 renderer='rhodecode:templates/admin/permissions/permissions.mako')
589 renderer='rhodecode:templates/admin/permissions/permissions.mako')
559
590
560 config.add_route(
591 config.add_route(
561 name='admin_permissions_ssh_keys',
592 name='admin_permissions_ssh_keys',
562 pattern='/permissions/ssh_keys')
593 pattern='/permissions/ssh_keys')
563 config.add_view(
594 config.add_view(
564 AdminPermissionsView,
595 AdminPermissionsView,
565 attr='ssh_keys',
596 attr='ssh_keys',
566 route_name='admin_permissions_ssh_keys', request_method='GET',
597 route_name='admin_permissions_ssh_keys', request_method='GET',
567 renderer='rhodecode:templates/admin/permissions/permissions.mako')
598 renderer='rhodecode:templates/admin/permissions/permissions.mako')
568
599
569 config.add_route(
600 config.add_route(
570 name='admin_permissions_ssh_keys_data',
601 name='admin_permissions_ssh_keys_data',
571 pattern='/permissions/ssh_keys/data')
602 pattern='/permissions/ssh_keys/data')
572 config.add_view(
603 config.add_view(
573 AdminPermissionsView,
604 AdminPermissionsView,
574 attr='ssh_keys_data',
605 attr='ssh_keys_data',
575 route_name='admin_permissions_ssh_keys_data', request_method='GET',
606 route_name='admin_permissions_ssh_keys_data', request_method='GET',
576 renderer='json_ext', xhr=True)
607 renderer='json_ext', xhr=True)
577
608
578 config.add_route(
609 config.add_route(
579 name='admin_permissions_ssh_keys_update',
610 name='admin_permissions_ssh_keys_update',
580 pattern='/permissions/ssh_keys/update')
611 pattern='/permissions/ssh_keys/update')
581 config.add_view(
612 config.add_view(
582 AdminPermissionsView,
613 AdminPermissionsView,
583 attr='ssh_keys_update',
614 attr='ssh_keys_update',
584 route_name='admin_permissions_ssh_keys_update', request_method='POST',
615 route_name='admin_permissions_ssh_keys_update', request_method='POST',
585 renderer='rhodecode:templates/admin/permissions/permissions.mako')
616 renderer='rhodecode:templates/admin/permissions/permissions.mako')
586
617
587 # users admin
618 # users admin
588 config.add_route(
619 config.add_route(
589 name='users',
620 name='users',
590 pattern='/users')
621 pattern='/users')
591 config.add_view(
622 config.add_view(
592 AdminUsersView,
623 AdminUsersView,
593 attr='users_list',
624 attr='users_list',
594 route_name='users', request_method='GET',
625 route_name='users', request_method='GET',
595 renderer='rhodecode:templates/admin/users/users.mako')
626 renderer='rhodecode:templates/admin/users/users.mako')
596
627
597 config.add_route(
628 config.add_route(
598 name='users_data',
629 name='users_data',
599 pattern='/users_data')
630 pattern='/users_data')
600 config.add_view(
631 config.add_view(
601 AdminUsersView,
632 AdminUsersView,
602 attr='users_list_data',
633 attr='users_list_data',
603 # renderer defined below
634 # renderer defined below
604 route_name='users_data', request_method='GET',
635 route_name='users_data', request_method='GET',
605 renderer='json_ext', xhr=True)
636 renderer='json_ext', xhr=True)
606
637
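# xhr=True on the users_data view above is a stock Pyramid view predicate: the view
# only matches requests that carry the X-Requested-With: XMLHttpRequest header that
# browser AJAX helpers add; without it Pyramid keeps looking for another matching
# view and typically answers 404. A client-side sketch (URL, prefix and session
# cookie are placeholders, with the '/_admin' prefix assumed):

import requests  # third-party; pip install requests

resp = requests.get(
    'https://code.example.com/_admin/users_data',
    headers={'X-Requested-With': 'XMLHttpRequest'},  # satisfy the xhr=True predicate
    cookies={'session': 'SESSION_COOKIE_HERE'},      # an authenticated admin session is assumed
)
print(resp.json())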
607 config.add_route(
638 config.add_route(
608 name='users_create',
639 name='users_create',
609 pattern='/users/create')
640 pattern='/users/create')
610 config.add_view(
641 config.add_view(
611 AdminUsersView,
642 AdminUsersView,
612 attr='users_create',
643 attr='users_create',
613 route_name='users_create', request_method='POST',
644 route_name='users_create', request_method='POST',
614 renderer='rhodecode:templates/admin/users/user_add.mako')
645 renderer='rhodecode:templates/admin/users/user_add.mako')
615
646
616 config.add_route(
647 config.add_route(
617 name='users_new',
648 name='users_new',
618 pattern='/users/new')
649 pattern='/users/new')
619 config.add_view(
650 config.add_view(
620 AdminUsersView,
651 AdminUsersView,
621 attr='users_new',
652 attr='users_new',
622 route_name='users_new', request_method='GET',
653 route_name='users_new', request_method='GET',
623 renderer='rhodecode:templates/admin/users/user_add.mako')
654 renderer='rhodecode:templates/admin/users/user_add.mako')
624
655
625 # user management
656 # user management
626 config.add_route(
657 config.add_route(
627 name='user_edit',
658 name='user_edit',
628 pattern=r'/users/{user_id:\d+}/edit',
659 pattern=r'/users/{user_id:\d+}/edit',
629 user_route=True)
660 user_route=True)
630 config.add_view(
661 config.add_view(
631 UsersView,
662 UsersView,
632 attr='user_edit',
663 attr='user_edit',
633 route_name='user_edit', request_method='GET',
664 route_name='user_edit', request_method='GET',
634 renderer='rhodecode:templates/admin/users/user_edit.mako')
665 renderer='rhodecode:templates/admin/users/user_edit.mako')
635
666
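# The {user_id:\d+} placeholder above constrains matching to digits, so /users/10/edit
# resolves while /users/abc/edit does not; the matched value arrives as the string
# request.matchdict['user_id']. user_route=True appears to be a RhodeCode-specific
# route argument rather than stock Pyramid, so it is left out of this minimal sketch:

from pyramid.config import Configurator

def user_edit(request):
    return {'user_id': int(request.matchdict['user_id'])}  # '\d+' guarantees digits

with Configurator() as config:
    config.add_route('user_edit', pattern=r'/users/{user_id:\d+}/edit')
    config.add_view(user_edit, route_name='user_edit',
                    request_method='GET', renderer='json')
    app = config.make_wsgi_app()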
636 config.add_route(
667 config.add_route(
637 name='user_edit_advanced',
668 name='user_edit_advanced',
638 pattern=r'/users/{user_id:\d+}/edit/advanced',
669 pattern=r'/users/{user_id:\d+}/edit/advanced',
639 user_route=True)
670 user_route=True)
640 config.add_view(
671 config.add_view(
641 UsersView,
672 UsersView,
642 attr='user_edit_advanced',
673 attr='user_edit_advanced',
643 route_name='user_edit_advanced', request_method='GET',
674 route_name='user_edit_advanced', request_method='GET',
644 renderer='rhodecode:templates/admin/users/user_edit.mako')
675 renderer='rhodecode:templates/admin/users/user_edit.mako')
645
676
646 config.add_route(
677 config.add_route(
647 name='user_edit_global_perms',
678 name='user_edit_global_perms',
648 pattern=r'/users/{user_id:\d+}/edit/global_permissions',
679 pattern=r'/users/{user_id:\d+}/edit/global_permissions',
649 user_route=True)
680 user_route=True)
650 config.add_view(
681 config.add_view(
651 UsersView,
682 UsersView,
652 attr='user_edit_global_perms',
683 attr='user_edit_global_perms',
653 route_name='user_edit_global_perms', request_method='GET',
684 route_name='user_edit_global_perms', request_method='GET',
654 renderer='rhodecode:templates/admin/users/user_edit.mako')
685 renderer='rhodecode:templates/admin/users/user_edit.mako')
655
686
656 config.add_route(
687 config.add_route(
657 name='user_edit_global_perms_update',
688 name='user_edit_global_perms_update',
658 pattern=r'/users/{user_id:\d+}/edit/global_permissions/update',
689 pattern=r'/users/{user_id:\d+}/edit/global_permissions/update',
659 user_route=True)
690 user_route=True)
660 config.add_view(
691 config.add_view(
661 UsersView,
692 UsersView,
662 attr='user_edit_global_perms_update',
693 attr='user_edit_global_perms_update',
663 route_name='user_edit_global_perms_update', request_method='POST',
694 route_name='user_edit_global_perms_update', request_method='POST',
664 renderer='rhodecode:templates/admin/users/user_edit.mako')
695 renderer='rhodecode:templates/admin/users/user_edit.mako')
665
696
666 config.add_route(
697 config.add_route(
667 name='user_update',
698 name='user_update',
668 pattern=r'/users/{user_id:\d+}/update',
699 pattern=r'/users/{user_id:\d+}/update',
669 user_route=True)
700 user_route=True)
670 config.add_view(
701 config.add_view(
671 UsersView,
702 UsersView,
672 attr='user_update',
703 attr='user_update',
673 route_name='user_update', request_method='POST',
704 route_name='user_update', request_method='POST',
674 renderer='rhodecode:templates/admin/users/user_edit.mako')
705 renderer='rhodecode:templates/admin/users/user_edit.mako')
675
706
676 config.add_route(
707 config.add_route(
677 name='user_delete',
708 name='user_delete',
678 pattern=r'/users/{user_id:\d+}/delete',
709 pattern=r'/users/{user_id:\d+}/delete',
679 user_route=True)
710 user_route=True)
680 config.add_view(
711 config.add_view(
681 UsersView,
712 UsersView,
682 attr='user_delete',
713 attr='user_delete',
683 route_name='user_delete', request_method='POST',
714 route_name='user_delete', request_method='POST',
684 renderer='rhodecode:templates/admin/users/user_edit.mako')
715 renderer='rhodecode:templates/admin/users/user_edit.mako')
685
716
686 config.add_route(
717 config.add_route(
687 name='user_enable_force_password_reset',
718 name='user_enable_force_password_reset',
688 pattern=r'/users/{user_id:\d+}/password_reset_enable',
719 pattern=r'/users/{user_id:\d+}/password_reset_enable',
689 user_route=True)
720 user_route=True)
690 config.add_view(
721 config.add_view(
691 UsersView,
722 UsersView,
692 attr='user_enable_force_password_reset',
723 attr='user_enable_force_password_reset',
693 route_name='user_enable_force_password_reset', request_method='POST',
724 route_name='user_enable_force_password_reset', request_method='POST',
694 renderer='rhodecode:templates/admin/users/user_edit.mako')
725 renderer='rhodecode:templates/admin/users/user_edit.mako')
695
726
696 config.add_route(
727 config.add_route(
697 name='user_disable_force_password_reset',
728 name='user_disable_force_password_reset',
698 pattern=r'/users/{user_id:\d+}/password_reset_disable',
729 pattern=r'/users/{user_id:\d+}/password_reset_disable',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_disable_force_password_reset',
        route_name='user_disable_force_password_reset', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='user_create_personal_repo_group',
        pattern=r'/users/{user_id:\d+}/create_repo_group',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_create_personal_repo_group',
        route_name='user_create_personal_repo_group', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    # user notice
    config.add_route(
        name='user_notice_dismiss',
        pattern=r'/users/{user_id:\d+}/notice_dismiss',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_notice_dismiss',
        route_name='user_notice_dismiss', request_method='POST',
        renderer='json_ext', xhr=True)

    # user auth tokens
    config.add_route(
        name='edit_user_auth_tokens',
        pattern=r'/users/{user_id:\d+}/edit/auth_tokens',
        user_route=True)
    config.add_view(
        UsersView,
        attr='auth_tokens',
        route_name='edit_user_auth_tokens', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_auth_tokens_view',
        pattern=r'/users/{user_id:\d+}/edit/auth_tokens/view',
        user_route=True)
    config.add_view(
        UsersView,
        attr='auth_tokens_view',
        route_name='edit_user_auth_tokens_view', request_method='POST',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='edit_user_auth_tokens_add',
        pattern=r'/users/{user_id:\d+}/edit/auth_tokens/new',
        user_route=True)
    config.add_view(
        UsersView,
        attr='auth_tokens_add',
        route_name='edit_user_auth_tokens_add', request_method='POST')

    config.add_route(
        name='edit_user_auth_tokens_delete',
        pattern=r'/users/{user_id:\d+}/edit/auth_tokens/delete',
        user_route=True)
    config.add_view(
        UsersView,
        attr='auth_tokens_delete',
        route_name='edit_user_auth_tokens_delete', request_method='POST')

    # user ssh keys
    config.add_route(
        name='edit_user_ssh_keys',
        pattern=r'/users/{user_id:\d+}/edit/ssh_keys',
        user_route=True)
    config.add_view(
        UsersView,
        attr='ssh_keys',
        route_name='edit_user_ssh_keys', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_ssh_keys_generate_keypair',
        pattern=r'/users/{user_id:\d+}/edit/ssh_keys/generate',
        user_route=True)
    config.add_view(
        UsersView,
        attr='ssh_keys_generate_keypair',
        route_name='edit_user_ssh_keys_generate_keypair', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_ssh_keys_add',
        pattern=r'/users/{user_id:\d+}/edit/ssh_keys/new',
        user_route=True)
    config.add_view(
        UsersView,
        attr='ssh_keys_add',
        route_name='edit_user_ssh_keys_add', request_method='POST')

    config.add_route(
        name='edit_user_ssh_keys_delete',
        pattern=r'/users/{user_id:\d+}/edit/ssh_keys/delete',
        user_route=True)
    config.add_view(
        UsersView,
        attr='ssh_keys_delete',
        route_name='edit_user_ssh_keys_delete', request_method='POST')

    # user emails
    config.add_route(
        name='edit_user_emails',
        pattern=r'/users/{user_id:\d+}/edit/emails',
        user_route=True)
    config.add_view(
        UsersView,
        attr='emails',
        route_name='edit_user_emails', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_emails_add',
        pattern=r'/users/{user_id:\d+}/edit/emails/new',
        user_route=True)
    config.add_view(
        UsersView,
        attr='emails_add',
        route_name='edit_user_emails_add', request_method='POST')

    config.add_route(
        name='edit_user_emails_delete',
        pattern=r'/users/{user_id:\d+}/edit/emails/delete',
        user_route=True)
    config.add_view(
        UsersView,
        attr='emails_delete',
        route_name='edit_user_emails_delete', request_method='POST')

    # user IPs
    config.add_route(
        name='edit_user_ips',
        pattern=r'/users/{user_id:\d+}/edit/ips',
        user_route=True)
    config.add_view(
        UsersView,
        attr='ips',
        route_name='edit_user_ips', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_ips_add',
        pattern=r'/users/{user_id:\d+}/edit/ips/new',
        user_route_with_default=True)  # enabled for default user too
    config.add_view(
        UsersView,
        attr='ips_add',
        route_name='edit_user_ips_add', request_method='POST')

    config.add_route(
        name='edit_user_ips_delete',
        pattern=r'/users/{user_id:\d+}/edit/ips/delete',
        user_route_with_default=True)  # enabled for default user too
    config.add_view(
        UsersView,
        attr='ips_delete',
        route_name='edit_user_ips_delete', request_method='POST')

    # user perms
    config.add_route(
        name='edit_user_perms_summary',
        pattern=r'/users/{user_id:\d+}/edit/permissions_summary',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_perms_summary',
        route_name='edit_user_perms_summary', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_perms_summary_json',
        pattern=r'/users/{user_id:\d+}/edit/permissions_summary/json',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_perms_summary_json',
        route_name='edit_user_perms_summary_json', request_method='GET',
        renderer='json_ext')

    # user user groups management
    config.add_route(
        name='edit_user_groups_management',
        pattern=r'/users/{user_id:\d+}/edit/groups_management',
        user_route=True)
    config.add_view(
        UsersView,
        attr='groups_management',
        route_name='edit_user_groups_management', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_groups_management_updates',
        pattern=r'/users/{user_id:\d+}/edit/edit_user_groups_management/updates',
        user_route=True)
    config.add_view(
        UsersView,
        attr='groups_management_updates',
        route_name='edit_user_groups_management_updates', request_method='POST')

    # user audit logs
    config.add_route(
        name='edit_user_audit_logs',
        pattern=r'/users/{user_id:\d+}/edit/audit', user_route=True)
    config.add_view(
        UsersView,
        attr='user_audit_logs',
        route_name='edit_user_audit_logs', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_audit_logs_download',
        pattern=r'/users/{user_id:\d+}/edit/audit/download', user_route=True)
    config.add_view(
        UsersView,
        attr='user_audit_logs_download',
        route_name='edit_user_audit_logs_download', request_method='GET',
        renderer='string')

    # user caches
    config.add_route(
        name='edit_user_caches',
        pattern=r'/users/{user_id:\d+}/edit/caches',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_caches',
        route_name='edit_user_caches', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')

    config.add_route(
        name='edit_user_caches_update',
        pattern=r'/users/{user_id:\d+}/edit/caches/update',
        user_route=True)
    config.add_view(
        UsersView,
        attr='user_caches_update',
        route_name='edit_user_caches_update', request_method='POST')

    # user-groups admin
    config.add_route(
        name='user_groups',
        pattern='/user_groups')
    config.add_view(
        AdminUserGroupsView,
        attr='user_groups_list',
        route_name='user_groups', request_method='GET',
        renderer='rhodecode:templates/admin/user_groups/user_groups.mako')

    config.add_route(
        name='user_groups_data',
        pattern='/user_groups_data')
    config.add_view(
        AdminUserGroupsView,
        attr='user_groups_list_data',
        route_name='user_groups_data', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='user_groups_new',
        pattern='/user_groups/new')
    config.add_view(
        AdminUserGroupsView,
        attr='user_groups_new',
        route_name='user_groups_new', request_method='GET',
        renderer='rhodecode:templates/admin/user_groups/user_group_add.mako')

    config.add_route(
        name='user_groups_create',
        pattern='/user_groups/create')
    config.add_view(
        AdminUserGroupsView,
        attr='user_groups_create',
        route_name='user_groups_create', request_method='POST',
        renderer='rhodecode:templates/admin/user_groups/user_group_add.mako')

    # repos admin
    config.add_route(
        name='repos',
        pattern='/repos')
    config.add_view(
        AdminReposView,
        attr='repository_list',
        route_name='repos', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repos.mako')

    config.add_route(
        name='repos_data',
        pattern='/repos_data')
    config.add_view(
        AdminReposView,
        attr='repository_list_data',
        route_name='repos_data', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_new',
        pattern='/repos/new')
    config.add_view(
        AdminReposView,
        attr='repository_new',
        route_name='repo_new', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_add.mako')

    config.add_route(
        name='repo_create',
        pattern='/repos/create')
    config.add_view(
        AdminReposView,
        attr='repository_create',
        route_name='repo_create', request_method='POST',
        renderer='rhodecode:templates/admin/repos/repos.mako')

    # repo groups admin
    config.add_route(
        name='repo_groups',
        pattern='/repo_groups')
    config.add_view(
        AdminRepoGroupsView,
        attr='repo_group_list',
        route_name='repo_groups', request_method='GET',
        renderer='rhodecode:templates/admin/repo_groups/repo_groups.mako')

    config.add_route(
        name='repo_groups_data',
        pattern='/repo_groups_data')
    config.add_view(
        AdminRepoGroupsView,
        attr='repo_group_list_data',
        route_name='repo_groups_data', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_group_new',
        pattern='/repo_group/new')
    config.add_view(
        AdminRepoGroupsView,
        attr='repo_group_new',
        route_name='repo_group_new', request_method='GET',
        renderer='rhodecode:templates/admin/repo_groups/repo_group_add.mako')

    config.add_route(
        name='repo_group_create',
        pattern='/repo_group/create')
    config.add_view(
        AdminRepoGroupsView,
        attr='repo_group_create',
        route_name='repo_group_create', request_method='POST',
        renderer='rhodecode:templates/admin/repo_groups/repo_group_add.mako')
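
    # Every registration above follows the same declarative idiom: a
    # ``config.add_route`` call names the URL pattern, and a paired
    # ``config.add_view`` call binds that route to a method (``attr``) on a
    # view class such as ``UsersView``. A minimal sketch of one more such
    # pair, using a hypothetical route name and view method, would be:
    #
    #   config.add_route(
    #       name='example_route',
    #       pattern=r'/users/{user_id:\d+}/example')
    #   config.add_view(
    #       UsersView,
    #       attr='example_view_method',
    #       route_name='example_route', request_method='GET',
    #       renderer='json_ext')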


def includeme(config):
    # Create admin navigation registry and add it to the pyramid registry.
    nav_includeme(config)

    # main admin routes
    config.add_route(
        name='admin_home', pattern=ADMIN_PREFIX)
    config.add_view(
        AdminMainView,
        attr='admin_main',
        route_name='admin_home', request_method='GET',
        renderer='rhodecode:templates/admin/main.mako')

    # pr global redirect
    config.add_route(
        name='pull_requests_global_0',  # backward compat
        pattern=ADMIN_PREFIX + r'/pull_requests/{pull_request_id:\d+}')
    config.add_view(
        AdminMainView,
        attr='pull_requests',
        route_name='pull_requests_global_0', request_method='GET')

    config.add_route(
        name='pull_requests_global_1',  # backward compat
        pattern=ADMIN_PREFIX + r'/pull-requests/{pull_request_id:\d+}')
    config.add_view(
        AdminMainView,
        attr='pull_requests',
        route_name='pull_requests_global_1', request_method='GET')

    config.add_route(
        name='pull_requests_global',
        pattern=ADMIN_PREFIX + r'/pull-request/{pull_request_id:\d+}')
    config.add_view(
        AdminMainView,
        attr='pull_requests',
        route_name='pull_requests_global', request_method='GET')

    config.include(admin_routes, route_prefix=ADMIN_PREFIX)
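
# Illustrative sketch (hypothetical values): because ``admin_routes`` is
# included with ``route_prefix=ADMIN_PREFIX``, a registered pattern such as
# '/users/{user_id:\d+}/edit/auth_tokens' is served under that prefix, so a
# view or template can build the URL with Pyramid's standard helper:
#
#   url = request.route_path('edit_user_auth_tokens', user_id=2)
#   # -> '/_admin/users/2/edit/auth_tokens'  (assuming ADMIN_PREFIX == '/_admin')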
@@ -1,708 +1,715 @@
# Copyright (C) 2010-2023 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


import logging
import collections

import datetime
import formencode
import formencode.htmlfill

import rhodecode

from pyramid.httpexceptions import HTTPFound, HTTPNotFound
from pyramid.renderers import render
from pyramid.response import Response

from rhodecode.apps._base import BaseAppView
from rhodecode.apps._base.navigation import navigation_list
from rhodecode.apps.svn_support import config_keys
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import (
    LoginRequired, HasPermissionAllDecorator, CSRFRequired)
from rhodecode.lib.celerylib import tasks, run_task
from rhodecode.lib.str_utils import safe_str
from rhodecode.lib.utils import repo2db_mapper, get_rhodecode_repo_store_path
from rhodecode.lib.utils2 import str2bool, AttributeDict
from rhodecode.lib.index import searcher_from_config

from rhodecode.model.db import RhodeCodeUi, Repository
from rhodecode.model.forms import (ApplicationSettingsForm,
    ApplicationUiSettingsForm, ApplicationVisualisationForm,
    LabsSettingsForm, IssueTrackerPatternsForm)
from rhodecode.model.permission import PermissionModel
from rhodecode.model.repo_group import RepoGroupModel

from rhodecode.model.scm import ScmModel
from rhodecode.model.notification import EmailNotificationModel
from rhodecode.model.meta import Session
from rhodecode.model.settings import (
    IssueTrackerSettingsModel, VcsSettingsModel, SettingNotFound,
    SettingsModel)


log = logging.getLogger(__name__)


class AdminSettingsView(BaseAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context()
        c.labs_active = str2bool(
            rhodecode.CONFIG.get('labs_settings_active', 'true'))
        c.navlist = navigation_list(self.request)
        return c

    @classmethod
    def _get_ui_settings(cls):
        ret = RhodeCodeUi.query().all()

        if not ret:
            raise Exception('Could not get application ui settings !')
-        settings = {}
+        settings = {
+            # legacy param that needs to be kept
+            'web_push_ssl': False
+        }
        for each in ret:
            k = each.ui_key
            v = each.ui_value
+            # skip some options if they are defined
+            if k in ['push_ssl']:
+                continue
+
            if k == '/':
                k = 'root_path'

-            if k in ['push_ssl', 'publish', 'enabled']:
+            if k in ['publish', 'enabled']:
                v = str2bool(v)

            if k.find('.') != -1:
                k = k.replace('.', '_')

            if each.ui_section in ['hooks', 'extensions']:
                v = each.ui_active

            settings[each.ui_section + '_' + k] = v
+
        return settings

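    # Sketch of the flattening performed by ``_get_ui_settings`` above, using
    # hypothetical RhodeCodeUi rows (section, key, value) to show the expected
    # key shapes only:
    #
    #   ('paths', '/',        '/srv/repos')  -> settings['paths_root_path']
    #   ('web',   'push_ssl', 'false')       -> skipped; 'web_push_ssl' stays False
    #   ('vcs',   'hg.close_branch', 'true') -> settings['vcs_hg_close_branch']
    #
    # Dots in keys become underscores, and for the 'hooks'/'extensions'
    # sections the boolean ``ui_active`` flag is stored instead of the value.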
    @classmethod
    def _form_defaults(cls):
        defaults = SettingsModel().get_all_settings()
        defaults.update(cls._get_ui_settings())

        defaults.update({
            'new_svn_branch': '',
            'new_svn_tag': '',
        })
        return defaults

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_vcs(self):
        c = self.load_default_context()
        c.active = 'vcs'
        model = VcsSettingsModel()
        c.svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.svn_tag_patterns = model.get_global_svn_tag_patterns()
        c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config)
        c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path)
        defaults = self._form_defaults()

        model.create_largeobjects_dirs_if_needed(defaults['paths_root_path'])

        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_vcs_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'vcs'

        model = VcsSettingsModel()
        c.svn_branch_patterns = model.get_global_svn_branch_patterns()
        c.svn_tag_patterns = model.get_global_svn_tag_patterns()

        c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config)
        c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path)
        application_form = ApplicationUiSettingsForm(self.request.translate)()

        try:
            form_result = application_form.to_python(dict(self.request.POST))
        except formencode.Invalid as errors:
            h.flash(
                _("Some form inputs contain invalid data."),
                category='error')
            data = render('rhodecode:templates/admin/settings/settings.mako',
                          self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.unpack_errors() or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)

        try:
-            model.update_global_ssl_setting(form_result['web_push_ssl'])
            model.update_global_hook_settings(form_result)

            model.create_or_update_global_svn_settings(form_result)
            model.create_or_update_global_hg_settings(form_result)
            model.create_or_update_global_git_settings(form_result)
            model.create_or_update_global_pr_settings(form_result)
        except Exception:
            log.exception("Exception while updating settings")
            h.flash(_('Error occurred during updating '
                      'application settings'), category='error')
        else:
            Session().commit()
            h.flash(_('Updated VCS settings'), category='success')
            raise HTTPFound(h.route_path('admin_settings_vcs'))

        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_vcs_delete_svn_pattern(self):
        delete_pattern_id = self.request.POST.get('delete_svn_pattern')
        model = VcsSettingsModel()
        try:
            model.delete_global_svn_pattern(delete_pattern_id)
        except SettingNotFound:
            log.exception(
                'Failed to delete svn_pattern with id %s', delete_pattern_id)
            raise HTTPNotFound()

        Session().commit()
        return True

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_mapping(self):
        c = self.load_default_context()
        c.active = 'mapping'
        c.storage_path = get_rhodecode_repo_store_path()
        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_mapping_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'mapping'
        rm_obsolete = self.request.POST.get('destroy', False)
        invalidate_cache = self.request.POST.get('invalidate', False)
        log.debug('rescanning repo location with destroy obsolete=%s', rm_obsolete)

        if invalidate_cache:
            log.debug('invalidating all repositories cache')
            for repo in Repository.get_all():
                ScmModel().mark_for_invalidation(repo.repo_name, delete=True)

        filesystem_repos = ScmModel().repo_scan()
        added, removed = repo2db_mapper(filesystem_repos, rm_obsolete, force_hooks_rebuild=True)
        PermissionModel().trigger_permission_flush()

        def _repr(rm_repo):
            return ', '.join(map(safe_str, rm_repo)) or '-'

        h.flash(_('Repositories successfully '
                  'rescanned added: %s ; removed: %s') %
                (_repr(added), _repr(removed)),
                category='success')
        raise HTTPFound(h.route_path('admin_settings_mapping'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_global(self):
        c = self.load_default_context()
        c.active = 'global'
        c.personal_repo_group_default_pattern = RepoGroupModel()\
            .get_personal_group_name_pattern()

        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_global_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'global'
        c.personal_repo_group_default_pattern = RepoGroupModel()\
            .get_personal_group_name_pattern()
        application_form = ApplicationSettingsForm(self.request.translate)()
        try:
            form_result = application_form.to_python(dict(self.request.POST))
        except formencode.Invalid as errors:
            h.flash(
                _("Some form inputs contain invalid data."),
                category='error')
            data = render('rhodecode:templates/admin/settings/settings.mako',
                          self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.unpack_errors() or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)

        settings = [
            ('title', 'rhodecode_title', 'unicode'),
            ('realm', 'rhodecode_realm', 'unicode'),
            ('pre_code', 'rhodecode_pre_code', 'unicode'),
            ('post_code', 'rhodecode_post_code', 'unicode'),
            ('captcha_public_key', 'rhodecode_captcha_public_key', 'unicode'),
            ('captcha_private_key', 'rhodecode_captcha_private_key', 'unicode'),
            ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'),
            ('personal_repo_group_pattern', 'rhodecode_personal_repo_group_pattern', 'unicode'),
        ]

        try:
            for setting, form_key, type_ in settings:
                sett = SettingsModel().create_or_update_setting(
                    setting, form_result[form_key], type_)
                Session().add(sett)

            Session().commit()
            SettingsModel().invalidate_settings_cache()
            h.flash(_('Updated application settings'), category='success')
        except Exception:
            log.exception("Exception while updating application settings")
            h.flash(
                _('Error occurred during updating application settings'),
                category='error')

        raise HTTPFound(h.route_path('admin_settings_global'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_visual(self):
        c = self.load_default_context()
        c.active = 'visual'

        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_visual_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'visual'
        application_form = ApplicationVisualisationForm(self.request.translate)()
        try:
            form_result = application_form.to_python(dict(self.request.POST))
        except formencode.Invalid as errors:
            h.flash(
                _("Some form inputs contain invalid data."),
                category='error')
            data = render('rhodecode:templates/admin/settings/settings.mako',
                          self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.unpack_errors() or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)

        try:
            settings = [
                ('show_public_icon', 'rhodecode_show_public_icon', 'bool'),
                ('show_private_icon', 'rhodecode_show_private_icon', 'bool'),
                ('stylify_metatags', 'rhodecode_stylify_metatags', 'bool'),
                ('repository_fields', 'rhodecode_repository_fields', 'bool'),
                ('dashboard_items', 'rhodecode_dashboard_items', 'int'),
                ('admin_grid_items', 'rhodecode_admin_grid_items', 'int'),
                ('show_version', 'rhodecode_show_version', 'bool'),
                ('use_gravatar', 'rhodecode_use_gravatar', 'bool'),
                ('markup_renderer', 'rhodecode_markup_renderer', 'unicode'),
                ('gravatar_url', 'rhodecode_gravatar_url', 'unicode'),
                ('clone_uri_tmpl', 'rhodecode_clone_uri_tmpl', 'unicode'),
                ('clone_uri_id_tmpl', 'rhodecode_clone_uri_id_tmpl', 'unicode'),
                ('clone_uri_ssh_tmpl', 'rhodecode_clone_uri_ssh_tmpl', 'unicode'),
                ('support_url', 'rhodecode_support_url', 'unicode'),
                ('show_revision_number', 'rhodecode_show_revision_number', 'bool'),
                ('show_sha_length', 'rhodecode_show_sha_length', 'int'),
            ]
            for setting, form_key, type_ in settings:
                sett = SettingsModel().create_or_update_setting(
                    setting, form_result[form_key], type_)
                Session().add(sett)

            Session().commit()
            SettingsModel().invalidate_settings_cache()
            h.flash(_('Updated visualisation settings'), category='success')
        except Exception:
            log.exception("Exception updating visualization settings")
            h.flash(_('Error occurred during updating '
                      'visualisation settings'),
                    category='error')

        raise HTTPFound(h.route_path('admin_settings_visual'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_issuetracker(self):
        c = self.load_default_context()
        c.active = 'issuetracker'
        defaults = c.rc_config

        entry_key = 'rhodecode_issuetracker_pat_'

        c.issuetracker_entries = {}
        for k, v in defaults.items():
            if k.startswith(entry_key):
                uid = k[len(entry_key):]
                c.issuetracker_entries[uid] = None

        for uid in c.issuetracker_entries:
            c.issuetracker_entries[uid] = AttributeDict({
                'pat': defaults.get('rhodecode_issuetracker_pat_' + uid),
                'url': defaults.get('rhodecode_issuetracker_url_' + uid),
                'pref': defaults.get('rhodecode_issuetracker_pref_' + uid),
                'desc': defaults.get('rhodecode_issuetracker_desc_' + uid),
            })

        return self._get_template_context(c)

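    # Sketch of the flat keys read back above (the uid and values are
    # hypothetical); each tracker entry is stored as four settings sharing a
    # uid suffix:
    #
    #   rhodecode_issuetracker_pat_<uid>  -> match pattern, e.g. r'#(?P<issue_id>\d+)'
    #   rhodecode_issuetracker_url_<uid>  -> e.g. 'https://tracker.example.com/${issue_id}'
    #   rhodecode_issuetracker_pref_<uid> -> link prefix
    #   rhodecode_issuetracker_desc_<uid> -> human-readable description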
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_issuetracker_test(self):
        error_container = []

        urlified_commit = h.urlify_commit_message(
            self.request.POST.get('test_text', ''),
            'repo_group/test_repo1', error_container=error_container)
        if error_container:
            def converter(inp):
                return h.html_escape(inp)

            return 'ERRORS: ' + '\n'.join(map(converter, error_container))

        return urlified_commit

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_issuetracker_update(self):
        _ = self.request.translate
        self.load_default_context()
        settings_model = IssueTrackerSettingsModel()

        try:
            form = IssueTrackerPatternsForm(self.request.translate)()
            data = form.to_python(self.request.POST)
        except formencode.Invalid as errors:
            log.exception('Failed to add new pattern')
            error = errors
            h.flash(_(f'Invalid issue tracker pattern: {error}'),
                    category='error')
            raise HTTPFound(h.route_path('admin_settings_issuetracker'))

        if data:
            for uid in data.get('delete_patterns', []):
                settings_model.delete_entries(uid)

            for pattern in data.get('patterns', []):
                for setting, value, type_ in pattern:
                    sett = settings_model.create_or_update_setting(
                        setting, value, type_)
                    Session().add(sett)

            Session().commit()

        SettingsModel().invalidate_settings_cache()
        h.flash(_('Updated issue tracker entries'), category='success')
        raise HTTPFound(h.route_path('admin_settings_issuetracker'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_issuetracker_delete(self):
        _ = self.request.translate
        self.load_default_context()
        uid = self.request.POST.get('uid')
        try:
            IssueTrackerSettingsModel().delete_entries(uid)
        except Exception:
            log.exception('Failed to delete issue tracker setting %s', uid)
            raise HTTPNotFound()

        SettingsModel().invalidate_settings_cache()
        h.flash(_('Removed issue tracker entry.'), category='success')

        return {'deleted': uid}

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_email(self):
        c = self.load_default_context()
        c.active = 'email'
        c.rhodecode_ini = rhodecode.CONFIG

        data = render('rhodecode:templates/admin/settings/settings.mako',
                      self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=self._form_defaults(),
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def settings_email_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'email'

        test_email = self.request.POST.get('test_email')

        if not test_email:
            h.flash(_('Please enter email address'), category='error')
            raise HTTPFound(h.route_path('admin_settings_email'))

        email_kwargs = {
            'date': datetime.datetime.now(),
            'user': self._rhodecode_db_user
        }

        (subject, email_body, email_body_plaintext) = EmailNotificationModel().render_email(
            EmailNotificationModel.TYPE_EMAIL_TEST, **email_kwargs)

        recipients = [test_email] if test_email else None

        run_task(tasks.send_email, recipients, subject,
                 email_body_plaintext, email_body)

        h.flash(_('Send email task created'), category='success')
        raise HTTPFound(h.route_path('admin_settings_email'))

546 @LoginRequired()
553 @LoginRequired()
547 @HasPermissionAllDecorator('hg.admin')
554 @HasPermissionAllDecorator('hg.admin')
548 def settings_hooks(self):
555 def settings_hooks(self):
549 c = self.load_default_context()
556 c = self.load_default_context()
550 c.active = 'hooks'
557 c.active = 'hooks'
551
558
552 model = SettingsModel()
559 model = SettingsModel()
553 c.hooks = model.get_builtin_hooks()
560 c.hooks = model.get_builtin_hooks()
554 c.custom_hooks = model.get_custom_hooks()
561 c.custom_hooks = model.get_custom_hooks()
555
562
556 data = render('rhodecode:templates/admin/settings/settings.mako',
563 data = render('rhodecode:templates/admin/settings/settings.mako',
557 self._get_template_context(c), self.request)
564 self._get_template_context(c), self.request)
558 html = formencode.htmlfill.render(
565 html = formencode.htmlfill.render(
559 data,
566 data,
560 defaults=self._form_defaults(),
567 defaults=self._form_defaults(),
561 encoding="UTF-8",
568 encoding="UTF-8",
562 force_defaults=False
569 force_defaults=False
563 )
570 )
564 return Response(html)
571 return Response(html)
565
572
566 @LoginRequired()
573 @LoginRequired()
567 @HasPermissionAllDecorator('hg.admin')
574 @HasPermissionAllDecorator('hg.admin')
568 @CSRFRequired()
575 @CSRFRequired()
569 def settings_hooks_update(self):
576 def settings_hooks_update(self):
570 _ = self.request.translate
577 _ = self.request.translate
571 c = self.load_default_context()
578 c = self.load_default_context()
572 c.active = 'hooks'
579 c.active = 'hooks'
573 if c.visual.allow_custom_hooks_settings:
580 if c.visual.allow_custom_hooks_settings:
574 ui_key = self.request.POST.get('new_hook_ui_key')
581 ui_key = self.request.POST.get('new_hook_ui_key')
575 ui_value = self.request.POST.get('new_hook_ui_value')
582 ui_value = self.request.POST.get('new_hook_ui_value')
576
583
577 hook_id = self.request.POST.get('hook_id')
584 hook_id = self.request.POST.get('hook_id')
578 new_hook = False
585 new_hook = False
579
586
580 model = SettingsModel()
587 model = SettingsModel()
581 try:
588 try:
582 if ui_value and ui_key:
589 if ui_value and ui_key:
583 model.create_or_update_hook(ui_key, ui_value)
590 model.create_or_update_hook(ui_key, ui_value)
584 h.flash(_('Added new hook'), category='success')
591 h.flash(_('Added new hook'), category='success')
585 new_hook = True
592 new_hook = True
586 elif hook_id:
593 elif hook_id:
587 RhodeCodeUi.delete(hook_id)
594 RhodeCodeUi.delete(hook_id)
588 Session().commit()
595 Session().commit()
589
596
590 # check for edits
597 # check for edits
591 update = False
598 update = False
592 _d = self.request.POST.dict_of_lists()
599 _d = self.request.POST.dict_of_lists()
593 for k, v in zip(_d.get('hook_ui_key', []),
600 for k, v in zip(_d.get('hook_ui_key', []),
594 _d.get('hook_ui_value_new', [])):
601 _d.get('hook_ui_value_new', [])):
595 model.create_or_update_hook(k, v)
602 model.create_or_update_hook(k, v)
596 update = True
603 update = True
597
604
598 if update and not new_hook:
605 if update and not new_hook:
599 h.flash(_('Updated hooks'), category='success')
606 h.flash(_('Updated hooks'), category='success')
600 Session().commit()
607 Session().commit()
601 except Exception:
608 except Exception:
602 log.exception("Exception during hook creation")
609 log.exception("Exception during hook creation")
603 h.flash(_('Error occurred during hook creation'),
610 h.flash(_('Error occurred during hook creation'),
604 category='error')
611 category='error')
605
612
606 raise HTTPFound(h.route_path('admin_settings_hooks'))
613 raise HTTPFound(h.route_path('admin_settings_hooks'))
607
614
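For orientation, settings_hooks_update above pulls everything straight out of the POST body; a minimal, hypothetical sketch of the field names it looks for (the hook names and values below are invented, only the keys come from the code above):

    # hypothetical form payload for settings_hooks_update
    post_data = {
        'csrf_token': 'CSRF_TOKEN_PLACEHOLDER',           # required by @CSRFRequired
        'new_hook_ui_key': 'changegroup.example_notify',  # creates a new custom hook
        'new_hook_ui_value': 'python:my_hooks.notify',
    }
    # edits to existing hooks are sent as parallel lists instead:
    #   hook_ui_key       = ['changegroup.example_notify', ...]
    #   hook_ui_value_new = ['python:my_hooks.notify_v2', ...]
    # and 'hook_id' alone deletes the given custom hook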
608 @LoginRequired()
615 @LoginRequired()
609 @HasPermissionAllDecorator('hg.admin')
616 @HasPermissionAllDecorator('hg.admin')
610 def settings_search(self):
617 def settings_search(self):
611 c = self.load_default_context()
618 c = self.load_default_context()
612 c.active = 'search'
619 c.active = 'search'
613
620
614 c.searcher = searcher_from_config(self.request.registry.settings)
621 c.searcher = searcher_from_config(self.request.registry.settings)
615 c.statistics = c.searcher.statistics(self.request.translate)
622 c.statistics = c.searcher.statistics(self.request.translate)
616
623
617 return self._get_template_context(c)
624 return self._get_template_context(c)
618
625
619 @LoginRequired()
626 @LoginRequired()
620 @HasPermissionAllDecorator('hg.admin')
627 @HasPermissionAllDecorator('hg.admin')
621 def settings_labs(self):
628 def settings_labs(self):
622 c = self.load_default_context()
629 c = self.load_default_context()
623 if not c.labs_active:
630 if not c.labs_active:
624 raise HTTPFound(h.route_path('admin_settings'))
631 raise HTTPFound(h.route_path('admin_settings'))
625
632
626 c.active = 'labs'
633 c.active = 'labs'
627 c.lab_settings = _LAB_SETTINGS
634 c.lab_settings = _LAB_SETTINGS
628
635
629 data = render('rhodecode:templates/admin/settings/settings.mako',
636 data = render('rhodecode:templates/admin/settings/settings.mako',
630 self._get_template_context(c), self.request)
637 self._get_template_context(c), self.request)
631 html = formencode.htmlfill.render(
638 html = formencode.htmlfill.render(
632 data,
639 data,
633 defaults=self._form_defaults(),
640 defaults=self._form_defaults(),
634 encoding="UTF-8",
641 encoding="UTF-8",
635 force_defaults=False
642 force_defaults=False
636 )
643 )
637 return Response(html)
644 return Response(html)
638
645
639 @LoginRequired()
646 @LoginRequired()
640 @HasPermissionAllDecorator('hg.admin')
647 @HasPermissionAllDecorator('hg.admin')
641 @CSRFRequired()
648 @CSRFRequired()
642 def settings_labs_update(self):
649 def settings_labs_update(self):
643 _ = self.request.translate
650 _ = self.request.translate
644 c = self.load_default_context()
651 c = self.load_default_context()
645 c.active = 'labs'
652 c.active = 'labs'
646
653
647 application_form = LabsSettingsForm(self.request.translate)()
654 application_form = LabsSettingsForm(self.request.translate)()
648 try:
655 try:
649 form_result = application_form.to_python(dict(self.request.POST))
656 form_result = application_form.to_python(dict(self.request.POST))
650 except formencode.Invalid as errors:
657 except formencode.Invalid as errors:
651 h.flash(
658 h.flash(
652 _("Some form inputs contain invalid data."),
659 _("Some form inputs contain invalid data."),
653 category='error')
660 category='error')
654 data = render('rhodecode:templates/admin/settings/settings.mako',
661 data = render('rhodecode:templates/admin/settings/settings.mako',
655 self._get_template_context(c), self.request)
662 self._get_template_context(c), self.request)
656 html = formencode.htmlfill.render(
663 html = formencode.htmlfill.render(
657 data,
664 data,
658 defaults=errors.value,
665 defaults=errors.value,
659 errors=errors.unpack_errors() or {},
666 errors=errors.unpack_errors() or {},
660 prefix_error=False,
667 prefix_error=False,
661 encoding="UTF-8",
668 encoding="UTF-8",
662 force_defaults=False
669 force_defaults=False
663 )
670 )
664 return Response(html)
671 return Response(html)
665
672
666 try:
673 try:
667 session = Session()
674 session = Session()
668 for setting in _LAB_SETTINGS:
675 for setting in _LAB_SETTINGS:
669 setting_name = setting.key[len('rhodecode_'):]
676 setting_name = setting.key[len('rhodecode_'):]
670 sett = SettingsModel().create_or_update_setting(
677 sett = SettingsModel().create_or_update_setting(
671 setting_name, form_result[setting.key], setting.type)
678 setting_name, form_result[setting.key], setting.type)
672 session.add(sett)
679 session.add(sett)
673
680
674 except Exception:
681 except Exception:
675 log.exception('Exception while updating lab settings')
682 log.exception('Exception while updating lab settings')
676 h.flash(_('Error occurred during updating labs settings'),
683 h.flash(_('Error occurred during updating labs settings'),
677 category='error')
684 category='error')
678 else:
685 else:
679 Session().commit()
686 Session().commit()
680 SettingsModel().invalidate_settings_cache()
687 SettingsModel().invalidate_settings_cache()
681 h.flash(_('Updated Labs settings'), category='success')
688 h.flash(_('Updated Labs settings'), category='success')
682 raise HTTPFound(h.route_path('admin_settings_labs'))
689 raise HTTPFound(h.route_path('admin_settings_labs'))
683
690
684 data = render('rhodecode:templates/admin/settings/settings.mako',
691 data = render('rhodecode:templates/admin/settings/settings.mako',
685 self._get_template_context(c), self.request)
692 self._get_template_context(c), self.request)
686 html = formencode.htmlfill.render(
693 html = formencode.htmlfill.render(
687 data,
694 data,
688 defaults=self._form_defaults(),
695 defaults=self._form_defaults(),
689 encoding="UTF-8",
696 encoding="UTF-8",
690 force_defaults=False
697 force_defaults=False
691 )
698 )
692 return Response(html)
699 return Response(html)
693
700
694
701
695 # :param key: name of the setting including the 'rhodecode_' prefix
702 # :param key: name of the setting including the 'rhodecode_' prefix
696 # :param type: the RhodeCodeSetting type to use.
703 # :param type: the RhodeCodeSetting type to use.
697 # :param group: the i18ned group in which we should display this setting
704 # :param group: the i18ned group in which we should display this setting
698 # :param label: the i18ned label we should display for this setting
705 # :param label: the i18ned label we should display for this setting
699 # :param help: the i18ned help we should display for this setting
706 # :param help: the i18ned help we should display for this setting
700 LabSetting = collections.namedtuple(
707 LabSetting = collections.namedtuple(
701 'LabSetting', ('key', 'type', 'group', 'label', 'help'))
708 'LabSetting', ('key', 'type', 'group', 'label', 'help'))
702
709
703
710
704 # This list has to be kept in sync with the form
711 # This list has to be kept in sync with the form
705 # rhodecode.model.forms.LabsSettingsForm.
712 # rhodecode.model.forms.LabsSettingsForm.
706 _LAB_SETTINGS = [
713 _LAB_SETTINGS = [
707
714
708 ]
715 ]
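The comment block above spells out the LabSetting fields, and _LAB_SETTINGS ships empty in this changeset; purely as a hedged illustration, an entry would look roughly like the sketch below (the key, group, label and help are invented, and a real entry also has to be mirrored in rhodecode.model.forms.LabsSettingsForm):

    # hypothetical example entry -- not part of this changeset
    _LAB_SETTINGS = [
        LabSetting(
            key='rhodecode_example_lab_flag',   # includes the 'rhodecode_' prefix
            type='bool',                        # RhodeCodeSetting type used when persisting
            group='Experimental features',      # group heading on the Labs page (i18n-ed in real code)
            label='Enable example lab flag',    # label shown next to the input
            help='Toggles a hypothetical experimental behaviour.',
        ),
    ]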
@@ -1,243 +1,249 b''
1
1
2
2
3 # Copyright (C) 2016-2023 RhodeCode GmbH
3 # Copyright (C) 2016-2023 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import logging
21 import logging
22 import urllib.request
22 import urllib.request
23 import urllib.error
23 import urllib.error
24 import urllib.parse
24 import urllib.parse
25 import os
25 import os
26
26
27 import rhodecode
27 import rhodecode
28 from rhodecode.apps._base import BaseAppView
28 from rhodecode.apps._base import BaseAppView
29 from rhodecode.apps._base.navigation import navigation_list
29 from rhodecode.apps._base.navigation import navigation_list
30 from rhodecode.lib import helpers as h
30 from rhodecode.lib import helpers as h
31 from rhodecode.lib.auth import (LoginRequired, HasPermissionAllDecorator)
31 from rhodecode.lib.auth import (LoginRequired, HasPermissionAllDecorator)
32 from rhodecode.lib.utils2 import str2bool
32 from rhodecode.lib.utils2 import str2bool
33 from rhodecode.lib import system_info
33 from rhodecode.lib import system_info
34 from rhodecode.model.update import UpdateModel
34 from rhodecode.model.update import UpdateModel
35
35
36 log = logging.getLogger(__name__)
36 log = logging.getLogger(__name__)
37
37
38
38
39 class AdminSystemInfoSettingsView(BaseAppView):
39 class AdminSystemInfoSettingsView(BaseAppView):
40 def load_default_context(self):
40 def load_default_context(self):
41 c = self._get_local_tmpl_context()
41 c = self._get_local_tmpl_context()
42 return c
42 return c
43
43
44 def get_env_data(self):
44 def get_env_data(self):
45 black_list = [
45 black_list = [
46 'NIX_LDFLAGS',
46 'NIX_LDFLAGS',
47 'NIX_CFLAGS_COMPILE',
47 'NIX_CFLAGS_COMPILE',
48 'propagatedBuildInputs',
48 'propagatedBuildInputs',
49 'propagatedNativeBuildInputs',
49 'propagatedNativeBuildInputs',
50 'postInstall',
50 'postInstall',
51 'buildInputs',
51 'buildInputs',
52 'buildPhase',
52 'buildPhase',
53 'preShellHook',
53 'preShellHook',
54 'preShellHook',
54 'preShellHook',
55 'preCheck',
55 'preCheck',
56 'preBuild',
56 'preBuild',
57 'postShellHook',
57 'postShellHook',
58 'postFixup',
58 'postFixup',
59 'postCheck',
59 'postCheck',
60 'nativeBuildInputs',
60 'nativeBuildInputs',
61 'installPhase',
61 'installPhase',
62 'installCheckPhase',
62 'installCheckPhase',
63 'checkPhase',
63 'checkPhase',
64 'configurePhase',
64 'configurePhase',
65 'shellHook'
65 'shellHook'
66 ]
66 ]
67 secret_list = [
67 secret_list = [
68 'RHODECODE_USER_PASS'
68 'RHODECODE_USER_PASS'
69 ]
69 ]
70
70
71 for k, v in sorted(os.environ.items()):
71 for k, v in sorted(os.environ.items()):
72 if k in black_list:
72 if k in black_list:
73 continue
73 continue
74 if k in secret_list:
74 if k in secret_list:
75 v = '*****'
75 v = '*****'
76 yield k, v
76 yield k, v
77
77
78 @LoginRequired()
78 @LoginRequired()
79 @HasPermissionAllDecorator('hg.admin')
79 @HasPermissionAllDecorator('hg.admin')
80 def settings_system_info(self):
80 def settings_system_info(self):
81 _ = self.request.translate
81 _ = self.request.translate
82 c = self.load_default_context()
82 c = self.load_default_context()
83
83
84 c.active = 'system'
84 c.active = 'system'
85 c.navlist = navigation_list(self.request)
85 c.navlist = navigation_list(self.request)
86
86
87 # TODO(marcink), figure out how to allow only selected users to do this
87 # TODO(marcink), figure out how to allow only selected users to do this
88 c.allowed_to_snapshot = self._rhodecode_user.admin
88 c.allowed_to_snapshot = self._rhodecode_user.admin
89
89
90 snapshot = str2bool(self.request.params.get('snapshot'))
90 snapshot = str2bool(self.request.params.get('snapshot'))
91
91
92 c.rhodecode_update_url = UpdateModel().get_update_url()
92 c.rhodecode_update_url = UpdateModel().get_update_url()
93 c.env_data = self.get_env_data()
93 c.env_data = self.get_env_data()
94 server_info = system_info.get_system_info(self.request.environ)
94 server_info = system_info.get_system_info(self.request.environ)
95
95
96 for key, val in server_info.items():
96 for key, val in server_info.items():
97 setattr(c, key, val)
97 setattr(c, key, val)
98
98
99 def val(name, subkey='human_value'):
99 def val(name, subkey='human_value'):
100 return server_info[name][subkey]
100 return server_info[name][subkey]
101
101
102 def state(name):
102 def state(name):
103 return server_info[name]['state']
103 return server_info[name]['state']
104
104
105 def val2(name):
105 def val2(name):
106 val = server_info[name]['human_value']
106 val = server_info[name]['human_value']
107 state = server_info[name]['state']
107 state = server_info[name]['state']
108 return val, state
108 return val, state
109
109
110 update_info_msg = _('Note: please make sure this server can '
110 update_info_msg = _('Note: please make sure this server can '
111 'access `${url}` for the update link to work',
111 'access `${url}` for the update link to work',
112 mapping=dict(url=c.rhodecode_update_url))
112 mapping=dict(url=c.rhodecode_update_url))
113 version = UpdateModel().get_stored_version()
113 version = UpdateModel().get_stored_version()
114 is_outdated = UpdateModel().is_outdated(
114 is_outdated = UpdateModel().is_outdated(
115 rhodecode.__version__, version)
115 rhodecode.__version__, version)
116 update_state = {
116 update_state = {
117 'type': 'warning',
117 'type': 'warning',
118 'message': 'New version available: {}'.format(version)
118 'message': 'New version available: {}'.format(version)
119 } \
119 } \
120 if is_outdated else {}
120 if is_outdated else {}
121 c.data_items = [
121 c.data_items = [
122 # update info
122 # update info
123 (_('Update info'), h.literal(
123 (_('Update info'), h.literal(
124 '<span class="link" id="check_for_update" >%s.</span>' % (
124 '<span class="link" id="check_for_update" >%s.</span>' % (
125 _('Check for updates')) +
125 _('Check for updates')) +
126 '<br/> <span >%s.</span>' % (update_info_msg)
126 '<br/> <span >%s.</span>' % (update_info_msg)
127 ), ''),
127 ), ''),
128
128
129 # RhodeCode specific
129 # RhodeCode specific
130 (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')),
130 (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')),
131 (_('Latest version'), version, update_state),
131 (_('Latest version'), version, update_state),
132 (_('RhodeCode Base URL'), val('rhodecode_config')['config'].get('app.base_url'), state('rhodecode_config')),
132 (_('RhodeCode Base URL'), val('rhodecode_config')['config'].get('app.base_url'), state('rhodecode_config')),
133 (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')),
133 (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')),
134 (_('RhodeCode Server ID'), val('server')['server_id'], state('server')),
134 (_('RhodeCode Server ID'), val('server')['server_id'], state('server')),
135 (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')),
135 (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')),
136 (_('RhodeCode Certificate'), val('rhodecode_config')['cert_path'], state('rhodecode_config')),
136 (_('RhodeCode Certificate'), val('rhodecode_config')['cert_path'], state('rhodecode_config')),
137 (_('Workers'), val('rhodecode_config')['config']['server:main'].get('workers', '?'), state('rhodecode_config')),
137 (_('Workers'), val('rhodecode_config')['config']['server:main'].get('workers', '?'), state('rhodecode_config')),
138 (_('Worker Type'), val('rhodecode_config')['config']['server:main'].get('worker_class', 'sync'), state('rhodecode_config')),
138 (_('Worker Type'), val('rhodecode_config')['config']['server:main'].get('worker_class', 'sync'), state('rhodecode_config')),
139 ('', '', ''), # spacer
139 ('', '', ''), # spacer
140
140
141 # Database
141 # Database
142 (_('Database'), val('database')['url'], state('database')),
142 (_('Database'), val('database')['url'], state('database')),
143 (_('Database version'), val('database')['version'], state('database')),
143 (_('Database version'), val('database')['version'], state('database')),
144 ('', '', ''), # spacer
144 ('', '', ''), # spacer
145
145
146 # Platform/Python
146 # Platform/Python
147 (_('Platform'), val('platform')['name'], state('platform')),
147 (_('Platform'), val('platform')['name'], state('platform')),
148 (_('Platform UUID'), val('platform')['uuid'], state('platform')),
148 (_('Platform UUID'), val('platform')['uuid'], state('platform')),
149 (_('Lang'), val('locale'), state('locale')),
149 (_('Lang'), val('locale'), state('locale')),
150 (_('Python version'), val('python')['version'], state('python')),
150 (_('Python version'), val('python')['version'], state('python')),
151 (_('Python path'), val('python')['executable'], state('python')),
151 (_('Python path'), val('python')['executable'], state('python')),
152 ('', '', ''), # spacer
152 ('', '', ''), # spacer
153
153
154 # Systems stats
154 # Systems stats
155 (_('CPU'), val('cpu')['text'], state('cpu')),
155 (_('CPU'), val('cpu')['text'], state('cpu')),
156 (_('Load'), val('load')['text'], state('load')),
156 (_('Load'), val('load')['text'], state('load')),
157 (_('Memory'), val('memory')['text'], state('memory')),
157 (_('Memory'), val('memory')['text'], state('memory')),
158 (_('Uptime'), val('uptime')['text'], state('uptime')),
158 (_('Uptime'), val('uptime')['text'], state('uptime')),
159 ('', '', ''), # spacer
159 ('', '', ''), # spacer
160
160
161 # ulimit
161 # ulimit
162 (_('Ulimit'), val('ulimit')['text'], state('ulimit')),
162 (_('Ulimit'), val('ulimit')['text'], state('ulimit')),
163
163
164 # Repo storage
164 # Repo storage
165 (_('Storage location'), val('storage')['path'], state('storage')),
165 (_('Storage location'), val('storage')['path'], state('storage')),
166 (_('Storage info'), val('storage')['text'], state('storage')),
166 (_('Storage info'), val('storage')['text'], state('storage')),
167 (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')),
167 (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')),
168 ('', '', ''), # spacer
168 ('', '', ''), # spacer
169
169
170 (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')),
170 (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')),
171 (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')),
171 (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')),
172 ('', '', ''), # spacer
172 ('', '', ''), # spacer
173
173
174 (_('Archive cache storage type'), val('storage_archive')['type'], state('storage_archive')),
174 (_('Artifacts storage backend'), val('storage_artifacts')['type'], state('storage_artifacts')),
175 (_('Artifacts storage location'), val('storage_artifacts')['path'], state('storage_artifacts')),
176 (_('Artifacts info'), val('storage_artifacts')['text'], state('storage_artifacts')),
177 ('', '', ''), # spacer
178
179 (_('Archive cache storage backend'), val('storage_archive')['type'], state('storage_archive')),
175 (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')),
180 (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')),
176 (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')),
181 (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')),
177 ('', '', ''), # spacer
182 ('', '', ''), # spacer
178
183
184
179 (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')),
185 (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')),
180 (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')),
186 (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')),
181 ('', '', ''), # spacer
187 ('', '', ''), # spacer
182
188
183 (_('Search info'), val('search')['text'], state('search')),
189 (_('Search info'), val('search')['text'], state('search')),
184 (_('Search location'), val('search')['location'], state('search')),
190 (_('Search location'), val('search')['location'], state('search')),
185 ('', '', ''), # spacer
191 ('', '', ''), # spacer
186
192
187 # VCS specific
193 # VCS specific
188 (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')),
194 (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')),
189 (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')),
195 (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')),
190 (_('GIT'), val('git'), state('git')),
196 (_('GIT'), val('git'), state('git')),
191 (_('HG'), val('hg'), state('hg')),
197 (_('HG'), val('hg'), state('hg')),
192 (_('SVN'), val('svn'), state('svn')),
198 (_('SVN'), val('svn'), state('svn')),
193
199
194 ]
200 ]
195
201
196 c.vcsserver_data_items = [
202 c.vcsserver_data_items = [
197 (k, v) for k, v in (val('vcs_server_config') or {}).items()
203 (k, v) for k, v in (val('vcs_server_config') or {}).items()
198 ]
204 ]
199
205
200 if snapshot:
206 if snapshot:
201 if c.allowed_to_snapshot:
207 if c.allowed_to_snapshot:
202 c.data_items.pop(0) # remove server info
208 c.data_items.pop(0) # remove server info
203 self.request.override_renderer = 'admin/settings/settings_system_snapshot.mako'
209 self.request.override_renderer = 'admin/settings/settings_system_snapshot.mako'
204 else:
210 else:
205 h.flash('You are not allowed to do this', category='warning')
211 h.flash('You are not allowed to do this', category='warning')
206 return self._get_template_context(c)
212 return self._get_template_context(c)
207
213
208 @LoginRequired()
214 @LoginRequired()
209 @HasPermissionAllDecorator('hg.admin')
215 @HasPermissionAllDecorator('hg.admin')
210 def settings_system_info_check_update(self):
216 def settings_system_info_check_update(self):
211 _ = self.request.translate
217 _ = self.request.translate
212 c = self.load_default_context()
218 c = self.load_default_context()
213
219
214 update_url = UpdateModel().get_update_url()
220 update_url = UpdateModel().get_update_url()
215
221
216 def _err(s):
222 def _err(s):
217 return f'<div style="color:#ff8888; padding:4px 0px">{s}</div>'
223 return f'<div style="color:#ff8888; padding:4px 0px">{s}</div>'
218
224
219 try:
225 try:
220 data = UpdateModel().get_update_data(update_url)
226 data = UpdateModel().get_update_data(update_url)
221 except urllib.error.URLError as e:
227 except urllib.error.URLError as e:
222 log.exception("Exception contacting upgrade server")
228 log.exception("Exception contacting upgrade server")
223 self.request.override_renderer = 'string'
229 self.request.override_renderer = 'string'
224 return _err('Failed to contact upgrade server: %r' % e)
230 return _err('Failed to contact upgrade server: %r' % e)
225 except ValueError as e:
231 except ValueError as e:
226 log.exception("Bad data sent from update server")
232 log.exception("Bad data sent from update server")
227 self.request.override_renderer = 'string'
233 self.request.override_renderer = 'string'
228 return _err('Bad data sent from update server')
234 return _err('Bad data sent from update server')
229
235
230 latest = data['versions'][0]
236 latest = data['versions'][0]
231
237
232 c.update_url = update_url
238 c.update_url = update_url
233 c.latest_data = latest
239 c.latest_data = latest
234 c.latest_ver = (latest['version'] or '').strip()
240 c.latest_ver = (latest['version'] or '').strip()
235 c.cur_ver = self.request.GET.get('ver') or rhodecode.__version__
241 c.cur_ver = self.request.GET.get('ver') or rhodecode.__version__
236 c.should_upgrade = False
242 c.should_upgrade = False
237
243
238 is_outdated = UpdateModel().is_outdated(c.cur_ver, c.latest_ver)
244 is_outdated = UpdateModel().is_outdated(c.cur_ver, c.latest_ver)
239 if is_outdated:
245 if is_outdated:
240 c.should_upgrade = True
246 c.should_upgrade = True
241 c.important_notices = latest['general']
247 c.important_notices = latest['general']
242 UpdateModel().store_version(latest['version'])
248 UpdateModel().store_version(latest['version'])
243 return self._get_template_context(c)
249 return self._get_template_context(c)
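settings_system_info_check_update above only touches a handful of fields from the update feed returned by UpdateModel().get_update_data(); a minimal sketch of the payload shape the view assumes (values are placeholders, the real feed may contain more data):

    # assumed minimal shape of the update-server response consumed above
    data = {
        'versions': [
            {
                'version': '5.2.0',                  # stripped into c.latest_ver and stored via store_version()
                'general': 'Release notes summary',  # surfaced as c.important_notices when outdated
            },
            # older releases can follow; only index 0 is read
        ]
    }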
@@ -1,66 +1,97 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import os
18 import os
19 from rhodecode.apps.file_store import config_keys
19
20
20 from rhodecode.config.settings_maker import SettingsMaker
21 from rhodecode.config.settings_maker import SettingsMaker
21
22
22
23
23 def _sanitize_settings_and_apply_defaults(settings):
24 def _sanitize_settings_and_apply_defaults(settings):
24 """
25 """
25 Set defaults, convert to python types and validate settings.
26 Set defaults, convert to python types and validate settings.
26 """
27 """
28 from rhodecode.apps.file_store import config_keys
29
30 # translate "legacy" params into new config
31 settings.pop(config_keys.deprecated_enabled, True)
32 if config_keys.deprecated_backend in settings:
33 # if legacy backend key is detected we use "legacy" backward compat setting
34 settings.pop(config_keys.deprecated_backend)
35 settings[config_keys.backend_type] = config_keys.backend_legacy_filesystem
36
37 if config_keys.deprecated_store_path in settings:
38 store_path = settings.pop(config_keys.deprecated_store_path)
39 settings[config_keys.legacy_filesystem_storage_path] = store_path
40
27 settings_maker = SettingsMaker(settings)
41 settings_maker = SettingsMaker(settings)
28
42
29 settings_maker.make_setting(config_keys.enabled, True, parser='bool')
43 default_cache_dir = settings['cache_dir']
30 settings_maker.make_setting(config_keys.backend, 'local')
44 default_store_dir = os.path.join(default_cache_dir, 'artifacts_filestore')
45
46 # set default backend
47 settings_maker.make_setting(config_keys.backend_type, config_keys.backend_legacy_filesystem)
48
49 # legacy filesystem defaults
50 settings_maker.make_setting(config_keys.legacy_filesystem_storage_path, default_store_dir, default_when_empty=True, )
31
51
32 default_store = os.path.join(os.path.dirname(settings['__file__']), 'upload_store')
52 # filesystem defaults
33 settings_maker.make_setting(config_keys.store_path, default_store)
53 settings_maker.make_setting(config_keys.filesystem_storage_path, default_store_dir, default_when_empty=True,)
54 settings_maker.make_setting(config_keys.filesystem_shards, 8, parser='int')
55
56 # objectstore defaults
57 settings_maker.make_setting(config_keys.objectstore_url, 'http://s3-minio:9000')
58 settings_maker.make_setting(config_keys.objectstore_bucket, 'rhodecode-artifacts-filestore')
59 settings_maker.make_setting(config_keys.objectstore_bucket_shards, 8, parser='int')
60
61 settings_maker.make_setting(config_keys.objectstore_region, '')
62 settings_maker.make_setting(config_keys.objectstore_key, '')
63 settings_maker.make_setting(config_keys.objectstore_secret, '')
34
64
35 settings_maker.env_expand()
65 settings_maker.env_expand()
36
66
37
67
38 def includeme(config):
68 def includeme(config):
69
39 from rhodecode.apps.file_store.views import FileStoreView
70 from rhodecode.apps.file_store.views import FileStoreView
40
71
41 settings = config.registry.settings
72 settings = config.registry.settings
42 _sanitize_settings_and_apply_defaults(settings)
73 _sanitize_settings_and_apply_defaults(settings)
43
74
44 config.add_route(
75 config.add_route(
45 name='upload_file',
76 name='upload_file',
46 pattern='/_file_store/upload')
77 pattern='/_file_store/upload')
47 config.add_view(
78 config.add_view(
48 FileStoreView,
79 FileStoreView,
49 attr='upload_file',
80 attr='upload_file',
50 route_name='upload_file', request_method='POST', renderer='json_ext')
81 route_name='upload_file', request_method='POST', renderer='json_ext')
51
82
52 config.add_route(
83 config.add_route(
53 name='download_file',
84 name='download_file',
54 pattern='/_file_store/download/{fid:.*}')
85 pattern='/_file_store/download/{fid:.*}')
55 config.add_view(
86 config.add_view(
56 FileStoreView,
87 FileStoreView,
57 attr='download_file',
88 attr='download_file',
58 route_name='download_file')
89 route_name='download_file')
59
90
60 config.add_route(
91 config.add_route(
61 name='download_file_by_token',
92 name='download_file_by_token',
62 pattern='/_file_store/token-download/{_auth_token}/{fid:.*}')
93 pattern='/_file_store/token-download/{_auth_token}/{fid:.*}')
63 config.add_view(
94 config.add_view(
64 FileStoreView,
95 FileStoreView,
65 attr='download_file_by_token',
96 attr='download_file_by_token',
66 route_name='download_file_by_token')
97 route_name='download_file_by_token')
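As a rough illustration of the legacy-key translation done by _sanitize_settings_and_apply_defaults above, the sketch below feeds it an old-style configuration (paths and values are made up; only the key names come from config_keys):

    # hypothetical old-style settings being upgraded in place
    settings = {
        'cache_dir': '/var/opt/rhodecode_data/cache',
        'file_store.enabled': 'true',               # deprecated key, simply dropped
        'file_store.backend': 'local',              # deprecated key, forces the filesystem_v1 backend
        'file_store.storage_path': '/var/opt/artifacts',
    }
    _sanitize_settings_and_apply_defaults(settings)
    # afterwards the new-style keys are populated, roughly:
    #   settings['file_store.backend.type']               == 'filesystem_v1'
    #   settings['file_store.filesystem_v1.storage_path'] == '/var/opt/artifacts'
    #   filesystem_v2 and objectstore keys receive their defaults as well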
@@ -1,25 +1,57 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19
19
20 # Definition of setting keys used to configure this module. Defined here to
20 # Definition of setting keys used to configure this module. Defined here to
21 # avoid repetition of keys throughout the module.
21 # avoid repetition of keys throughout the module.
22
22
23 enabled = 'file_store.enabled'
23 # OLD and deprecated keys not used anymore
24 backend = 'file_store.backend'
24 deprecated_enabled = 'file_store.enabled'
25 store_path = 'file_store.storage_path'
25 deprecated_backend = 'file_store.backend'
26 deprecated_store_path = 'file_store.storage_path'
27
28
29 backend_type = 'file_store.backend.type'
30
31 backend_legacy_filesystem = 'filesystem_v1'
32 backend_filesystem = 'filesystem_v2'
33 backend_objectstore = 'objectstore'
34
35 backend_types = [
36 backend_legacy_filesystem,
37 backend_filesystem,
38 backend_objectstore,
39 ]
40
41 # filesystem_v1 legacy
42 legacy_filesystem_storage_path = 'file_store.filesystem_v1.storage_path'
43
44
45 # filesystem_v2 new option
46 filesystem_storage_path = 'file_store.filesystem_v2.storage_path'
47 filesystem_shards = 'file_store.filesystem_v2.shards'
48
49 # objectstore
50 objectstore_url = 'file_store.objectstore.url'
51 objectstore_bucket = 'file_store.objectstore.bucket'
52 objectstore_bucket_shards = 'file_store.objectstore.bucket_shards'
53
54 objectstore_region = 'file_store.objectstore.region'
55 objectstore_key = 'file_store.objectstore.key'
56 objectstore_secret = 'file_store.objectstore.secret'
57
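Taken together, these constants describe three selectable backends; a hedged sketch of what an objectstore-backed configuration might pass in (endpoint, bucket and credentials are placeholders, defaults mirror the ones applied in __init__.py):

    # hypothetical objectstore selection, expressed with the constants above
    settings = {
        backend_type: backend_objectstore,            # 'file_store.backend.type' = 'objectstore'
        objectstore_url: 'http://s3-minio:9000',
        objectstore_bucket: 'rhodecode-artifacts-filestore',
        objectstore_bucket_shards: '8',
        objectstore_region: '',                       # optional
        objectstore_key: 'ACCESS_KEY_PLACEHOLDER',
        objectstore_secret: 'SECRET_KEY_PLACEHOLDER',
    }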
@@ -1,18 +1,57 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import os
20 import random
21 import tempfile
22 import string
23
24 import pytest
25
26 from rhodecode.apps.file_store import utils as store_utils
27
28
29 @pytest.fixture()
30 def file_store_instance(ini_settings):
31 config = ini_settings
32 f_store = store_utils.get_filestore_backend(config=config, always_init=True)
33 return f_store
34
35
36 @pytest.fixture
37 def random_binary_file():
38 # Generate random binary data
39 data = bytearray(random.getrandbits(8) for _ in range(1024 * 512)) # 512 KB of random data
40
41 # Create a temporary file
42 temp_file = tempfile.NamedTemporaryFile(delete=False)
43 filename = temp_file.name
44
45 try:
46 # Write the random binary data to the file
47 temp_file.write(data)
48 temp_file.seek(0) # Rewind the file pointer to the beginning
49 yield filename, temp_file
50 finally:
51 # Close and delete the temporary file after the test
52 temp_file.close()
53 os.remove(filename)
54
55
56 def generate_random_filename(length=10):
57 return ''.join(random.choices(string.ascii_letters + string.digits, k=length))
\ No newline at end of file
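A short, hypothetical sketch of how the fixtures above could be combined in a test; the store()/get_metadata() calls mirror the backend API exercised in the view tests of this changeset:

    # hypothetical test using the fixtures above
    def test_store_random_file(file_store_instance, random_binary_file):
        filename, temp_file = random_binary_file          # 512 KB temp file, already rewound
        f_name = generate_random_filename()
        store_uid, metadata = file_store_instance.store(
            f_name, temp_file, metadata={'filename': f_name})
        assert 'size' in file_store_instance.get_metadata(store_uid)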
@@ -1,246 +1,253 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18 import os
19 import os
20
19 import pytest
21 import pytest
20
22
21 from rhodecode.lib.ext_json import json
23 from rhodecode.lib.ext_json import json
22 from rhodecode.model.auth_token import AuthTokenModel
24 from rhodecode.model.auth_token import AuthTokenModel
23 from rhodecode.model.db import Session, FileStore, Repository, User
25 from rhodecode.model.db import Session, FileStore, Repository, User
24 from rhodecode.apps.file_store import utils, config_keys
26 from rhodecode.apps.file_store import utils as store_utils
27 from rhodecode.apps.file_store import config_keys
25
28
26 from rhodecode.tests import TestController
29 from rhodecode.tests import TestController
27 from rhodecode.tests.routes import route_path
30 from rhodecode.tests.routes import route_path
28
31
29
32
30 class TestFileStoreViews(TestController):
33 class TestFileStoreViews(TestController):
31
34
35 @pytest.fixture()
36 def create_artifact_factory(self, tmpdir, ini_settings):
37
38 def factory(user_id, content, f_name='example.txt'):
39
40 config = ini_settings
41 config[config_keys.backend_type] = config_keys.backend_legacy_filesystem
42
43 f_store = store_utils.get_filestore_backend(config)
44
45 filesystem_file = os.path.join(str(tmpdir), f_name)
46 with open(filesystem_file, 'wt') as f:
47 f.write(content)
48
49 with open(filesystem_file, 'rb') as f:
50 store_uid, metadata = f_store.store(f_name, f, metadata={'filename': f_name})
51 os.remove(filesystem_file)
52
53 entry = FileStore.create(
54 file_uid=store_uid, filename=metadata["filename"],
55 file_hash=metadata["sha256"], file_size=metadata["size"],
56 file_display_name='file_display_name',
57 file_description='repo artifact `{}`'.format(metadata["filename"]),
58 check_acl=True, user_id=user_id,
59 )
60 Session().add(entry)
61 Session().commit()
62 return entry
63 return factory
64
32 @pytest.mark.parametrize("fid, content, exists", [
65 @pytest.mark.parametrize("fid, content, exists", [
33 ('abcde-0.jpg', "xxxxx", True),
66 ('abcde-0.jpg', "xxxxx", True),
34 ('abcde-0.exe', "1234567", True),
67 ('abcde-0.exe', "1234567", True),
35 ('abcde-0.jpg', "xxxxx", False),
68 ('abcde-0.jpg', "xxxxx", False),
36 ])
69 ])
37 def test_get_files_from_store(self, fid, content, exists, tmpdir, user_util):
70 def test_get_files_from_store(self, fid, content, exists, tmpdir, user_util, ini_settings):
38 user = self.log_user()
71 user = self.log_user()
39 user_id = user['user_id']
72 user_id = user['user_id']
40 repo_id = user_util.create_repo().repo_id
73 repo_id = user_util.create_repo().repo_id
41 store_path = self.app._pyramid_settings[config_keys.store_path]
74
75 config = ini_settings
76 config[config_keys.backend_type] = config_keys.backend_legacy_filesystem
77
42 store_uid = fid
78 store_uid = fid
43
79
44 if exists:
80 if exists:
45 status = 200
81 status = 200
46 store = utils.get_file_storage({config_keys.store_path: store_path})
82 f_store = store_utils.get_filestore_backend(config)
47 filesystem_file = os.path.join(str(tmpdir), fid)
83 filesystem_file = os.path.join(str(tmpdir), fid)
48 with open(filesystem_file, 'wt') as f:
84 with open(filesystem_file, 'wt') as f:
49 f.write(content)
85 f.write(content)
50
86
51 with open(filesystem_file, 'rb') as f:
87 with open(filesystem_file, 'rb') as f:
52 store_uid, metadata = store.save_file(f, fid, extra_metadata={'filename': fid})
88 store_uid, metadata = f_store.store(fid, f, metadata={'filename': fid})
89 os.remove(filesystem_file)
53
90
54 entry = FileStore.create(
91 entry = FileStore.create(
55 file_uid=store_uid, filename=metadata["filename"],
92 file_uid=store_uid, filename=metadata["filename"],
56 file_hash=metadata["sha256"], file_size=metadata["size"],
93 file_hash=metadata["sha256"], file_size=metadata["size"],
57 file_display_name='file_display_name',
94 file_display_name='file_display_name',
58 file_description='repo artifact `{}`'.format(metadata["filename"]),
95 file_description='repo artifact `{}`'.format(metadata["filename"]),
59 check_acl=True, user_id=user_id,
96 check_acl=True, user_id=user_id,
60 scope_repo_id=repo_id
97 scope_repo_id=repo_id
61 )
98 )
62 Session().add(entry)
99 Session().add(entry)
63 Session().commit()
100 Session().commit()
64
101
65 else:
102 else:
66 status = 404
103 status = 404
67
104
68 response = self.app.get(route_path('download_file', fid=store_uid), status=status)
105 response = self.app.get(route_path('download_file', fid=store_uid), status=status)
69
106
70 if exists:
107 if exists:
71 assert response.text == content
108 assert response.text == content
72 file_store_path = os.path.dirname(store.resolve_name(store_uid, store_path)[1])
73 metadata_file = os.path.join(file_store_path, store_uid + '.meta')
74 assert os.path.exists(metadata_file)
75 with open(metadata_file, 'rb') as f:
76 json_data = json.loads(f.read())
77
109
78 assert json_data
110 metadata = f_store.get_metadata(store_uid)
79 assert 'size' in json_data
111
112 assert 'size' in metadata
80
113
81 def test_upload_files_without_content_to_store(self):
114 def test_upload_files_without_content_to_store(self):
82 self.log_user()
115 self.log_user()
83 response = self.app.post(
116 response = self.app.post(
84 route_path('upload_file'),
117 route_path('upload_file'),
85 params={'csrf_token': self.csrf_token},
118 params={'csrf_token': self.csrf_token},
86 status=200)
119 status=200)
87
120
88 assert response.json == {
121 assert response.json == {
89 'error': 'store_file data field is missing',
122 'error': 'store_file data field is missing',
90 'access_path': None,
123 'access_path': None,
91 'store_fid': None}
124 'store_fid': None}
92
125
93 def test_upload_files_bogus_content_to_store(self):
126 def test_upload_files_bogus_content_to_store(self):
94 self.log_user()
127 self.log_user()
95 response = self.app.post(
128 response = self.app.post(
96 route_path('upload_file'),
129 route_path('upload_file'),
97 params={'csrf_token': self.csrf_token, 'store_file': 'bogus'},
130 params={'csrf_token': self.csrf_token, 'store_file': 'bogus'},
98 status=200)
131 status=200)
99
132
100 assert response.json == {
133 assert response.json == {
101 'error': 'filename cannot be read from the data field',
134 'error': 'filename cannot be read from the data field',
102 'access_path': None,
135 'access_path': None,
103 'store_fid': None}
136 'store_fid': None}
104
137
105 def test_upload_content_to_store(self):
138 def test_upload_content_to_store(self):
106 self.log_user()
139 self.log_user()
107 response = self.app.post(
140 response = self.app.post(
108 route_path('upload_file'),
141 route_path('upload_file'),
109 upload_files=[('store_file', b'myfile.txt', b'SOME CONTENT')],
142 upload_files=[('store_file', b'myfile.txt', b'SOME CONTENT')],
110 params={'csrf_token': self.csrf_token},
143 params={'csrf_token': self.csrf_token},
111 status=200)
144 status=200)
112
145
113 assert response.json['store_fid']
146 assert response.json['store_fid']
114
147
115 @pytest.fixture()
116 def create_artifact_factory(self, tmpdir):
117 def factory(user_id, content):
118 store_path = self.app._pyramid_settings[config_keys.store_path]
119 store = utils.get_file_storage({config_keys.store_path: store_path})
120 fid = 'example.txt'
121
122 filesystem_file = os.path.join(str(tmpdir), fid)
123 with open(filesystem_file, 'wt') as f:
124 f.write(content)
125
126 with open(filesystem_file, 'rb') as f:
127 store_uid, metadata = store.save_file(f, fid, extra_metadata={'filename': fid})
128
129 entry = FileStore.create(
130 file_uid=store_uid, filename=metadata["filename"],
131 file_hash=metadata["sha256"], file_size=metadata["size"],
132 file_display_name='file_display_name',
133 file_description='repo artifact `{}`'.format(metadata["filename"]),
134 check_acl=True, user_id=user_id,
135 )
136 Session().add(entry)
137 Session().commit()
138 return entry
139 return factory
140
141 def test_download_file_non_scoped(self, user_util, create_artifact_factory):
148 def test_download_file_non_scoped(self, user_util, create_artifact_factory):
142 user = self.log_user()
149 user = self.log_user()
143 user_id = user['user_id']
150 user_id = user['user_id']
144 content = 'HELLO MY NAME IS ARTIFACT !'
151 content = 'HELLO MY NAME IS ARTIFACT !'
145
152
146 artifact = create_artifact_factory(user_id, content)
153 artifact = create_artifact_factory(user_id, content)
147 file_uid = artifact.file_uid
154 file_uid = artifact.file_uid
148 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
155 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
149 assert response.text == content
156 assert response.text == content
150
157
151 # log-in to new user and test download again
158 # log-in to new user and test download again
152 user = user_util.create_user(password='qweqwe')
159 user = user_util.create_user(password='qweqwe')
153 self.log_user(user.username, 'qweqwe')
160 self.log_user(user.username, 'qweqwe')
154 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
161 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
155 assert response.text == content
162 assert response.text == content
156
163
157 def test_download_file_scoped_to_repo(self, user_util, create_artifact_factory):
164 def test_download_file_scoped_to_repo(self, user_util, create_artifact_factory):
158 user = self.log_user()
165 user = self.log_user()
159 user_id = user['user_id']
166 user_id = user['user_id']
160 content = 'HELLO MY NAME IS ARTIFACT !'
167 content = 'HELLO MY NAME IS ARTIFACT !'
161
168
162 artifact = create_artifact_factory(user_id, content)
169 artifact = create_artifact_factory(user_id, content)
163 # bind to repo
170 # bind to repo
164 repo = user_util.create_repo()
171 repo = user_util.create_repo()
165 repo_id = repo.repo_id
172 repo_id = repo.repo_id
166 artifact.scope_repo_id = repo_id
173 artifact.scope_repo_id = repo_id
167 Session().add(artifact)
174 Session().add(artifact)
168 Session().commit()
175 Session().commit()
169
176
170 file_uid = artifact.file_uid
177 file_uid = artifact.file_uid
171 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
178 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
172 assert response.text == content
179 assert response.text == content
173
180
174 # log-in to new user and test download again
181 # log-in to new user and test download again
175 user = user_util.create_user(password='qweqwe')
182 user = user_util.create_user(password='qweqwe')
176 self.log_user(user.username, 'qweqwe')
183 self.log_user(user.username, 'qweqwe')
177 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
184 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
178 assert response.text == content
185 assert response.text == content
179
186
180 # forbid user the rights to repo
187 # forbid user the rights to repo
181 repo = Repository.get(repo_id)
188 repo = Repository.get(repo_id)
182 user_util.grant_user_permission_to_repo(repo, user, 'repository.none')
189 user_util.grant_user_permission_to_repo(repo, user, 'repository.none')
183 self.app.get(route_path('download_file', fid=file_uid), status=404)
190 self.app.get(route_path('download_file', fid=file_uid), status=404)
184
191
185 def test_download_file_scoped_to_user(self, user_util, create_artifact_factory):
192 def test_download_file_scoped_to_user(self, user_util, create_artifact_factory):
186 user = self.log_user()
193 user = self.log_user()
187 user_id = user['user_id']
194 user_id = user['user_id']
188 content = 'HELLO MY NAME IS ARTIFACT !'
195 content = 'HELLO MY NAME IS ARTIFACT !'
189
196
190 artifact = create_artifact_factory(user_id, content)
197 artifact = create_artifact_factory(user_id, content)
191 # bind to user
198 # bind to user
192 user = user_util.create_user(password='qweqwe')
199 user = user_util.create_user(password='qweqwe')
193
200
194 artifact.scope_user_id = user.user_id
201 artifact.scope_user_id = user.user_id
195 Session().add(artifact)
202 Session().add(artifact)
196 Session().commit()
203 Session().commit()
197
204
198 # artifact creator doesn't have access since it's bound to another user
205 # artifact creator doesn't have access since it's bound to another user
199 file_uid = artifact.file_uid
206 file_uid = artifact.file_uid
200 self.app.get(route_path('download_file', fid=file_uid), status=404)
207 self.app.get(route_path('download_file', fid=file_uid), status=404)
201
208
202 # log-in to new user and test download again, should be ok since we're bound to this artifact
209 # log-in to new user and test download again, should be ok since we're bound to this artifact
203 self.log_user(user.username, 'qweqwe')
210 self.log_user(user.username, 'qweqwe')
204 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
211 response = self.app.get(route_path('download_file', fid=file_uid), status=200)
205 assert response.text == content
212 assert response.text == content
206
213
207 def test_download_file_scoped_to_repo_with_bad_token(self, user_util, create_artifact_factory):
214 def test_download_file_scoped_to_repo_with_bad_token(self, user_util, create_artifact_factory):
208 user_id = User.get_first_super_admin().user_id
215 user_id = User.get_first_super_admin().user_id
209 content = 'HELLO MY NAME IS ARTIFACT !'
216 content = 'HELLO MY NAME IS ARTIFACT !'
210
217
211 artifact = create_artifact_factory(user_id, content)
218 artifact = create_artifact_factory(user_id, content)
212 # bind to repo
219 # bind to repo
213 repo = user_util.create_repo()
220 repo = user_util.create_repo()
214 repo_id = repo.repo_id
221 repo_id = repo.repo_id
215 artifact.scope_repo_id = repo_id
222 artifact.scope_repo_id = repo_id
216 Session().add(artifact)
223 Session().add(artifact)
217 Session().commit()
224 Session().commit()
218
225
219 file_uid = artifact.file_uid
226 file_uid = artifact.file_uid
220 self.app.get(route_path('download_file_by_token',
227 self.app.get(route_path('download_file_by_token',
221 _auth_token='bogus', fid=file_uid), status=302)
228 _auth_token='bogus', fid=file_uid), status=302)
222
229
223 def test_download_file_scoped_to_repo_with_token(self, user_util, create_artifact_factory):
230 def test_download_file_scoped_to_repo_with_token(self, user_util, create_artifact_factory):
224 user = User.get_first_super_admin()
231 user = User.get_first_super_admin()
225 AuthTokenModel().create(user, 'test artifact token',
232 AuthTokenModel().create(user, 'test artifact token',
226 role=AuthTokenModel.cls.ROLE_ARTIFACT_DOWNLOAD)
233 role=AuthTokenModel.cls.ROLE_ARTIFACT_DOWNLOAD)
227
234
228 user = User.get_first_super_admin()
235 user = User.get_first_super_admin()
229 artifact_token = user.artifact_token
236 artifact_token = user.artifact_token
230
237
231 user_id = User.get_first_super_admin().user_id
238 user_id = User.get_first_super_admin().user_id
232 content = 'HELLO MY NAME IS ARTIFACT !'
239 content = 'HELLO MY NAME IS ARTIFACT !'
233
240
234 artifact = create_artifact_factory(user_id, content)
241 artifact = create_artifact_factory(user_id, content)
235 # bind to repo
242 # bind to repo
236 repo = user_util.create_repo()
243 repo = user_util.create_repo()
237 repo_id = repo.repo_id
244 repo_id = repo.repo_id
238 artifact.scope_repo_id = repo_id
245 artifact.scope_repo_id = repo_id
239 Session().add(artifact)
246 Session().add(artifact)
240 Session().commit()
247 Session().commit()
241
248
242 file_uid = artifact.file_uid
249 file_uid = artifact.file_uid
243 response = self.app.get(
250 response = self.app.get(
244 route_path('download_file_by_token',
251 route_path('download_file_by_token',
245 _auth_token=artifact_token, fid=file_uid), status=200)
252 _auth_token=artifact_token, fid=file_uid), status=200)
246 assert response.text == content
253 assert response.text == content
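
The tests above exercise the two access paths for artifacts: the session-based `download_file` route, which applies user/repo scoping, and the token-based `download_file_by_token` route, which carries an auth token with the artifact-download role directly in the URL. A minimal client-side sketch of the token flow, assuming a hypothetical instance URL and placeholder token/uid; the `/_file_store/token-download/TOKEN/FILE_UID` pattern is the one documented on the view further down::

    import requests

    BASE_URL = "https://code.example.com"               # hypothetical RhodeCode instance
    ARTIFACT_TOKEN = "<auth token with artifact download role>"
    FILE_UID = "<file_uid of the stored artifact>"

    url = f"{BASE_URL}/_file_store/token-download/{ARTIFACT_TOKEN}/{FILE_UID}"

    # a valid token returns 200 with the file body; a bogus token is
    # redirected (302), matching test_download_file_scoped_to_repo_with_bad_token
    response = requests.get(url, allow_redirects=False)
    if response.status_code == 200:
        with open("artifact.bin", "wb") as fh:
            fh.write(response.content)
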
@@ -1,55 +1,145 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import io
19 import io
20 import uuid
20 import uuid
21 import pathlib
21 import pathlib
22 import s3fs
23
24 from rhodecode.lib.hash_utils import sha256_safe
25 from rhodecode.apps.file_store import config_keys
26
27
28 file_store_meta = None
29
30
31 def get_filestore_config(config) -> dict:
32
33 final_config = {}
34
35 for k, v in config.items():
36 if k.startswith('file_store'):
37 final_config[k] = v
38
39 return final_config
22
40
23
41
24 def get_file_storage(settings):
42 def get_filestore_backend(config, always_init=False):
25 from rhodecode.apps.file_store.backends.local_store import LocalFileStorage
43 """
26 from rhodecode.apps.file_store import config_keys
44
27 store_path = settings.get(config_keys.store_path)
45 usage::
28 return LocalFileStorage(base_path=store_path)
46 from rhodecode.apps.file_store import get_filestore_backend
47 f_store = get_filestore_backend(config=CONFIG)
48
49 :param config: application settings dict; only the file_store.* keys are read
50 :param always_init: when True, build a fresh backend even if one is already cached
51 :return: initialized file store backend
52 """
53
54 global file_store_meta
55 if file_store_meta is not None and not always_init:
56 return file_store_meta
57
58 config = get_filestore_config(config)
59 backend = config[config_keys.backend_type]
60
61 match backend:
62 case config_keys.backend_legacy_filesystem:
63 # Legacy backward compatible storage
64 from rhodecode.apps.file_store.backends.filesystem_legacy import LegacyFileSystemBackend
65 d_cache = LegacyFileSystemBackend(
66 settings=config
67 )
68 case config_keys.backend_filesystem:
69 from rhodecode.apps.file_store.backends.filesystem import FileSystemBackend
70 d_cache = FileSystemBackend(
71 settings=config
72 )
73 case config_keys.backend_objectstore:
74 from rhodecode.apps.file_store.backends.objectstore import ObjectStoreBackend
75 d_cache = ObjectStoreBackend(
76 settings=config
77 )
78 case _:
79 raise ValueError(
80 f'file_store.backend.type only supports "{config_keys.backend_types}" got {backend}'
81 )
82
83 file_store_meta = d_cache
84 return file_store_meta
29
85
30
86
31 def splitext(filename):
87 def splitext(filename):
32 ext = ''.join(pathlib.Path(filename).suffixes)
88 final_ext = []
89 for suffix in pathlib.Path(filename).suffixes:
90 if not suffix.isascii():
91 continue
92
93 suffix = " ".join(suffix.split()).replace(" ", "")
94 final_ext.append(suffix)
95 ext = ''.join(final_ext)
33 return filename, ext
96 return filename, ext
34
97
35
98
36 def uid_filename(filename, randomized=True):
99 def get_uid_filename(filename, randomized=True):
37 """
100 """
38 Generates a randomized or stable (uuid) filename,
101 Generates a randomized or stable (uuid) filename,
39 preserving the original extension.
102 preserving the original extension.
40
103
41 :param filename: the original filename
104 :param filename: the original filename
42 :param randomized: define if filename should be stable (sha1 based) or randomized
105 :param randomized: define if filename should be stable (sha1 based) or randomized
43 """
106 """
44
107
45 _, ext = splitext(filename)
108 _, ext = splitext(filename)
46 if randomized:
109 if randomized:
47 uid = uuid.uuid4()
110 uid = uuid.uuid4()
48 else:
111 else:
49 hash_key = '{}.{}'.format(filename, 'store')
112 store_suffix = "store"
113 hash_key = f'{filename}.{store_suffix}'
50 uid = uuid.uuid5(uuid.NAMESPACE_URL, hash_key)
114 uid = uuid.uuid5(uuid.NAMESPACE_URL, hash_key)
51 return str(uid) + ext.lower()
115 return str(uid) + ext.lower()
52
116
53
117
54 def bytes_to_file_obj(bytes_data):
118 def bytes_to_file_obj(bytes_data):
55 return io.StringIO(bytes_data)
119 return io.BytesIO(bytes_data)
120
121
122 class ShardFileReader:
123
124 def __init__(self, file_like_reader):
125 self._file_like_reader = file_like_reader
126
127 def __getattr__(self, item):
128 if isinstance(self._file_like_reader, s3fs.core.S3File):
129 match item:
130 case 'name':
131 # S3 FileWrapper doesn't support name attribute, and we use it
132 return self._file_like_reader.full_name
133 case _:
134 return getattr(self._file_like_reader, item)
135 else:
136 return getattr(self._file_like_reader, item)
137
138
139 def archive_iterator(_reader, block_size: int = 4096 * 512):
140 # default block_size: 4096 * 512 = 2MB
141 while 1:
142 data = _reader.read(block_size)
143 if not data:
144 break
145 yield data
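
Taken together, the new helpers in this module let callers resolve a storage backend from configuration, store a file, and stream it back in fixed-size chunks. A short sketch under stated assumptions: `CONFIG` is a placeholder settings dict carrying the `file_store.*` keys, and the `store()`/`fetch()` call shapes are the ones used by the view code further down (store returns a `(store_uid, metadata)` pair, fetch returns a reader plus metadata)::

    import io
    from rhodecode.apps.file_store import utils as store_utils

    CONFIG = {}  # placeholder: application settings containing the file_store.* keys

    # the backend (legacy filesystem, filesystem, or objectstore) is selected
    # from the file_store backend type key
    f_store = store_utils.get_filestore_backend(config=CONFIG)

    store_uid, metadata = f_store.store(
        'report.pdf', io.BytesIO(b'binary payload'),
        extra_metadata={'source': 'example'})

    reader, meta = f_store.fetch(store_uid)
    for chunk in store_utils.archive_iterator(reader):
        ...  # write each chunk to a response or a local file
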
@@ -1,200 +1,197 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18 import logging
18 import logging
19
19
20
20 from pyramid.response import Response
21 from pyramid.response import FileResponse
22 from pyramid.httpexceptions import HTTPFound, HTTPNotFound
21 from pyramid.httpexceptions import HTTPFound, HTTPNotFound
23
22
24 from rhodecode.apps._base import BaseAppView
23 from rhodecode.apps._base import BaseAppView
25 from rhodecode.apps.file_store import utils
24 from rhodecode.apps.file_store import utils as store_utils
26 from rhodecode.apps.file_store.exceptions import (
25 from rhodecode.apps.file_store.exceptions import (
27 FileNotAllowedException, FileOverSizeException)
26 FileNotAllowedException, FileOverSizeException)
28
27
29 from rhodecode.lib import helpers as h
28 from rhodecode.lib import helpers as h
30 from rhodecode.lib import audit_logger
29 from rhodecode.lib import audit_logger
31 from rhodecode.lib.auth import (
30 from rhodecode.lib.auth import (
32 CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny,
31 CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny,
33 LoginRequired)
32 LoginRequired)
33 from rhodecode.lib.str_utils import header_safe_str
34 from rhodecode.lib.vcs.conf.mtypes import get_mimetypes_db
34 from rhodecode.lib.vcs.conf.mtypes import get_mimetypes_db
35 from rhodecode.model.db import Session, FileStore, UserApiKeys
35 from rhodecode.model.db import Session, FileStore, UserApiKeys
36
36
37 log = logging.getLogger(__name__)
37 log = logging.getLogger(__name__)
38
38
39
39
40 class FileStoreView(BaseAppView):
40 class FileStoreView(BaseAppView):
41 upload_key = 'store_file'
41 upload_key = 'store_file'
42
42
43 def load_default_context(self):
43 def load_default_context(self):
44 c = self._get_local_tmpl_context()
44 c = self._get_local_tmpl_context()
45 self.storage = utils.get_file_storage(self.request.registry.settings)
45 self.f_store = store_utils.get_filestore_backend(self.request.registry.settings)
46 return c
46 return c
47
47
48 def _guess_type(self, file_name):
48 def _guess_type(self, file_name):
49 """
49 """
50 Our own type guesser for mimetypes using the rich DB
50 Our own type guesser for mimetypes using the rich DB
51 """
51 """
52 if not hasattr(self, 'db'):
52 if not hasattr(self, 'db'):
53 self.db = get_mimetypes_db()
53 self.db = get_mimetypes_db()
54 _content_type, _encoding = self.db.guess_type(file_name, strict=False)
54 _content_type, _encoding = self.db.guess_type(file_name, strict=False)
55 return _content_type, _encoding
55 return _content_type, _encoding
56
56
57 def _serve_file(self, file_uid):
57 def _serve_file(self, file_uid):
58 if not self.storage.exists(file_uid):
58 if not self.f_store.filename_exists(file_uid):
59 store_path = self.storage.store_path(file_uid)
59 store_path = self.f_store.store_path(file_uid)
60 log.debug('File with FID:%s not found in the store under `%s`',
60 log.warning('File with FID:%s not found in the store under `%s`',
61 file_uid, store_path)
61 file_uid, store_path)
62 raise HTTPNotFound()
62 raise HTTPNotFound()
63
63
64 db_obj = FileStore.get_by_store_uid(file_uid, safe=True)
64 db_obj = FileStore.get_by_store_uid(file_uid, safe=True)
65 if not db_obj:
65 if not db_obj:
66 raise HTTPNotFound()
66 raise HTTPNotFound()
67
67
68 # private upload for user
68 # private upload for user
69 if db_obj.check_acl and db_obj.scope_user_id:
69 if db_obj.check_acl and db_obj.scope_user_id:
70 log.debug('Artifact: checking scope access for bound artifact user: `%s`',
70 log.debug('Artifact: checking scope access for bound artifact user: `%s`',
71 db_obj.scope_user_id)
71 db_obj.scope_user_id)
72 user = db_obj.user
72 user = db_obj.user
73 if self._rhodecode_db_user.user_id != user.user_id:
73 if self._rhodecode_db_user.user_id != user.user_id:
74 log.warning('Access to file store object forbidden')
74 log.warning('Access to file store object forbidden')
75 raise HTTPNotFound()
75 raise HTTPNotFound()
76
76
77 # scoped to repository permissions
77 # scoped to repository permissions
78 if db_obj.check_acl and db_obj.scope_repo_id:
78 if db_obj.check_acl and db_obj.scope_repo_id:
79 log.debug('Artifact: checking scope access for bound artifact repo: `%s`',
79 log.debug('Artifact: checking scope access for bound artifact repo: `%s`',
80 db_obj.scope_repo_id)
80 db_obj.scope_repo_id)
81 repo = db_obj.repo
81 repo = db_obj.repo
82 perm_set = ['repository.read', 'repository.write', 'repository.admin']
82 perm_set = ['repository.read', 'repository.write', 'repository.admin']
83 has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check')
83 has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check')
84 if not has_perm:
84 if not has_perm:
85 log.warning('Access to file store object `%s` forbidden', file_uid)
85 log.warning('Access to file store object `%s` forbidden', file_uid)
86 raise HTTPNotFound()
86 raise HTTPNotFound()
87
87
88 # scoped to repository group permissions
88 # scoped to repository group permissions
89 if db_obj.check_acl and db_obj.scope_repo_group_id:
89 if db_obj.check_acl and db_obj.scope_repo_group_id:
90 log.debug('Artifact: checking scope access for bound artifact repo group: `%s`',
90 log.debug('Artifact: checking scope access for bound artifact repo group: `%s`',
91 db_obj.scope_repo_group_id)
91 db_obj.scope_repo_group_id)
92 repo_group = db_obj.repo_group
92 repo_group = db_obj.repo_group
93 perm_set = ['group.read', 'group.write', 'group.admin']
93 perm_set = ['group.read', 'group.write', 'group.admin']
94 has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check')
94 has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check')
95 if not has_perm:
95 if not has_perm:
96 log.warning('Access to file store object `%s` forbidden', file_uid)
96 log.warning('Access to file store object `%s` forbidden', file_uid)
97 raise HTTPNotFound()
97 raise HTTPNotFound()
98
98
99 FileStore.bump_access_counter(file_uid)
99 FileStore.bump_access_counter(file_uid)
100
100
101 file_path = self.storage.store_path(file_uid)
101 file_name = db_obj.file_display_name
102 content_type = 'application/octet-stream'
102 content_type = 'application/octet-stream'
103 content_encoding = None
104
103
105 _content_type, _encoding = self._guess_type(file_path)
104 _content_type, _encoding = self._guess_type(file_name)
106 if _content_type:
105 if _content_type:
107 content_type = _content_type
106 content_type = _content_type
108
107
109 # For file store we don't submit any session data, this logic tells the
108 # For file store we don't submit any session data, this logic tells the
110 # Session lib to skip it
109 # Session lib to skip it
111 setattr(self.request, '_file_response', True)
110 setattr(self.request, '_file_response', True)
112 response = FileResponse(
111 reader, _meta = self.f_store.fetch(file_uid)
113 file_path, request=self.request,
114 content_type=content_type, content_encoding=content_encoding)
115
112
116 file_name = db_obj.file_display_name
113 response = Response(app_iter=store_utils.archive_iterator(reader))
117
114
118 response.headers["Content-Disposition"] = (
115 response.content_type = str(content_type)
119 f'attachment; filename="{str(file_name)}"'
116 response.content_disposition = f'attachment; filename="{header_safe_str(file_name)}"'
120 )
117
121 response.headers["X-RC-Artifact-Id"] = str(db_obj.file_store_id)
118 response.headers["X-RC-Artifact-Id"] = str(db_obj.file_store_id)
122 response.headers["X-RC-Artifact-Desc"] = str(db_obj.file_description)
119 response.headers["X-RC-Artifact-Desc"] = header_safe_str(db_obj.file_description)
123 response.headers["X-RC-Artifact-Sha256"] = str(db_obj.file_hash)
120 response.headers["X-RC-Artifact-Sha256"] = str(db_obj.file_hash)
124 return response
121 return response
125
122
126 @LoginRequired()
123 @LoginRequired()
127 @NotAnonymous()
124 @NotAnonymous()
128 @CSRFRequired()
125 @CSRFRequired()
129 def upload_file(self):
126 def upload_file(self):
130 self.load_default_context()
127 self.load_default_context()
131 file_obj = self.request.POST.get(self.upload_key)
128 file_obj = self.request.POST.get(self.upload_key)
132
129
133 if file_obj is None:
130 if file_obj is None:
134 return {'store_fid': None,
131 return {'store_fid': None,
135 'access_path': None,
132 'access_path': None,
136 'error': f'{self.upload_key} data field is missing'}
133 'error': f'{self.upload_key} data field is missing'}
137
134
138 if not hasattr(file_obj, 'filename'):
135 if not hasattr(file_obj, 'filename'):
139 return {'store_fid': None,
136 return {'store_fid': None,
140 'access_path': None,
137 'access_path': None,
141 'error': 'filename cannot be read from the data field'}
138 'error': 'filename cannot be read from the data field'}
142
139
143 filename = file_obj.filename
140 filename = file_obj.filename
144
141
145 metadata = {
142 metadata = {
146 'user_uploaded': {'username': self._rhodecode_user.username,
143 'user_uploaded': {'username': self._rhodecode_user.username,
147 'user_id': self._rhodecode_user.user_id,
144 'user_id': self._rhodecode_user.user_id,
148 'ip': self._rhodecode_user.ip_addr}}
145 'ip': self._rhodecode_user.ip_addr}}
149 try:
146 try:
150 store_uid, metadata = self.storage.save_file(
147 store_uid, metadata = self.f_store.store(
151 file_obj.file, filename, extra_metadata=metadata)
148 filename, file_obj.file, extra_metadata=metadata)
152 except FileNotAllowedException:
149 except FileNotAllowedException:
153 return {'store_fid': None,
150 return {'store_fid': None,
154 'access_path': None,
151 'access_path': None,
155 'error': f'File {filename} is not allowed.'}
152 'error': f'File {filename} is not allowed.'}
156
153
157 except FileOverSizeException:
154 except FileOverSizeException:
158 return {'store_fid': None,
155 return {'store_fid': None,
159 'access_path': None,
156 'access_path': None,
160 'error': f'File {filename} is exceeding allowed limit.'}
157 'error': f'File {filename} is exceeding allowed limit.'}
161
158
162 try:
159 try:
163 entry = FileStore.create(
160 entry = FileStore.create(
164 file_uid=store_uid, filename=metadata["filename"],
161 file_uid=store_uid, filename=metadata["filename"],
165 file_hash=metadata["sha256"], file_size=metadata["size"],
162 file_hash=metadata["sha256"], file_size=metadata["size"],
166 file_description='upload attachment',
163 file_description='upload attachment',
167 check_acl=False, user_id=self._rhodecode_user.user_id
164 check_acl=False, user_id=self._rhodecode_user.user_id
168 )
165 )
169 Session().add(entry)
166 Session().add(entry)
170 Session().commit()
167 Session().commit()
171 log.debug('Stored upload in DB as %s', entry)
168 log.debug('Stored upload in DB as %s', entry)
172 except Exception:
169 except Exception:
173 log.exception('Failed to store file %s', filename)
170 log.exception('Failed to store file %s', filename)
174 return {'store_fid': None,
171 return {'store_fid': None,
175 'access_path': None,
172 'access_path': None,
176 'error': f'File {filename} failed to store in DB.'}
173 'error': f'File {filename} failed to store in DB.'}
177
174
178 return {'store_fid': store_uid,
175 return {'store_fid': store_uid,
179 'access_path': h.route_path('download_file', fid=store_uid)}
176 'access_path': h.route_path('download_file', fid=store_uid)}
180
177
181 # ACL is checked by scopes, if no scope the file is accessible to all
178 # ACL is checked by scopes, if no scope the file is accessible to all
182 def download_file(self):
179 def download_file(self):
183 self.load_default_context()
180 self.load_default_context()
184 file_uid = self.request.matchdict['fid']
181 file_uid = self.request.matchdict['fid']
185 log.debug('Requesting FID:%s from store %s', file_uid, self.storage)
182 log.debug('Requesting FID:%s from store %s', file_uid, self.f_store)
186 return self._serve_file(file_uid)
183 return self._serve_file(file_uid)
187
184
188 # in addition to @LoginRequired ACL is checked by scopes
185 # in addition to @LoginRequired ACL is checked by scopes
189 @LoginRequired(auth_token_access=[UserApiKeys.ROLE_ARTIFACT_DOWNLOAD])
186 @LoginRequired(auth_token_access=[UserApiKeys.ROLE_ARTIFACT_DOWNLOAD])
190 @NotAnonymous()
187 @NotAnonymous()
191 def download_file_by_token(self):
188 def download_file_by_token(self):
192 """
189 """
193 Special view that allows downloading the file via a URL that carries
190 Special view that allows downloading the file via a URL that carries
194 the auth token inside it.
191 the auth token inside it.
195
192
196 http://example.com/_file_store/token-download/TOKEN/FILE_UID
193 http://example.com/_file_store/token-download/TOKEN/FILE_UID
197 """
194 """
198 self.load_default_context()
195 self.load_default_context()
199 file_uid = self.request.matchdict['fid']
196 file_uid = self.request.matchdict['fid']
200 return self._serve_file(file_uid)
197 return self._serve_file(file_uid)
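
For completeness, a hedged sketch of driving the upload endpoint above from a client: ``FileStoreView.upload_file`` expects an authenticated session, a CSRF token, and the file posted under the ``store_file`` key, and answers with JSON carrying ``store_fid`` and ``access_path``. The upload URL and the CSRF field name below are assumptions, not confirmed by this diff::

    import requests

    session = requests.Session()
    # ... authenticate the session against the RhodeCode instance first ...

    UPLOAD_URL = "https://code.example.com/_file_store/upload"   # hypothetical route path
    CSRF_TOKEN = "<csrf token of the current session>"

    with open("build.log", "rb") as fh:
        resp = session.post(
            UPLOAD_URL,
            data={"csrf_token": CSRF_TOKEN},           # assumed field name
            files={"store_file": ("build.log", fh)},   # upload_key used by FileStoreView
        )

    # on success the JSON carries store_fid and the download access_path
    print(resp.json())
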
@@ -1,830 +1,830 b''
1 # Copyright (C) 2010-2023 RhodeCode GmbH
1 # Copyright (C) 2010-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import logging
19 import logging
20 import collections
20 import collections
21
21
22 from pyramid.httpexceptions import (
22 from pyramid.httpexceptions import (
23 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
23 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
24 from pyramid.renderers import render
24 from pyramid.renderers import render
25 from pyramid.response import Response
25 from pyramid.response import Response
26
26
27 from rhodecode.apps._base import RepoAppView
27 from rhodecode.apps._base import RepoAppView
28 from rhodecode.apps.file_store import utils as store_utils
28 from rhodecode.apps.file_store import utils as store_utils
29 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
29 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
30
30
31 from rhodecode.lib import diffs, codeblocks, channelstream
31 from rhodecode.lib import diffs, codeblocks, channelstream
32 from rhodecode.lib.auth import (
32 from rhodecode.lib.auth import (
33 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
33 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
34 from rhodecode.lib import ext_json
34 from rhodecode.lib import ext_json
35 from collections import OrderedDict
35 from collections import OrderedDict
36 from rhodecode.lib.diffs import (
36 from rhodecode.lib.diffs import (
37 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
37 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
38 get_diff_whitespace_flag)
38 get_diff_whitespace_flag)
39 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
39 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
40 import rhodecode.lib.helpers as h
40 import rhodecode.lib.helpers as h
41 from rhodecode.lib.utils2 import str2bool, StrictAttributeDict, safe_str
41 from rhodecode.lib.utils2 import str2bool, StrictAttributeDict, safe_str
42 from rhodecode.lib.vcs.backends.base import EmptyCommit
42 from rhodecode.lib.vcs.backends.base import EmptyCommit
43 from rhodecode.lib.vcs.exceptions import (
43 from rhodecode.lib.vcs.exceptions import (
44 RepositoryError, CommitDoesNotExistError)
44 RepositoryError, CommitDoesNotExistError)
45 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
45 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
46 ChangesetCommentHistory
46 ChangesetCommentHistory
47 from rhodecode.model.changeset_status import ChangesetStatusModel
47 from rhodecode.model.changeset_status import ChangesetStatusModel
48 from rhodecode.model.comment import CommentsModel
48 from rhodecode.model.comment import CommentsModel
49 from rhodecode.model.meta import Session
49 from rhodecode.model.meta import Session
50 from rhodecode.model.settings import VcsSettingsModel
50 from rhodecode.model.settings import VcsSettingsModel
51
51
52 log = logging.getLogger(__name__)
52 log = logging.getLogger(__name__)
53
53
54
54
55 def _update_with_GET(params, request):
55 def _update_with_GET(params, request):
56 for k in ['diff1', 'diff2', 'diff']:
56 for k in ['diff1', 'diff2', 'diff']:
57 params[k] += request.GET.getall(k)
57 params[k] += request.GET.getall(k)
58
58
59
59
60 class RepoCommitsView(RepoAppView):
60 class RepoCommitsView(RepoAppView):
61 def load_default_context(self):
61 def load_default_context(self):
62 c = self._get_local_tmpl_context(include_app_defaults=True)
62 c = self._get_local_tmpl_context(include_app_defaults=True)
63 c.rhodecode_repo = self.rhodecode_vcs_repo
63 c.rhodecode_repo = self.rhodecode_vcs_repo
64
64
65 return c
65 return c
66
66
67 def _is_diff_cache_enabled(self, target_repo):
67 def _is_diff_cache_enabled(self, target_repo):
68 caching_enabled = self._get_general_setting(
68 caching_enabled = self._get_general_setting(
69 target_repo, 'rhodecode_diff_cache')
69 target_repo, 'rhodecode_diff_cache')
70 log.debug('Diff caching enabled: %s', caching_enabled)
70 log.debug('Diff caching enabled: %s', caching_enabled)
71 return caching_enabled
71 return caching_enabled
72
72
73 def _commit(self, commit_id_range, method):
73 def _commit(self, commit_id_range, method):
74 _ = self.request.translate
74 _ = self.request.translate
75 c = self.load_default_context()
75 c = self.load_default_context()
76 c.fulldiff = self.request.GET.get('fulldiff')
76 c.fulldiff = self.request.GET.get('fulldiff')
77 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
77 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
78
78
79 # fetch global flags of ignore ws or context lines
79 # fetch global flags of ignore ws or context lines
80 diff_context = get_diff_context(self.request)
80 diff_context = get_diff_context(self.request)
81 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
81 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
82
82
83 # diff_limit will cut off the whole diff if the limit is applied
83 # diff_limit will cut off the whole diff if the limit is applied
84 # otherwise it will just hide the big files from the front-end
84 # otherwise it will just hide the big files from the front-end
85 diff_limit = c.visual.cut_off_limit_diff
85 diff_limit = c.visual.cut_off_limit_diff
86 file_limit = c.visual.cut_off_limit_file
86 file_limit = c.visual.cut_off_limit_file
87
87
88 # get ranges of commit ids if present
88 # get ranges of commit ids if present
89 commit_range = commit_id_range.split('...')[:2]
89 commit_range = commit_id_range.split('...')[:2]
90
90
91 try:
91 try:
92 pre_load = ['affected_files', 'author', 'branch', 'date',
92 pre_load = ['affected_files', 'author', 'branch', 'date',
93 'message', 'parents']
93 'message', 'parents']
94 if self.rhodecode_vcs_repo.alias == 'hg':
94 if self.rhodecode_vcs_repo.alias == 'hg':
95 pre_load += ['hidden', 'obsolete', 'phase']
95 pre_load += ['hidden', 'obsolete', 'phase']
96
96
97 if len(commit_range) == 2:
97 if len(commit_range) == 2:
98 commits = self.rhodecode_vcs_repo.get_commits(
98 commits = self.rhodecode_vcs_repo.get_commits(
99 start_id=commit_range[0], end_id=commit_range[1],
99 start_id=commit_range[0], end_id=commit_range[1],
100 pre_load=pre_load, translate_tags=False)
100 pre_load=pre_load, translate_tags=False)
101 commits = list(commits)
101 commits = list(commits)
102 else:
102 else:
103 commits = [self.rhodecode_vcs_repo.get_commit(
103 commits = [self.rhodecode_vcs_repo.get_commit(
104 commit_id=commit_id_range, pre_load=pre_load)]
104 commit_id=commit_id_range, pre_load=pre_load)]
105
105
106 c.commit_ranges = commits
106 c.commit_ranges = commits
107 if not c.commit_ranges:
107 if not c.commit_ranges:
108 raise RepositoryError('The commit range returned an empty result')
108 raise RepositoryError('The commit range returned an empty result')
109 except CommitDoesNotExistError as e:
109 except CommitDoesNotExistError as e:
110 msg = _('No such commit exists. Org exception: `{}`').format(safe_str(e))
110 msg = _('No such commit exists. Org exception: `{}`').format(safe_str(e))
111 h.flash(msg, category='error')
111 h.flash(msg, category='error')
112 raise HTTPNotFound()
112 raise HTTPNotFound()
113 except Exception:
113 except Exception:
114 log.exception("General failure")
114 log.exception("General failure")
115 raise HTTPNotFound()
115 raise HTTPNotFound()
116 single_commit = len(c.commit_ranges) == 1
116 single_commit = len(c.commit_ranges) == 1
117
117
118 if redirect_to_combined and not single_commit:
118 if redirect_to_combined and not single_commit:
119 source_ref = getattr(c.commit_ranges[0].parents[0]
119 source_ref = getattr(c.commit_ranges[0].parents[0]
120 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
120 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
121 target_ref = c.commit_ranges[-1].raw_id
121 target_ref = c.commit_ranges[-1].raw_id
122 next_url = h.route_path(
122 next_url = h.route_path(
123 'repo_compare',
123 'repo_compare',
124 repo_name=c.repo_name,
124 repo_name=c.repo_name,
125 source_ref_type='rev',
125 source_ref_type='rev',
126 source_ref=source_ref,
126 source_ref=source_ref,
127 target_ref_type='rev',
127 target_ref_type='rev',
128 target_ref=target_ref)
128 target_ref=target_ref)
129 raise HTTPFound(next_url)
129 raise HTTPFound(next_url)
130
130
131 c.changes = OrderedDict()
131 c.changes = OrderedDict()
132 c.lines_added = 0
132 c.lines_added = 0
133 c.lines_deleted = 0
133 c.lines_deleted = 0
134
134
135 # auto collapse if we have more than limit
135 # auto collapse if we have more than limit
136 collapse_limit = diffs.DiffProcessor._collapse_commits_over
136 collapse_limit = diffs.DiffProcessor._collapse_commits_over
137 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
137 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
138
138
139 c.commit_statuses = ChangesetStatus.STATUSES
139 c.commit_statuses = ChangesetStatus.STATUSES
140 c.inline_comments = []
140 c.inline_comments = []
141 c.files = []
141 c.files = []
142
142
143 c.comments = []
143 c.comments = []
144 c.unresolved_comments = []
144 c.unresolved_comments = []
145 c.resolved_comments = []
145 c.resolved_comments = []
146
146
147 # Single commit
147 # Single commit
148 if single_commit:
148 if single_commit:
149 commit = c.commit_ranges[0]
149 commit = c.commit_ranges[0]
150 c.comments = CommentsModel().get_comments(
150 c.comments = CommentsModel().get_comments(
151 self.db_repo.repo_id,
151 self.db_repo.repo_id,
152 revision=commit.raw_id)
152 revision=commit.raw_id)
153
153
154 # comments from PR
154 # comments from PR
155 statuses = ChangesetStatusModel().get_statuses(
155 statuses = ChangesetStatusModel().get_statuses(
156 self.db_repo.repo_id, commit.raw_id,
156 self.db_repo.repo_id, commit.raw_id,
157 with_revisions=True)
157 with_revisions=True)
158
158
159 prs = set()
159 prs = set()
160 reviewers = list()
160 reviewers = list()
161 reviewers_duplicates = set() # to not have duplicates from multiple votes
161 reviewers_duplicates = set() # to not have duplicates from multiple votes
162 for c_status in statuses:
162 for c_status in statuses:
163
163
164 # extract associated pull-requests from votes
164 # extract associated pull-requests from votes
165 if c_status.pull_request:
165 if c_status.pull_request:
166 prs.add(c_status.pull_request)
166 prs.add(c_status.pull_request)
167
167
168 # extract reviewers
168 # extract reviewers
169 _user_id = c_status.author.user_id
169 _user_id = c_status.author.user_id
170 if _user_id not in reviewers_duplicates:
170 if _user_id not in reviewers_duplicates:
171 reviewers.append(
171 reviewers.append(
172 StrictAttributeDict({
172 StrictAttributeDict({
173 'user': c_status.author,
173 'user': c_status.author,
174
174
175 # fake attributes for the commit page that we don't have
175 # fake attributes for the commit page that we don't have
176 # but we share the display with PR page
176 # but we share the display with PR page
177 'mandatory': False,
177 'mandatory': False,
178 'reasons': [],
178 'reasons': [],
179 'rule_user_group_data': lambda: None
179 'rule_user_group_data': lambda: None
180 })
180 })
181 )
181 )
182 reviewers_duplicates.add(_user_id)
182 reviewers_duplicates.add(_user_id)
183
183
184 c.reviewers_count = len(reviewers)
184 c.reviewers_count = len(reviewers)
185 c.observers_count = 0
185 c.observers_count = 0
186
186
187 # from associated statuses, check the pull requests, and
187 # from associated statuses, check the pull requests, and
188 # show comments from them
188 # show comments from them
189 for pr in prs:
189 for pr in prs:
190 c.comments.extend(pr.comments)
190 c.comments.extend(pr.comments)
191
191
192 c.unresolved_comments = CommentsModel()\
192 c.unresolved_comments = CommentsModel()\
193 .get_commit_unresolved_todos(commit.raw_id)
193 .get_commit_unresolved_todos(commit.raw_id)
194 c.resolved_comments = CommentsModel()\
194 c.resolved_comments = CommentsModel()\
195 .get_commit_resolved_todos(commit.raw_id)
195 .get_commit_resolved_todos(commit.raw_id)
196
196
197 c.inline_comments_flat = CommentsModel()\
197 c.inline_comments_flat = CommentsModel()\
198 .get_commit_inline_comments(commit.raw_id)
198 .get_commit_inline_comments(commit.raw_id)
199
199
200 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
200 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
201 statuses, reviewers)
201 statuses, reviewers)
202
202
203 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
203 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
204
204
205 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
205 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
206
206
207 for review_obj, member, reasons, mandatory, status in review_statuses:
207 for review_obj, member, reasons, mandatory, status in review_statuses:
208 member_reviewer = h.reviewer_as_json(
208 member_reviewer = h.reviewer_as_json(
209 member, reasons=reasons, mandatory=mandatory, role=None,
209 member, reasons=reasons, mandatory=mandatory, role=None,
210 user_group=None
210 user_group=None
211 )
211 )
212
212
213 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
213 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
214 member_reviewer['review_status'] = current_review_status
214 member_reviewer['review_status'] = current_review_status
215 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
215 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
216 member_reviewer['allowed_to_update'] = False
216 member_reviewer['allowed_to_update'] = False
217 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
217 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
218
218
219 c.commit_set_reviewers_data_json = ext_json.str_json(c.commit_set_reviewers_data_json)
219 c.commit_set_reviewers_data_json = ext_json.str_json(c.commit_set_reviewers_data_json)
220
220
221 # NOTE(marcink): this uses the same voting logic as in pull-requests
221 # NOTE(marcink): this uses the same voting logic as in pull-requests
222 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
222 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
223 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
223 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
224
224
225 diff = None
225 diff = None
226 # Iterate over ranges (default commit view is always one commit)
226 # Iterate over ranges (default commit view is always one commit)
227 for commit in c.commit_ranges:
227 for commit in c.commit_ranges:
228 c.changes[commit.raw_id] = []
228 c.changes[commit.raw_id] = []
229
229
230 commit2 = commit
230 commit2 = commit
231 commit1 = commit.first_parent
231 commit1 = commit.first_parent
232
232
233 if method == 'show':
233 if method == 'show':
234 inline_comments = CommentsModel().get_inline_comments(
234 inline_comments = CommentsModel().get_inline_comments(
235 self.db_repo.repo_id, revision=commit.raw_id)
235 self.db_repo.repo_id, revision=commit.raw_id)
236 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
236 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
237 inline_comments))
237 inline_comments))
238 c.inline_comments = inline_comments
238 c.inline_comments = inline_comments
239
239
240 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
240 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
241 self.db_repo)
241 self.db_repo)
242 cache_file_path = diff_cache_exist(
242 cache_file_path = diff_cache_exist(
243 cache_path, 'diff', commit.raw_id,
243 cache_path, 'diff', commit.raw_id,
244 hide_whitespace_changes, diff_context, c.fulldiff)
244 hide_whitespace_changes, diff_context, c.fulldiff)
245
245
246 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
246 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
247 force_recache = str2bool(self.request.GET.get('force_recache'))
247 force_recache = str2bool(self.request.GET.get('force_recache'))
248
248
249 cached_diff = None
249 cached_diff = None
250 if caching_enabled:
250 if caching_enabled:
251 cached_diff = load_cached_diff(cache_file_path)
251 cached_diff = load_cached_diff(cache_file_path)
252
252
253 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
253 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
254 if not force_recache and has_proper_diff_cache:
254 if not force_recache and has_proper_diff_cache:
255 diffset = cached_diff['diff']
255 diffset = cached_diff['diff']
256 else:
256 else:
257 vcs_diff = self.rhodecode_vcs_repo.get_diff(
257 vcs_diff = self.rhodecode_vcs_repo.get_diff(
258 commit1, commit2,
258 commit1, commit2,
259 ignore_whitespace=hide_whitespace_changes,
259 ignore_whitespace=hide_whitespace_changes,
260 context=diff_context)
260 context=diff_context)
261
261
262 diff_processor = diffs.DiffProcessor(vcs_diff, diff_format='newdiff',
262 diff_processor = diffs.DiffProcessor(vcs_diff, diff_format='newdiff',
263 diff_limit=diff_limit,
263 diff_limit=diff_limit,
264 file_limit=file_limit,
264 file_limit=file_limit,
265 show_full_diff=c.fulldiff)
265 show_full_diff=c.fulldiff)
266
266
267 _parsed = diff_processor.prepare()
267 _parsed = diff_processor.prepare()
268
268
269 diffset = codeblocks.DiffSet(
269 diffset = codeblocks.DiffSet(
270 repo_name=self.db_repo_name,
270 repo_name=self.db_repo_name,
271 source_node_getter=codeblocks.diffset_node_getter(commit1),
271 source_node_getter=codeblocks.diffset_node_getter(commit1),
272 target_node_getter=codeblocks.diffset_node_getter(commit2))
272 target_node_getter=codeblocks.diffset_node_getter(commit2))
273
273
274 diffset = self.path_filter.render_patchset_filtered(
274 diffset = self.path_filter.render_patchset_filtered(
275 diffset, _parsed, commit1.raw_id, commit2.raw_id)
275 diffset, _parsed, commit1.raw_id, commit2.raw_id)
276
276
277 # save cached diff
277 # save cached diff
278 if caching_enabled:
278 if caching_enabled:
279 cache_diff(cache_file_path, diffset, None)
279 cache_diff(cache_file_path, diffset, None)
280
280
281 c.limited_diff = diffset.limited_diff
281 c.limited_diff = diffset.limited_diff
282 c.changes[commit.raw_id] = diffset
282 c.changes[commit.raw_id] = diffset
283 else:
283 else:
284 # TODO(marcink): no cache usage here...
284 # TODO(marcink): no cache usage here...
285 _diff = self.rhodecode_vcs_repo.get_diff(
285 _diff = self.rhodecode_vcs_repo.get_diff(
286 commit1, commit2,
286 commit1, commit2,
287 ignore_whitespace=hide_whitespace_changes, context=diff_context)
287 ignore_whitespace=hide_whitespace_changes, context=diff_context)
288 diff_processor = diffs.DiffProcessor(_diff, diff_format='newdiff',
288 diff_processor = diffs.DiffProcessor(_diff, diff_format='newdiff',
289 diff_limit=diff_limit,
289 diff_limit=diff_limit,
290 file_limit=file_limit, show_full_diff=c.fulldiff)
290 file_limit=file_limit, show_full_diff=c.fulldiff)
291 # for downloads/raw we only need the RAW diff, nothing else
291 # for downloads/raw we only need the RAW diff, nothing else
292 diff = self.path_filter.get_raw_patch(diff_processor)
292 diff = self.path_filter.get_raw_patch(diff_processor)
293 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
293 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
294
294
295 # sort comments by how they were generated
295 # sort comments by how they were generated
296 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
296 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
297 c.at_version_num = None
297 c.at_version_num = None
298
298
299 if len(c.commit_ranges) == 1:
299 if len(c.commit_ranges) == 1:
300 c.commit = c.commit_ranges[0]
300 c.commit = c.commit_ranges[0]
301 c.parent_tmpl = ''.join(
301 c.parent_tmpl = ''.join(
302 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
302 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
303
303
304 if method == 'download':
304 if method == 'download':
305 response = Response(diff)
305 response = Response(diff)
306 response.content_type = 'text/plain'
306 response.content_type = 'text/plain'
307 response.content_disposition = (
307 response.content_disposition = (
308 'attachment; filename=%s.diff' % commit_id_range[:12])
308 'attachment; filename=%s.diff' % commit_id_range[:12])
309 return response
309 return response
310 elif method == 'patch':
310 elif method == 'patch':
311
311
312 c.diff = safe_str(diff)
312 c.diff = safe_str(diff)
313 patch = render(
313 patch = render(
314 'rhodecode:templates/changeset/patch_changeset.mako',
314 'rhodecode:templates/changeset/patch_changeset.mako',
315 self._get_template_context(c), self.request)
315 self._get_template_context(c), self.request)
316 response = Response(patch)
316 response = Response(patch)
317 response.content_type = 'text/plain'
317 response.content_type = 'text/plain'
318 return response
318 return response
319 elif method == 'raw':
319 elif method == 'raw':
320 response = Response(diff)
320 response = Response(diff)
321 response.content_type = 'text/plain'
321 response.content_type = 'text/plain'
322 return response
322 return response
323 elif method == 'show':
323 elif method == 'show':
324 if len(c.commit_ranges) == 1:
324 if len(c.commit_ranges) == 1:
325 html = render(
325 html = render(
326 'rhodecode:templates/changeset/changeset.mako',
326 'rhodecode:templates/changeset/changeset.mako',
327 self._get_template_context(c), self.request)
327 self._get_template_context(c), self.request)
328 return Response(html)
328 return Response(html)
329 else:
329 else:
330 c.ancestor = None
330 c.ancestor = None
331 c.target_repo = self.db_repo
331 c.target_repo = self.db_repo
332 html = render(
332 html = render(
333 'rhodecode:templates/changeset/changeset_range.mako',
333 'rhodecode:templates/changeset/changeset_range.mako',
334 self._get_template_context(c), self.request)
334 self._get_template_context(c), self.request)
335 return Response(html)
335 return Response(html)
336
336
337 raise HTTPBadRequest()
337 raise HTTPBadRequest()
338
338
339 @LoginRequired()
339 @LoginRequired()
340 @HasRepoPermissionAnyDecorator(
340 @HasRepoPermissionAnyDecorator(
341 'repository.read', 'repository.write', 'repository.admin')
341 'repository.read', 'repository.write', 'repository.admin')
342 def repo_commit_show(self):
342 def repo_commit_show(self):
343 commit_id = self.request.matchdict['commit_id']
343 commit_id = self.request.matchdict['commit_id']
344 return self._commit(commit_id, method='show')
344 return self._commit(commit_id, method='show')
345
345
346 @LoginRequired()
346 @LoginRequired()
347 @HasRepoPermissionAnyDecorator(
347 @HasRepoPermissionAnyDecorator(
348 'repository.read', 'repository.write', 'repository.admin')
348 'repository.read', 'repository.write', 'repository.admin')
349 def repo_commit_raw(self):
349 def repo_commit_raw(self):
350 commit_id = self.request.matchdict['commit_id']
350 commit_id = self.request.matchdict['commit_id']
351 return self._commit(commit_id, method='raw')
351 return self._commit(commit_id, method='raw')
352
352
353 @LoginRequired()
353 @LoginRequired()
354 @HasRepoPermissionAnyDecorator(
354 @HasRepoPermissionAnyDecorator(
355 'repository.read', 'repository.write', 'repository.admin')
355 'repository.read', 'repository.write', 'repository.admin')
356 def repo_commit_patch(self):
356 def repo_commit_patch(self):
357 commit_id = self.request.matchdict['commit_id']
357 commit_id = self.request.matchdict['commit_id']
358 return self._commit(commit_id, method='patch')
358 return self._commit(commit_id, method='patch')
359
359
360 @LoginRequired()
360 @LoginRequired()
361 @HasRepoPermissionAnyDecorator(
361 @HasRepoPermissionAnyDecorator(
362 'repository.read', 'repository.write', 'repository.admin')
362 'repository.read', 'repository.write', 'repository.admin')
363 def repo_commit_download(self):
363 def repo_commit_download(self):
364 commit_id = self.request.matchdict['commit_id']
364 commit_id = self.request.matchdict['commit_id']
365 return self._commit(commit_id, method='download')
365 return self._commit(commit_id, method='download')
366
366
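
The four decorated views above are thin wrappers that only differ in the ``method`` flag they pass to ``_commit``; that flag selects the response built at the end of ``_commit``. A small illustration of the mapping, for orientation only::

    # how the `method` argument of _commit() maps to the response it produces
    METHOD_TO_RESPONSE = {
        'show':     'rendered changeset.mako / changeset_range.mako HTML',
        'raw':      'text/plain raw diff',
        'patch':    'text/plain patch rendered via patch_changeset.mako',
        'download': 'text/plain diff served as attachment <commit_id>.diff',
    }
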
367 def _commit_comments_create(self, commit_id, comments):
367 def _commit_comments_create(self, commit_id, comments):
368 _ = self.request.translate
368 _ = self.request.translate
369 data = {}
369 data = {}
370 if not comments:
370 if not comments:
371 return
371 return
372
372
373 commit = self.db_repo.get_commit(commit_id)
373 commit = self.db_repo.get_commit(commit_id)
374
374
375 all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
375 all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
376 for entry in comments:
376 for entry in comments:
377 c = self.load_default_context()
377 c = self.load_default_context()
378 comment_type = entry['comment_type']
378 comment_type = entry['comment_type']
379 text = entry['text']
379 text = entry['text']
380 status = entry['status']
380 status = entry['status']
381 is_draft = str2bool(entry['is_draft'])
381 is_draft = str2bool(entry['is_draft'])
382 resolves_comment_id = entry['resolves_comment_id']
382 resolves_comment_id = entry['resolves_comment_id']
383 f_path = entry['f_path']
383 f_path = entry['f_path']
384 line_no = entry['line']
384 line_no = entry['line']
385 target_elem_id = f'file-{h.safeid(h.safe_str(f_path))}'
385 target_elem_id = f'file-{h.safeid(h.safe_str(f_path))}'
386
386
387 if status:
387 if status:
388 text = text or (_('Status change %(transition_icon)s %(status)s')
388 text = text or (_('Status change %(transition_icon)s %(status)s')
389 % {'transition_icon': '>',
389 % {'transition_icon': '>',
390 'status': ChangesetStatus.get_status_lbl(status)})
390 'status': ChangesetStatus.get_status_lbl(status)})
391
391
392 comment = CommentsModel().create(
392 comment = CommentsModel().create(
393 text=text,
393 text=text,
394 repo=self.db_repo.repo_id,
394 repo=self.db_repo.repo_id,
395 user=self._rhodecode_db_user.user_id,
395 user=self._rhodecode_db_user.user_id,
396 commit_id=commit_id,
396 commit_id=commit_id,
397 f_path=f_path,
397 f_path=f_path,
398 line_no=line_no,
398 line_no=line_no,
399 status_change=(ChangesetStatus.get_status_lbl(status)
399 status_change=(ChangesetStatus.get_status_lbl(status)
400 if status else None),
400 if status else None),
401 status_change_type=status,
401 status_change_type=status,
402 comment_type=comment_type,
402 comment_type=comment_type,
403 is_draft=is_draft,
403 is_draft=is_draft,
404 resolves_comment_id=resolves_comment_id,
404 resolves_comment_id=resolves_comment_id,
405 auth_user=self._rhodecode_user,
405 auth_user=self._rhodecode_user,
406 send_email=not is_draft, # skip notification for draft comments
406 send_email=not is_draft, # skip notification for draft comments
407 )
407 )
408 is_inline = comment.is_inline
408 is_inline = comment.is_inline
409
409
410 # get status if set !
410 # get status if set !
411 if status:
411 if status:
412 # `dont_allow_on_closed_pull_request = True` means
412 # `dont_allow_on_closed_pull_request = True` means
413 # if latest status was from pull request and it's closed
413 # if latest status was from pull request and it's closed
414 # disallow changing status !
414 # disallow changing status !
415
415
416 try:
416 try:
417 ChangesetStatusModel().set_status(
417 ChangesetStatusModel().set_status(
418 self.db_repo.repo_id,
418 self.db_repo.repo_id,
419 status,
419 status,
420 self._rhodecode_db_user.user_id,
420 self._rhodecode_db_user.user_id,
421 comment,
421 comment,
422 revision=commit_id,
422 revision=commit_id,
423 dont_allow_on_closed_pull_request=True
423 dont_allow_on_closed_pull_request=True
424 )
424 )
425 except StatusChangeOnClosedPullRequestError:
425 except StatusChangeOnClosedPullRequestError:
426 msg = _('Changing the status of a commit associated with '
426 msg = _('Changing the status of a commit associated with '
427 'a closed pull request is not allowed')
427 'a closed pull request is not allowed')
428 log.exception(msg)
428 log.exception(msg)
429 h.flash(msg, category='warning')
429 h.flash(msg, category='warning')
430 raise HTTPFound(h.route_path(
430 raise HTTPFound(h.route_path(
431 'repo_commit', repo_name=self.db_repo_name,
431 'repo_commit', repo_name=self.db_repo_name,
432 commit_id=commit_id))
432 commit_id=commit_id))
433
433
434 Session().flush()
434 Session().flush()
435 # this is somehow required to get access to some relationship
435 # this is somehow required to get access to some relationship
436 # loaded on comment
436 # loaded on comment
437 Session().refresh(comment)
437 Session().refresh(comment)
438
438
439 # skip notifications for drafts
439 # skip notifications for drafts
440 if not is_draft:
440 if not is_draft:
441 CommentsModel().trigger_commit_comment_hook(
441 CommentsModel().trigger_commit_comment_hook(
442 self.db_repo, self._rhodecode_user, 'create',
442 self.db_repo, self._rhodecode_user, 'create',
443 data={'comment': comment, 'commit': commit})
443 data={'comment': comment, 'commit': commit})
444
444
445 comment_id = comment.comment_id
445 comment_id = comment.comment_id
446 data[comment_id] = {
446 data[comment_id] = {
447 'target_id': target_elem_id
447 'target_id': target_elem_id
448 }
448 }
449 Session().flush()
449 Session().flush()
450
450
451 c.co = comment
451 c.co = comment
452 c.at_version_num = 0
452 c.at_version_num = 0
453 c.is_new = True
453 c.is_new = True
454 rendered_comment = render(
454 rendered_comment = render(
455 'rhodecode:templates/changeset/changeset_comment_block.mako',
455 'rhodecode:templates/changeset/changeset_comment_block.mako',
456 self._get_template_context(c), self.request)
456 self._get_template_context(c), self.request)
457
457
458 data[comment_id].update(comment.get_dict())
458 data[comment_id].update(comment.get_dict())
459 data[comment_id].update({'rendered_text': rendered_comment})
459 data[comment_id].update({'rendered_text': rendered_comment})
460
460
461 # finalize, commit and redirect
461 # finalize, commit and redirect
462 Session().commit()
462 Session().commit()
463
463
464 # skip channelstream for draft comments
464 # skip channelstream for draft comments
465 if not all_drafts:
465 if not all_drafts:
466 comment_broadcast_channel = channelstream.comment_channel(
466 comment_broadcast_channel = channelstream.comment_channel(
467 self.db_repo_name, commit_obj=commit)
467 self.db_repo_name, commit_obj=commit)
468
468
469 comment_data = data
469 comment_data = data
470 posted_comment_type = 'inline' if is_inline else 'general'
470 posted_comment_type = 'inline' if is_inline else 'general'
471 if len(data) == 1:
471 if len(data) == 1:
472 msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
472 msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
473 else:
473 else:
474 msg = _('posted {} new {} comments').format(len(data), posted_comment_type)
474 msg = _('posted {} new {} comments').format(len(data), posted_comment_type)
475
475
476 channelstream.comment_channelstream_push(
476 channelstream.comment_channelstream_push(
477 self.request, comment_broadcast_channel, self._rhodecode_user, msg,
477 self.request, comment_broadcast_channel, self._rhodecode_user, msg,
478 comment_data=comment_data)
478 comment_data=comment_data)
479
479
480 return data
480 return data
481
481
482 @LoginRequired()
482 @LoginRequired()
483 @NotAnonymous()
483 @NotAnonymous()
484 @HasRepoPermissionAnyDecorator(
484 @HasRepoPermissionAnyDecorator(
485 'repository.read', 'repository.write', 'repository.admin')
485 'repository.read', 'repository.write', 'repository.admin')
486 @CSRFRequired()
486 @CSRFRequired()
487 def repo_commit_comment_create(self):
487 def repo_commit_comment_create(self):
488 _ = self.request.translate
488 _ = self.request.translate
489 commit_id = self.request.matchdict['commit_id']
489 commit_id = self.request.matchdict['commit_id']
490
490
491 multi_commit_ids = []
491 multi_commit_ids = []
492 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
492 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
493 if _commit_id not in ['', None, EmptyCommit.raw_id]:
493 if _commit_id not in ['', None, EmptyCommit.raw_id]:
494 if _commit_id not in multi_commit_ids:
494 if _commit_id not in multi_commit_ids:
495 multi_commit_ids.append(_commit_id)
495 multi_commit_ids.append(_commit_id)
496
496
497 commit_ids = multi_commit_ids or [commit_id]
497 commit_ids = multi_commit_ids or [commit_id]
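# commit_ids now holds the de-duplicated ids from the POST
# (e.g. 'abc,def,abc' -> ['abc', 'def']), falling back to the single
# commit id taken from the URL when no extra ids were posted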
498
498
499 data = []
499 data = []
500 # Multiple comments for each passed commit id
500 # Multiple comments for each passed commit id
501 for current_id in filter(None, commit_ids):
501 for current_id in filter(None, commit_ids):
502 comment_data = {
502 comment_data = {
503 'comment_type': self.request.POST.get('comment_type'),
503 'comment_type': self.request.POST.get('comment_type'),
504 'text': self.request.POST.get('text'),
504 'text': self.request.POST.get('text'),
505 'status': self.request.POST.get('changeset_status', None),
505 'status': self.request.POST.get('changeset_status', None),
506 'is_draft': self.request.POST.get('draft'),
506 'is_draft': self.request.POST.get('draft'),
507 'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
507 'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
508 'close_pull_request': self.request.POST.get('close_pull_request'),
508 'close_pull_request': self.request.POST.get('close_pull_request'),
509 'f_path': self.request.POST.get('f_path'),
509 'f_path': self.request.POST.get('f_path'),
510 'line': self.request.POST.get('line'),
510 'line': self.request.POST.get('line'),
511 }
511 }
512 comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
512 comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
513 data.append(comment)
513 data.append(comment)
514
514
515 return data if len(data) > 1 else data[0]
515 return data if len(data) > 1 else data[0]
516
516
517 @LoginRequired()
517 @LoginRequired()
518 @NotAnonymous()
518 @NotAnonymous()
519 @HasRepoPermissionAnyDecorator(
519 @HasRepoPermissionAnyDecorator(
520 'repository.read', 'repository.write', 'repository.admin')
520 'repository.read', 'repository.write', 'repository.admin')
521 @CSRFRequired()
521 @CSRFRequired()
522 def repo_commit_comment_preview(self):
522 def repo_commit_comment_preview(self):
523 # Technically a CSRF token is not needed as no state changes with this
524 # call. However, as this is a POST, it is better to have it so that automated
525 # tools don't flag it as potential CSRF.
526 # POST is required because the payload could be bigger than the maximum
527 # allowed by GET.
528
528
529 text = self.request.POST.get('text')
529 text = self.request.POST.get('text')
530 renderer = self.request.POST.get('renderer') or 'rst'
530 renderer = self.request.POST.get('renderer') or 'rst'
531 if text:
531 if text:
532 return h.render(text, renderer=renderer, mentions=True,
532 return h.render(text, renderer=renderer, mentions=True,
533 repo_name=self.db_repo_name)
533 repo_name=self.db_repo_name)
534 return ''
534 return ''
535
535
536 @LoginRequired()
536 @LoginRequired()
537 @HasRepoPermissionAnyDecorator(
537 @HasRepoPermissionAnyDecorator(
538 'repository.read', 'repository.write', 'repository.admin')
538 'repository.read', 'repository.write', 'repository.admin')
539 @CSRFRequired()
539 @CSRFRequired()
540 def repo_commit_comment_history_view(self):
540 def repo_commit_comment_history_view(self):
541 c = self.load_default_context()
541 c = self.load_default_context()
542 comment_id = self.request.matchdict['comment_id']
542 comment_id = self.request.matchdict['comment_id']
543 comment_history_id = self.request.matchdict['comment_history_id']
543 comment_history_id = self.request.matchdict['comment_history_id']
544
544
545 comment = ChangesetComment.get_or_404(comment_id)
545 comment = ChangesetComment.get_or_404(comment_id)
546 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
546 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
547 if comment.draft and not comment_owner:
547 if comment.draft and not comment_owner:
548 # if we see draft comments history, we only allow this for owner
548 # if we see draft comments history, we only allow this for owner
549 raise HTTPNotFound()
549 raise HTTPNotFound()
550
550
551 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
551 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
552 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
552 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
553
553
554 if is_repo_comment:
554 if is_repo_comment:
555 c.comment_history = comment_history
555 c.comment_history = comment_history
556
556
557 rendered_comment = render(
557 rendered_comment = render(
558 'rhodecode:templates/changeset/comment_history.mako',
558 'rhodecode:templates/changeset/comment_history.mako',
559 self._get_template_context(c), self.request)
559 self._get_template_context(c), self.request)
560 return rendered_comment
560 return rendered_comment
561 else:
561 else:
562 log.warning('No permissions for user %s to show comment_history_id: %s',
562 log.warning('No permissions for user %s to show comment_history_id: %s',
563 self._rhodecode_db_user, comment_history_id)
563 self._rhodecode_db_user, comment_history_id)
564 raise HTTPNotFound()
564 raise HTTPNotFound()
565
565
566 @LoginRequired()
566 @LoginRequired()
567 @NotAnonymous()
567 @NotAnonymous()
568 @HasRepoPermissionAnyDecorator(
568 @HasRepoPermissionAnyDecorator(
569 'repository.read', 'repository.write', 'repository.admin')
569 'repository.read', 'repository.write', 'repository.admin')
570 @CSRFRequired()
570 @CSRFRequired()
571 def repo_commit_comment_attachment_upload(self):
571 def repo_commit_comment_attachment_upload(self):
572 c = self.load_default_context()
572 c = self.load_default_context()
573 upload_key = 'attachment'
573 upload_key = 'attachment'
574
574
575 file_obj = self.request.POST.get(upload_key)
575 file_obj = self.request.POST.get(upload_key)
576
576
577 if file_obj is None:
577 if file_obj is None:
578 self.request.response.status = 400
578 self.request.response.status = 400
579 return {'store_fid': None,
579 return {'store_fid': None,
580 'access_path': None,
580 'access_path': None,
581 'error': f'{upload_key} data field is missing'}
581 'error': f'{upload_key} data field is missing'}
582
582
583 if not hasattr(file_obj, 'filename'):
583 if not hasattr(file_obj, 'filename'):
584 self.request.response.status = 400
584 self.request.response.status = 400
585 return {'store_fid': None,
585 return {'store_fid': None,
586 'access_path': None,
586 'access_path': None,
587 'error': 'filename cannot be read from the data field'}
587 'error': 'filename cannot be read from the data field'}
588
588
589 filename = file_obj.filename
589 filename = file_obj.filename
590 file_display_name = filename
590 file_display_name = filename
591
591
592 metadata = {
592 metadata = {
593 'user_uploaded': {'username': self._rhodecode_user.username,
593 'user_uploaded': {'username': self._rhodecode_user.username,
594 'user_id': self._rhodecode_user.user_id,
594 'user_id': self._rhodecode_user.user_id,
595 'ip': self._rhodecode_user.ip_addr}}
595 'ip': self._rhodecode_user.ip_addr}}
596
596
597 # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
597 # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
598 allowed_extensions = [
598 allowed_extensions = [
599 'gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
599 'gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
600 '.pptx', '.txt', '.xlsx', '.zip']
600 '.pptx', '.txt', '.xlsx', '.zip']
601 max_file_size = 10 * 1024 * 1024 # 10MB, also validated via dropzone.js
601 max_file_size = 10 * 1024 * 1024 # 10MB, also validated via dropzone.js
602
602
603 try:
603 try:
604 storage = store_utils.get_file_storage(self.request.registry.settings)
605 store_uid, metadata = storage.save_file(
606 file_obj.file, filename, extra_metadata=metadata,
607 extensions=allowed_extensions, max_filesize=max_file_size)
604 f_store = store_utils.get_filestore_backend(self.request.registry.settings)
605 store_uid, metadata = f_store.store(
606 filename, file_obj.file, metadata=metadata,
607 extensions=allowed_extensions, max_filesize=max_file_size)
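# the storage backend returns a unique store_uid plus a metadata dict
# (filename, sha256, size) which is used below to create the FileStore DB entry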
608 except FileNotAllowedException:
608 except FileNotAllowedException:
609 self.request.response.status = 400
609 self.request.response.status = 400
610 permitted_extensions = ', '.join(allowed_extensions)
610 permitted_extensions = ', '.join(allowed_extensions)
611 error_msg = 'File `{}` is not allowed. ' \
612 'Only following extensions are permitted: {}'.format(
613 filename, permitted_extensions)
611 error_msg = f'File `{filename}` is not allowed. ' \
612 f'Only following extensions are permitted: {permitted_extensions}'
613 
614 return {'store_fid': None,
614 return {'store_fid': None,
615 'access_path': None,
615 'access_path': None,
616 'error': error_msg}
616 'error': error_msg}
617 except FileOverSizeException:
617 except FileOverSizeException:
618 self.request.response.status = 400
618 self.request.response.status = 400
619 limit_mb = h.format_byte_size_binary(max_file_size)
619 limit_mb = h.format_byte_size_binary(max_file_size)
620 return {'store_fid': None,
621 'access_path': None,
622 'error': 'File {} is exceeding allowed limit of {}.'.format(
623 filename, limit_mb)}
620 error_msg = f'File {filename} is exceeding allowed limit of {limit_mb}.'
621 return {'store_fid': None,
622 'access_path': None,
623 'error': error_msg}
624
624
625 try:
625 try:
626 entry = FileStore.create(
626 entry = FileStore.create(
627 file_uid=store_uid, filename=metadata["filename"],
627 file_uid=store_uid, filename=metadata["filename"],
628 file_hash=metadata["sha256"], file_size=metadata["size"],
628 file_hash=metadata["sha256"], file_size=metadata["size"],
629 file_display_name=file_display_name,
629 file_display_name=file_display_name,
630 file_description=f'comment attachment `{safe_str(filename)}`',
630 file_description=f'comment attachment `{safe_str(filename)}`',
631 hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
631 hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
632 scope_repo_id=self.db_repo.repo_id
632 scope_repo_id=self.db_repo.repo_id
633 )
633 )
634 Session().add(entry)
634 Session().add(entry)
635 Session().commit()
635 Session().commit()
636 log.debug('Stored upload in DB as %s', entry)
636 log.debug('Stored upload in DB as %s', entry)
637 except Exception:
637 except Exception:
638 log.exception('Failed to store file %s', filename)
638 log.exception('Failed to store file %s', filename)
639 self.request.response.status = 400
639 self.request.response.status = 400
640 return {'store_fid': None,
640 return {'store_fid': None,
641 'access_path': None,
641 'access_path': None,
642 'error': f'File {filename} failed to store in DB.'}
642 'error': f'File {filename} failed to store in DB.'}
643
643
644 Session().commit()
644 Session().commit()
645
645
646 data = {
646 data = {
647 'store_fid': store_uid,
647 'store_fid': store_uid,
648 'access_path': h.route_path(
648 'access_path': h.route_path(
649 'download_file', fid=store_uid),
649 'download_file', fid=store_uid),
650 'fqn_access_path': h.route_url(
650 'fqn_access_path': h.route_url(
651 'download_file', fid=store_uid),
651 'download_file', fid=store_uid),
652 # for EE edition, the repo-scoped entries below are replaced with artifact FQN links
653 'repo_access_path': h.route_url(
653 'repo_access_path': h.route_url(
654 'download_file', fid=store_uid),
654 'download_file', fid=store_uid),
655 'repo_fqn_access_path': h.route_url(
655 'repo_fqn_access_path': h.route_url(
656 'download_file', fid=store_uid),
656 'download_file', fid=store_uid),
657 }
657 }
658 # this data is a part of CE/EE additional code
658 # this data is a part of CE/EE additional code
659 if c.rhodecode_edition_id == 'EE':
659 if c.rhodecode_edition_id == 'EE':
660 data.update({
660 data.update({
661 'repo_access_path': h.route_path(
661 'repo_access_path': h.route_path(
662 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
662 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
663 'repo_fqn_access_path': h.route_url(
663 'repo_fqn_access_path': h.route_url(
664 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
664 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
665 })
665 })
666
666
667 return data
667 return data
668
668
669 @LoginRequired()
669 @LoginRequired()
670 @NotAnonymous()
670 @NotAnonymous()
671 @HasRepoPermissionAnyDecorator(
671 @HasRepoPermissionAnyDecorator(
672 'repository.read', 'repository.write', 'repository.admin')
672 'repository.read', 'repository.write', 'repository.admin')
673 @CSRFRequired()
673 @CSRFRequired()
674 def repo_commit_comment_delete(self):
674 def repo_commit_comment_delete(self):
675 commit_id = self.request.matchdict['commit_id']
675 commit_id = self.request.matchdict['commit_id']
676 comment_id = self.request.matchdict['comment_id']
676 comment_id = self.request.matchdict['comment_id']
677
677
678 comment = ChangesetComment.get_or_404(comment_id)
678 comment = ChangesetComment.get_or_404(comment_id)
679 if not comment:
679 if not comment:
680 log.debug('Comment with id:%s not found, skipping', comment_id)
680 log.debug('Comment with id:%s not found, skipping', comment_id)
681 # the comment was most likely already deleted in another call
682 return True
682 return True
683
683
684 if comment.immutable:
684 if comment.immutable:
685 # don't allow deleting comments that are immutable
685 # don't allow deleting comments that are immutable
686 raise HTTPForbidden()
686 raise HTTPForbidden()
687
687
688 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
688 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
689 super_admin = h.HasPermissionAny('hg.admin')()
689 super_admin = h.HasPermissionAny('hg.admin')()
690 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
690 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
691 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
691 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
692 comment_repo_admin = is_repo_admin and is_repo_comment
692 comment_repo_admin = is_repo_admin and is_repo_comment
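# deletion is permitted only for super-admins, the comment author,
# or admins of the repository the comment belongs to (checked below)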
693
693
694 if comment.draft and not comment_owner:
694 if comment.draft and not comment_owner:
695 # We never allow to delete draft comments for other than owners
695 # We never allow to delete draft comments for other than owners
696 raise HTTPNotFound()
696 raise HTTPNotFound()
697
697
698 if super_admin or comment_owner or comment_repo_admin:
698 if super_admin or comment_owner or comment_repo_admin:
699 CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
699 CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
700 Session().commit()
700 Session().commit()
701 return True
701 return True
702 else:
702 else:
703 log.warning('No permissions for user %s to delete comment_id: %s',
703 log.warning('No permissions for user %s to delete comment_id: %s',
704 self._rhodecode_db_user, comment_id)
704 self._rhodecode_db_user, comment_id)
705 raise HTTPNotFound()
705 raise HTTPNotFound()
706
706
707 @LoginRequired()
707 @LoginRequired()
708 @NotAnonymous()
708 @NotAnonymous()
709 @HasRepoPermissionAnyDecorator(
709 @HasRepoPermissionAnyDecorator(
710 'repository.read', 'repository.write', 'repository.admin')
710 'repository.read', 'repository.write', 'repository.admin')
711 @CSRFRequired()
711 @CSRFRequired()
712 def repo_commit_comment_edit(self):
712 def repo_commit_comment_edit(self):
713 self.load_default_context()
713 self.load_default_context()
714
714
715 commit_id = self.request.matchdict['commit_id']
715 commit_id = self.request.matchdict['commit_id']
716 comment_id = self.request.matchdict['comment_id']
716 comment_id = self.request.matchdict['comment_id']
717 comment = ChangesetComment.get_or_404(comment_id)
717 comment = ChangesetComment.get_or_404(comment_id)
718
718
719 if comment.immutable:
719 if comment.immutable:
720 # don't allow editing comments that are immutable
721 raise HTTPForbidden()
721 raise HTTPForbidden()
722
722
723 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
723 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
724 super_admin = h.HasPermissionAny('hg.admin')()
724 super_admin = h.HasPermissionAny('hg.admin')()
725 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
725 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
726 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
726 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
727 comment_repo_admin = is_repo_admin and is_repo_comment
727 comment_repo_admin = is_repo_admin and is_repo_comment
728
728
729 if super_admin or comment_owner or comment_repo_admin:
729 if super_admin or comment_owner or comment_repo_admin:
730 text = self.request.POST.get('text')
730 text = self.request.POST.get('text')
731 version = self.request.POST.get('version')
731 version = self.request.POST.get('version')
732 if text == comment.text:
732 if text == comment.text:
733 log.warning(
733 log.warning(
734 'Comment(repo): '
734 'Comment(repo): '
735 'Trying to create new version '
735 'Trying to create new version '
736 'with the same comment body {}'.format(
736 'with the same comment body {}'.format(
737 comment_id,
737 comment_id,
738 )
738 )
739 )
739 )
740 raise HTTPNotFound()
740 raise HTTPNotFound()
741
741
742 if version.isdigit():
742 if version.isdigit():
743 version = int(version)
743 version = int(version)
744 else:
744 else:
745 log.warning(
745 log.warning(
746 'Comment(repo): Wrong version type {} {} '
746 'Comment(repo): Wrong version type {} {} '
747 'for comment {}'.format(
747 'for comment {}'.format(
748 version,
748 version,
749 type(version),
749 type(version),
750 comment_id,
750 comment_id,
751 )
751 )
752 )
752 )
753 raise HTTPNotFound()
753 raise HTTPNotFound()
754
754
755 try:
755 try:
756 comment_history = CommentsModel().edit(
756 comment_history = CommentsModel().edit(
757 comment_id=comment_id,
757 comment_id=comment_id,
758 text=text,
758 text=text,
759 auth_user=self._rhodecode_user,
759 auth_user=self._rhodecode_user,
760 version=version,
760 version=version,
761 )
761 )
762 except CommentVersionMismatch:
762 except CommentVersionMismatch:
763 raise HTTPConflict()
763 raise HTTPConflict()
764
764
765 if not comment_history:
765 if not comment_history:
766 raise HTTPNotFound()
766 raise HTTPNotFound()
767
767
768 if not comment.draft:
768 if not comment.draft:
769 commit = self.db_repo.get_commit(commit_id)
769 commit = self.db_repo.get_commit(commit_id)
770 CommentsModel().trigger_commit_comment_hook(
770 CommentsModel().trigger_commit_comment_hook(
771 self.db_repo, self._rhodecode_user, 'edit',
771 self.db_repo, self._rhodecode_user, 'edit',
772 data={'comment': comment, 'commit': commit})
772 data={'comment': comment, 'commit': commit})
773
773
774 Session().commit()
774 Session().commit()
775 return {
775 return {
776 'comment_history_id': comment_history.comment_history_id,
776 'comment_history_id': comment_history.comment_history_id,
777 'comment_id': comment.comment_id,
777 'comment_id': comment.comment_id,
778 'comment_version': comment_history.version,
778 'comment_version': comment_history.version,
779 'comment_author_username': comment_history.author.username,
779 'comment_author_username': comment_history.author.username,
780 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16, request=self.request),
780 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16, request=self.request),
781 'comment_created_on': h.age_component(comment_history.created_on,
781 'comment_created_on': h.age_component(comment_history.created_on,
782 time_is_local=True),
782 time_is_local=True),
783 }
783 }
784 else:
784 else:
785 log.warning('No permissions for user %s to edit comment_id: %s',
785 log.warning('No permissions for user %s to edit comment_id: %s',
786 self._rhodecode_db_user, comment_id)
786 self._rhodecode_db_user, comment_id)
787 raise HTTPNotFound()
787 raise HTTPNotFound()
788
788
789 @LoginRequired()
789 @LoginRequired()
790 @HasRepoPermissionAnyDecorator(
790 @HasRepoPermissionAnyDecorator(
791 'repository.read', 'repository.write', 'repository.admin')
791 'repository.read', 'repository.write', 'repository.admin')
792 def repo_commit_data(self):
792 def repo_commit_data(self):
793 commit_id = self.request.matchdict['commit_id']
793 commit_id = self.request.matchdict['commit_id']
794 self.load_default_context()
794 self.load_default_context()
795
795
796 try:
796 try:
797 return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
797 return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
798 except CommitDoesNotExistError as e:
798 except CommitDoesNotExistError as e:
799 return EmptyCommit(message=str(e))
799 return EmptyCommit(message=str(e))
800
800
801 @LoginRequired()
801 @LoginRequired()
802 @HasRepoPermissionAnyDecorator(
802 @HasRepoPermissionAnyDecorator(
803 'repository.read', 'repository.write', 'repository.admin')
803 'repository.read', 'repository.write', 'repository.admin')
804 def repo_commit_children(self):
804 def repo_commit_children(self):
805 commit_id = self.request.matchdict['commit_id']
805 commit_id = self.request.matchdict['commit_id']
806 self.load_default_context()
806 self.load_default_context()
807
807
808 try:
808 try:
809 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
809 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
810 children = commit.children
810 children = commit.children
811 except CommitDoesNotExistError:
811 except CommitDoesNotExistError:
812 children = []
812 children = []
813
813
814 result = {"results": children}
814 result = {"results": children}
815 return result
815 return result
816
816
817 @LoginRequired()
817 @LoginRequired()
818 @HasRepoPermissionAnyDecorator(
818 @HasRepoPermissionAnyDecorator(
819 'repository.read', 'repository.write', 'repository.admin')
819 'repository.read', 'repository.write', 'repository.admin')
820 def repo_commit_parents(self):
820 def repo_commit_parents(self):
821 commit_id = self.request.matchdict['commit_id']
821 commit_id = self.request.matchdict['commit_id']
822 self.load_default_context()
822 self.load_default_context()
823
823
824 try:
824 try:
825 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
825 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
826 parents = commit.parents
826 parents = commit.parents
827 except CommitDoesNotExistError:
827 except CommitDoesNotExistError:
828 parents = []
828 parents = []
829 result = {"results": parents}
829 result = {"results": parents}
830 return result
830 return result
@@ -1,1716 +1,1716 @@
1 # Copyright (C) 2011-2023 RhodeCode GmbH
1 # Copyright (C) 2011-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import itertools
19 import itertools
20 import logging
20 import logging
21 import os
21 import os
22 import collections
22 import collections
23 import urllib.request
23 import urllib.request
24 import urllib.parse
24 import urllib.parse
25 import urllib.error
25 import urllib.error
26 import pathlib
26 import pathlib
27 import time
27 import time
28 import random
28 import random
29
29
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
31
31
32 from pyramid.renderers import render
32 from pyramid.renderers import render
33 from pyramid.response import Response
33 from pyramid.response import Response
34
34
35 import rhodecode
35 import rhodecode
36 from rhodecode.apps._base import RepoAppView
36 from rhodecode.apps._base import RepoAppView
37
37
38
38
39 from rhodecode.lib import diffs, helpers as h, rc_cache
39 from rhodecode.lib import diffs, helpers as h, rc_cache
40 from rhodecode.lib import audit_logger
40 from rhodecode.lib import audit_logger
41 from rhodecode.lib.hash_utils import sha1_safe
41 from rhodecode.lib.hash_utils import sha1_safe
42 from rhodecode.lib.archive_cache import (
42 from rhodecode.lib.archive_cache import (
43 get_archival_cache_store, get_archival_config, ArchiveCacheGenerationLock, archive_iterator)
43 get_archival_cache_store, get_archival_config, ArchiveCacheGenerationLock, archive_iterator)
44 from rhodecode.lib.str_utils import safe_bytes, convert_special_chars
44 from rhodecode.lib.str_utils import safe_bytes, convert_special_chars
45 from rhodecode.lib.view_utils import parse_path_ref
45 from rhodecode.lib.view_utils import parse_path_ref
46 from rhodecode.lib.exceptions import NonRelativePathError
46 from rhodecode.lib.exceptions import NonRelativePathError
47 from rhodecode.lib.codeblocks import (
47 from rhodecode.lib.codeblocks import (
48 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
48 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
49 from rhodecode.lib.utils2 import convert_line_endings, detect_mode
49 from rhodecode.lib.utils2 import convert_line_endings, detect_mode
50 from rhodecode.lib.type_utils import str2bool
50 from rhodecode.lib.type_utils import str2bool
51 from rhodecode.lib.str_utils import safe_str, safe_int
51 from rhodecode.lib.str_utils import safe_str, safe_int, header_safe_str
52 from rhodecode.lib.auth import (
52 from rhodecode.lib.auth import (
53 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
53 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
54 from rhodecode.lib.vcs import path as vcspath
54 from rhodecode.lib.vcs import path as vcspath
55 from rhodecode.lib.vcs.backends.base import EmptyCommit
55 from rhodecode.lib.vcs.backends.base import EmptyCommit
56 from rhodecode.lib.vcs.conf import settings
56 from rhodecode.lib.vcs.conf import settings
57 from rhodecode.lib.vcs.nodes import FileNode
57 from rhodecode.lib.vcs.nodes import FileNode
58 from rhodecode.lib.vcs.exceptions import (
58 from rhodecode.lib.vcs.exceptions import (
59 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
59 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
60 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
60 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
61 NodeDoesNotExistError, CommitError, NodeError)
61 NodeDoesNotExistError, CommitError, NodeError)
62
62
63 from rhodecode.model.scm import ScmModel
63 from rhodecode.model.scm import ScmModel
64 from rhodecode.model.db import Repository
64 from rhodecode.model.db import Repository
65
65
66 log = logging.getLogger(__name__)
66 log = logging.getLogger(__name__)
67
67
68
68
69 def get_archive_name(db_repo_id, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
69 def get_archive_name(db_repo_id, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
70 # original backward compat name of archive
70 # original backward compat name of archive
71 clean_name = safe_str(convert_special_chars(db_repo_name).replace('/', '_'))
71 clean_name = safe_str(convert_special_chars(db_repo_name).replace('/', '_'))
72
72
73 # e.g vcsserver-id-abcd-sub-1-abcfdef-archive-all.zip
73 # e.g vcsserver-id-abcd-sub-1-abcfdef-archive-all.zip
74 # vcsserver-id-abcd-sub-0-abcfdef-COMMIT_SHA-PATH_SHA.zip
74 # vcsserver-id-abcd-sub-0-abcfdef-COMMIT_SHA-PATH_SHA.zip
75 id_sha = sha1_safe(str(db_repo_id))[:4]
75 id_sha = sha1_safe(str(db_repo_id))[:4]
76 sub_repo = 'sub-1' if subrepos else 'sub-0'
76 sub_repo = 'sub-1' if subrepos else 'sub-0'
77 commit = commit_sha if with_hash else 'archive'
77 commit = commit_sha if with_hash else 'archive'
78 path_marker = (path_sha if with_hash else '') or 'all'
78 path_marker = (path_sha if with_hash else '') or 'all'
79 archive_name = f'{clean_name}-id-{id_sha}-{sub_repo}-{commit}-{path_marker}{ext}'
79 archive_name = f'{clean_name}-id-{id_sha}-{sub_repo}-{commit}-{path_marker}{ext}'
80
80
81 return archive_name
81 return archive_name
82
82
83
83
84 def get_path_sha(at_path):
84 def get_path_sha(at_path):
85 return safe_str(sha1_safe(at_path)[:8])
85 return safe_str(sha1_safe(at_path)[:8])
86
86
87
87
88 def _get_archive_spec(fname):
88 def _get_archive_spec(fname):
89 log.debug('Detecting archive spec for: `%s`', fname)
89 log.debug('Detecting archive spec for: `%s`', fname)
90
90
91 fileformat = None
91 fileformat = None
92 ext = None
92 ext = None
93 content_type = None
93 content_type = None
94 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
94 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
95
95
96 if fname.endswith(extension):
96 if fname.endswith(extension):
97 fileformat = a_type
97 fileformat = a_type
98 log.debug('archive is of type: %s', fileformat)
98 log.debug('archive is of type: %s', fileformat)
99 ext = extension
99 ext = extension
100 break
100 break
101
101
102 if not fileformat:
102 if not fileformat:
103 raise ValueError()
103 raise ValueError()
104
104
105 # the leftover part of the whole fname is the commit id
106 commit_id = fname[:-len(ext)]
106 commit_id = fname[:-len(ext)]
107
107
108 return commit_id, ext, fileformat, content_type
108 return commit_id, ext, fileformat, content_type
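# e.g. a fname like '<sha>.tar.gz' yields ('<sha>', '.tar.gz', <archive kind>, <content type>),
# assuming '.tar.gz' is one of the extensions registered in settings.ARCHIVE_SPECS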
109
109
110
110
111 class RepoFilesView(RepoAppView):
111 class RepoFilesView(RepoAppView):
112
112
113 @staticmethod
113 @staticmethod
114 def adjust_file_path_for_svn(f_path, repo):
114 def adjust_file_path_for_svn(f_path, repo):
115 """
115 """
116 Computes the relative path of `f_path`.
116 Computes the relative path of `f_path`.
117
117
118 This is mainly based on prefix matching of the recognized tags and
118 This is mainly based on prefix matching of the recognized tags and
119 branches in the underlying repository.
119 branches in the underlying repository.
120 """
120 """
121 tags_and_branches = itertools.chain(
121 tags_and_branches = itertools.chain(
122 repo.branches.keys(),
122 repo.branches.keys(),
123 repo.tags.keys())
123 repo.tags.keys())
124 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
124 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
125
125
126 for name in tags_and_branches:
126 for name in tags_and_branches:
127 if f_path.startswith(f'{name}/'):
127 if f_path.startswith(f'{name}/'):
128 f_path = vcspath.relpath(f_path, name)
128 f_path = vcspath.relpath(f_path, name)
129 break
129 break
130 return f_path
130 return f_path
131
131
132 def load_default_context(self):
132 def load_default_context(self):
133 c = self._get_local_tmpl_context(include_app_defaults=True)
133 c = self._get_local_tmpl_context(include_app_defaults=True)
134 c.rhodecode_repo = self.rhodecode_vcs_repo
134 c.rhodecode_repo = self.rhodecode_vcs_repo
135 c.enable_downloads = self.db_repo.enable_downloads
135 c.enable_downloads = self.db_repo.enable_downloads
136 return c
136 return c
137
137
138 def _ensure_not_locked(self, commit_id='tip'):
138 def _ensure_not_locked(self, commit_id='tip'):
139 _ = self.request.translate
139 _ = self.request.translate
140
140
141 repo = self.db_repo
141 repo = self.db_repo
142 if repo.enable_locking and repo.locked[0]:
142 if repo.enable_locking and repo.locked[0]:
143 h.flash(_('This repository has been locked by %s on %s')
143 h.flash(_('This repository has been locked by %s on %s')
144 % (h.person_by_id(repo.locked[0]),
144 % (h.person_by_id(repo.locked[0]),
145 h.format_date(h.time_to_datetime(repo.locked[1]))),
145 h.format_date(h.time_to_datetime(repo.locked[1]))),
146 'warning')
146 'warning')
147 files_url = h.route_path(
147 files_url = h.route_path(
148 'repo_files:default_path',
148 'repo_files:default_path',
149 repo_name=self.db_repo_name, commit_id=commit_id)
149 repo_name=self.db_repo_name, commit_id=commit_id)
150 raise HTTPFound(files_url)
150 raise HTTPFound(files_url)
151
151
152 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
152 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
153 _ = self.request.translate
153 _ = self.request.translate
154
154
155 if not is_head:
155 if not is_head:
156 message = _('Cannot modify file. '
156 message = _('Cannot modify file. '
157 'Given commit `{}` is not head of a branch.').format(commit_id)
157 'Given commit `{}` is not head of a branch.').format(commit_id)
158 h.flash(message, category='warning')
158 h.flash(message, category='warning')
159
159
160 if json_mode:
160 if json_mode:
161 return message
161 return message
162
162
163 files_url = h.route_path(
163 files_url = h.route_path(
164 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
164 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
165 f_path=f_path)
165 f_path=f_path)
166 raise HTTPFound(files_url)
166 raise HTTPFound(files_url)
167
167
168 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
168 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
169 _ = self.request.translate
169 _ = self.request.translate
170
170
171 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
171 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
172 self.db_repo_name, branch_name)
172 self.db_repo_name, branch_name)
173 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
173 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
174 message = _('Branch `{}` changes forbidden by rule {}.').format(
174 message = _('Branch `{}` changes forbidden by rule {}.').format(
175 h.escape(branch_name), h.escape(rule))
175 h.escape(branch_name), h.escape(rule))
176 h.flash(message, 'warning')
176 h.flash(message, 'warning')
177
177
178 if json_mode:
178 if json_mode:
179 return message
179 return message
180
180
181 files_url = h.route_path(
181 files_url = h.route_path(
182 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
182 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
183
183
184 raise HTTPFound(files_url)
184 raise HTTPFound(files_url)
185
185
186 def _get_commit_and_path(self):
186 def _get_commit_and_path(self):
187 default_commit_id = self.db_repo.landing_ref_name
187 default_commit_id = self.db_repo.landing_ref_name
188 default_f_path = '/'
188 default_f_path = '/'
189
189
190 commit_id = self.request.matchdict.get(
190 commit_id = self.request.matchdict.get(
191 'commit_id', default_commit_id)
191 'commit_id', default_commit_id)
192 f_path = self._get_f_path(self.request.matchdict, default_f_path)
192 f_path = self._get_f_path(self.request.matchdict, default_f_path)
193 return commit_id, f_path
193 return commit_id, f_path
194
194
195 def _get_default_encoding(self, c):
195 def _get_default_encoding(self, c):
196 enc_list = getattr(c, 'default_encodings', [])
196 enc_list = getattr(c, 'default_encodings', [])
197 return enc_list[0] if enc_list else 'UTF-8'
197 return enc_list[0] if enc_list else 'UTF-8'
198
198
199 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
199 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
200 """
200 """
201 This is a safe way to get a commit. If an error occurs, it redirects
202 to the tip with a proper message.
203
203
204 :param commit_id: id of commit to fetch
204 :param commit_id: id of commit to fetch
205 :param redirect_after: toggle redirection
205 :param redirect_after: toggle redirection
206 """
206 """
207 _ = self.request.translate
207 _ = self.request.translate
208
208
209 try:
209 try:
210 return self.rhodecode_vcs_repo.get_commit(commit_id)
210 return self.rhodecode_vcs_repo.get_commit(commit_id)
211 except EmptyRepositoryError:
211 except EmptyRepositoryError:
212 if not redirect_after:
212 if not redirect_after:
213 return None
213 return None
214
214
215 add_new = upload_new = ""
215 add_new = upload_new = ""
216 if h.HasRepoPermissionAny(
216 if h.HasRepoPermissionAny(
217 'repository.write', 'repository.admin')(self.db_repo_name):
217 'repository.write', 'repository.admin')(self.db_repo_name):
218 _url = h.route_path(
218 _url = h.route_path(
219 'repo_files_add_file',
219 'repo_files_add_file',
220 repo_name=self.db_repo_name, commit_id=0, f_path='')
220 repo_name=self.db_repo_name, commit_id=0, f_path='')
221 add_new = h.link_to(
221 add_new = h.link_to(
222 _('add a new file'), _url, class_="alert-link")
222 _('add a new file'), _url, class_="alert-link")
223
223
224 _url_upld = h.route_path(
224 _url_upld = h.route_path(
225 'repo_files_upload_file',
225 'repo_files_upload_file',
226 repo_name=self.db_repo_name, commit_id=0, f_path='')
226 repo_name=self.db_repo_name, commit_id=0, f_path='')
227 upload_new = h.link_to(
227 upload_new = h.link_to(
228 _('upload a new file'), _url_upld, class_="alert-link")
228 _('upload a new file'), _url_upld, class_="alert-link")
229
229
230 h.flash(h.literal(
230 h.flash(h.literal(
231 _('There are no files yet. Click here to %s or %s.') % (add_new, upload_new)), category='warning')
231 _('There are no files yet. Click here to %s or %s.') % (add_new, upload_new)), category='warning')
232 raise HTTPFound(
232 raise HTTPFound(
233 h.route_path('repo_summary', repo_name=self.db_repo_name))
233 h.route_path('repo_summary', repo_name=self.db_repo_name))
234
234
235 except (CommitDoesNotExistError, LookupError) as e:
235 except (CommitDoesNotExistError, LookupError) as e:
236 msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
236 msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
237 h.flash(msg, category='error')
237 h.flash(msg, category='error')
238 raise HTTPNotFound()
238 raise HTTPNotFound()
239 except RepositoryError as e:
239 except RepositoryError as e:
240 h.flash(h.escape(safe_str(e)), category='error')
240 h.flash(h.escape(safe_str(e)), category='error')
241 raise HTTPNotFound()
241 raise HTTPNotFound()
242
242
243 def _get_filenode_or_redirect(self, commit_obj, path, pre_load=None):
243 def _get_filenode_or_redirect(self, commit_obj, path, pre_load=None):
244 """
244 """
245 Returns file_node. If an error occurs or the given path is a directory,
246 it redirects to the top-level path.
247 """
247 """
248 _ = self.request.translate
248 _ = self.request.translate
249
249
250 try:
250 try:
251 file_node = commit_obj.get_node(path, pre_load=pre_load)
251 file_node = commit_obj.get_node(path, pre_load=pre_load)
252 if file_node.is_dir():
252 if file_node.is_dir():
253 raise RepositoryError('The given path is a directory')
253 raise RepositoryError('The given path is a directory')
254 except CommitDoesNotExistError:
254 except CommitDoesNotExistError:
255 log.exception('No such commit exists for this repository')
255 log.exception('No such commit exists for this repository')
256 h.flash(_('No such commit exists for this repository'), category='error')
256 h.flash(_('No such commit exists for this repository'), category='error')
257 raise HTTPNotFound()
257 raise HTTPNotFound()
258 except RepositoryError as e:
258 except RepositoryError as e:
259 log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
259 log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
260 h.flash(h.escape(safe_str(e)), category='error')
260 h.flash(h.escape(safe_str(e)), category='error')
261 raise HTTPNotFound()
261 raise HTTPNotFound()
262
262
263 return file_node
263 return file_node
264
264
265 def _is_valid_head(self, commit_id, repo, landing_ref):
265 def _is_valid_head(self, commit_id, repo, landing_ref):
266 branch_name = sha_commit_id = ''
266 branch_name = sha_commit_id = ''
267 is_head = False
267 is_head = False
268 log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)
268 log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)
269
269
270 for _branch_name, branch_commit_id in repo.branches.items():
270 for _branch_name, branch_commit_id in repo.branches.items():
271 # simple case we pass in branch name, it's a HEAD
271 # simple case we pass in branch name, it's a HEAD
272 if commit_id == _branch_name:
272 if commit_id == _branch_name:
273 is_head = True
273 is_head = True
274 branch_name = _branch_name
274 branch_name = _branch_name
275 sha_commit_id = branch_commit_id
275 sha_commit_id = branch_commit_id
276 break
276 break
277 # case when we pass in full sha commit_id, which is a head
277 # case when we pass in full sha commit_id, which is a head
278 elif commit_id == branch_commit_id:
278 elif commit_id == branch_commit_id:
279 is_head = True
279 is_head = True
280 branch_name = _branch_name
280 branch_name = _branch_name
281 sha_commit_id = branch_commit_id
281 sha_commit_id = branch_commit_id
282 break
282 break
283
283
284 if h.is_svn(repo) and not repo.is_empty():
284 if h.is_svn(repo) and not repo.is_empty():
285 # Note: Subversion only has one head.
285 # Note: Subversion only has one head.
286 if commit_id == repo.get_commit(commit_idx=-1).raw_id:
286 if commit_id == repo.get_commit(commit_idx=-1).raw_id:
287 is_head = True
287 is_head = True
288 return branch_name, sha_commit_id, is_head
288 return branch_name, sha_commit_id, is_head
289
289
290 # checked branches, means we only need to try to get the branch/commit_sha
290 # checked branches, means we only need to try to get the branch/commit_sha
291 if repo.is_empty():
291 if repo.is_empty():
292 is_head = True
292 is_head = True
293 branch_name = landing_ref
293 branch_name = landing_ref
294 sha_commit_id = EmptyCommit().raw_id
294 sha_commit_id = EmptyCommit().raw_id
295 else:
295 else:
296 commit = repo.get_commit(commit_id=commit_id)
296 commit = repo.get_commit(commit_id=commit_id)
297 if commit:
297 if commit:
298 branch_name = commit.branch
298 branch_name = commit.branch
299 sha_commit_id = commit.raw_id
299 sha_commit_id = commit.raw_id
300
300
301 return branch_name, sha_commit_id, is_head
301 return branch_name, sha_commit_id, is_head
302
302
303 def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):
303 def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):
304
304
305 repo_id = self.db_repo.repo_id
305 repo_id = self.db_repo.repo_id
306 force_recache = self.get_recache_flag()
306 force_recache = self.get_recache_flag()
307
307
308 cache_seconds = safe_int(
308 cache_seconds = safe_int(
309 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
309 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
310 cache_on = not force_recache and cache_seconds > 0
310 cache_on = not force_recache and cache_seconds > 0
311 log.debug(
311 log.debug(
312 'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s`'
312 'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s`'
313 ' with caching: %s [TTL: %ss]' % (
314 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
314 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
315
315
316 cache_namespace_uid = f'repo.{rc_cache.FILE_TREE_CACHE_VER}.{repo_id}'
316 cache_namespace_uid = f'repo.{rc_cache.FILE_TREE_CACHE_VER}.{repo_id}'
317 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
317 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
318
318
319 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
319 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
320 def compute_file_tree(_name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
320 def compute_file_tree(_name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
321 log.debug('Generating cached file tree for repo_id: %s, %s, %s',
322 _repo_id, _commit_id, _f_path)
322 _repo_id, _commit_id, _f_path)
323
323
324 c.full_load = _full_load
324 c.full_load = _full_load
325 return render(
325 return render(
326 'rhodecode:templates/files/files_browser_tree.mako',
326 'rhodecode:templates/files/files_browser_tree.mako',
327 self._get_template_context(c), self.request, _at_rev)
327 self._get_template_context(c), self.request, _at_rev)
328
328
329 return compute_file_tree(
329 return compute_file_tree(
330 self.db_repo.repo_name_hash, self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)
330 self.db_repo.repo_name_hash, self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)
331
331
332 def create_pure_path(self, *parts):
332 def create_pure_path(self, *parts):
333 # Split paths and sanitize them, removing any ../ etc
333 # Split paths and sanitize them, removing any ../ etc
334 sanitized_path = [
334 sanitized_path = [
335 x for x in pathlib.PurePath(*parts).parts
335 x for x in pathlib.PurePath(*parts).parts
336 if x not in ['.', '..']]
336 if x not in ['.', '..']]
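# e.g. parts ('a', '..', 'b') are reduced to ('a', 'b'), so relative
# segments cannot escape the intended base path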
337
337
338 pure_path = pathlib.PurePath(*sanitized_path)
338 pure_path = pathlib.PurePath(*sanitized_path)
339 return pure_path
339 return pure_path
340
340
341 def _is_lf_enabled(self, target_repo):
341 def _is_lf_enabled(self, target_repo):
342 lf_enabled = False
342 lf_enabled = False
343
343
344 lf_key_for_vcs_map = {
344 lf_key_for_vcs_map = {
345 'hg': 'extensions_largefiles',
345 'hg': 'extensions_largefiles',
346 'git': 'vcs_git_lfs_enabled'
346 'git': 'vcs_git_lfs_enabled'
347 }
347 }
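# maps the repo type to the vcs setting that toggles large-file support
# (Mercurial largefiles extension vs. Git LFS)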
348
348
349 lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)
349 lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)
350
350
351 if lf_key_for_vcs:
351 if lf_key_for_vcs:
352 lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)
352 lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)
353
353
354 return lf_enabled
354 return lf_enabled
355
355
356 @LoginRequired()
356 @LoginRequired()
357 @HasRepoPermissionAnyDecorator(
357 @HasRepoPermissionAnyDecorator(
358 'repository.read', 'repository.write', 'repository.admin')
358 'repository.read', 'repository.write', 'repository.admin')
359 def repo_archivefile(self):
359 def repo_archivefile(self):
360 # archive cache config
360 # archive cache config
361 from rhodecode import CONFIG
361 from rhodecode import CONFIG
362 _ = self.request.translate
362 _ = self.request.translate
363 self.load_default_context()
363 self.load_default_context()
364 default_at_path = '/'
364 default_at_path = '/'
365 fname = self.request.matchdict['fname']
365 fname = self.request.matchdict['fname']
366 subrepos = self.request.GET.get('subrepos') == 'true'
366 subrepos = self.request.GET.get('subrepos') == 'true'
367 with_hash = str2bool(self.request.GET.get('with_hash', '1'))
367 with_hash = str2bool(self.request.GET.get('with_hash', '1'))
368 at_path = self.request.GET.get('at_path') or default_at_path
368 at_path = self.request.GET.get('at_path') or default_at_path
369
369
370 if not self.db_repo.enable_downloads:
370 if not self.db_repo.enable_downloads:
371 return Response(_('Downloads disabled'))
371 return Response(_('Downloads disabled'))
372
372
373 try:
373 try:
374 commit_id, ext, fileformat, content_type = \
374 commit_id, ext, fileformat, content_type = \
375 _get_archive_spec(fname)
375 _get_archive_spec(fname)
376 except ValueError:
376 except ValueError:
377 return Response(_('Unknown archive type for: `{}`').format(
377 return Response(_('Unknown archive type for: `{}`').format(
378 h.escape(fname)))
378 h.escape(fname)))
379
379
380 try:
380 try:
381 commit = self.rhodecode_vcs_repo.get_commit(commit_id)
381 commit = self.rhodecode_vcs_repo.get_commit(commit_id)
382 except CommitDoesNotExistError:
382 except CommitDoesNotExistError:
383 return Response(_('Unknown commit_id {}').format(
383 return Response(_('Unknown commit_id {}').format(
384 h.escape(commit_id)))
384 h.escape(commit_id)))
385 except EmptyRepositoryError:
385 except EmptyRepositoryError:
386 return Response(_('Empty repository'))
386 return Response(_('Empty repository'))
387
387
388 # we used a ref, or a shorter version; let's redirect the client to use the explicit hash
389 if commit_id != commit.raw_id:
389 if commit_id != commit.raw_id:
390 fname=f'{commit.raw_id}{ext}'
390 fname=f'{commit.raw_id}{ext}'
391 raise HTTPFound(self.request.current_route_path(fname=fname))
391 raise HTTPFound(self.request.current_route_path(fname=fname))
392
392
393 try:
393 try:
394 at_path = commit.get_node(at_path).path or default_at_path
394 at_path = commit.get_node(at_path).path or default_at_path
395 except Exception:
395 except Exception:
396 return Response(_('No node at path {} for this repository').format(h.escape(at_path)))
396 return Response(_('No node at path {} for this repository').format(h.escape(at_path)))
397
397
398 path_sha = get_path_sha(at_path)
398 path_sha = get_path_sha(at_path)
399
399
400 # used for cache etc, consistent unique archive name
400 # used for cache etc, consistent unique archive name
401 archive_name_key = get_archive_name(
401 archive_name_key = get_archive_name(
402 self.db_repo.repo_id, self.db_repo_name, commit_sha=commit.short_id, ext=ext, subrepos=subrepos,
402 self.db_repo.repo_id, self.db_repo_name, commit_sha=commit.short_id, ext=ext, subrepos=subrepos,
403 path_sha=path_sha, with_hash=True)
403 path_sha=path_sha, with_hash=True)
404
404
405 if not with_hash:
405 if not with_hash:
406 path_sha = ''
406 path_sha = ''
407
407
408 # what end client gets served
408 # what end client gets served
409 response_archive_name = get_archive_name(
409 response_archive_name = get_archive_name(
410 self.db_repo.repo_id, self.db_repo_name, commit_sha=commit.short_id, ext=ext, subrepos=subrepos,
410 self.db_repo.repo_id, self.db_repo_name, commit_sha=commit.short_id, ext=ext, subrepos=subrepos,
411 path_sha=path_sha, with_hash=with_hash)
411 path_sha=path_sha, with_hash=with_hash)
412
412
413 # remove extension from our archive directory name
413 # remove extension from our archive directory name
414 archive_dir_name = response_archive_name[:-len(ext)]
414 archive_dir_name = response_archive_name[:-len(ext)]
415
415
416 archive_cache_disable = self.request.GET.get('no_cache')
416 archive_cache_disable = self.request.GET.get('no_cache')
417
417
418 d_cache = get_archival_cache_store(config=CONFIG)
418 d_cache = get_archival_cache_store(config=CONFIG)
419
419
420 # NOTE: we get the config to pass to a call to lazy-init the SAME type of cache on vcsserver
420 # NOTE: we get the config to pass to a call to lazy-init the SAME type of cache on vcsserver
421 d_cache_conf = get_archival_config(config=CONFIG)
421 d_cache_conf = get_archival_config(config=CONFIG)
422
422
423 # This is also a cache key, and lock key
423 # This is also a cache key, and lock key
424 reentrant_lock_key = archive_name_key + '.lock'
424 reentrant_lock_key = archive_name_key + '.lock'
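# a concurrent request for the same archive either acquires this lock or
# receives the 307 retry response below, so each archive is generated only once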
425
425
426 use_cached_archive = False
426 use_cached_archive = False
427 if not archive_cache_disable and archive_name_key in d_cache:
427 if not archive_cache_disable and archive_name_key in d_cache:
428 reader, metadata = d_cache.fetch(archive_name_key)
428 reader, metadata = d_cache.fetch(archive_name_key)
429
429
430 use_cached_archive = True
430 use_cached_archive = True
431 log.debug('Found cached archive as key=%s tag=%s, serving archive from cache reader=%s',
431 log.debug('Found cached archive as key=%s tag=%s, serving archive from cache reader=%s',
432 archive_name_key, metadata, reader.name)
432 archive_name_key, metadata, reader.name)
433 else:
433 else:
434 reader = None
434 reader = None
435 log.debug('Archive with key=%s is not yet cached, creating one now...', archive_name_key)
435 log.debug('Archive with key=%s is not yet cached, creating one now...', archive_name_key)
436
436
437 if not reader:
437 if not reader:
438 # generate new archive, as previous was not found in the cache
438 # generate new archive, as previous was not found in the cache
439 try:
439 try:
440 with d_cache.get_lock(reentrant_lock_key):
440 with d_cache.get_lock(reentrant_lock_key):
441 try:
441 try:
442 commit.archive_repo(archive_name_key, archive_dir_name=archive_dir_name,
442 commit.archive_repo(archive_name_key, archive_dir_name=archive_dir_name,
443 kind=fileformat, subrepos=subrepos,
443 kind=fileformat, subrepos=subrepos,
444 archive_at_path=at_path, cache_config=d_cache_conf)
444 archive_at_path=at_path, cache_config=d_cache_conf)
445 except ImproperArchiveTypeError:
445 except ImproperArchiveTypeError:
446 return _('Unknown archive type')
446 return _('Unknown archive type')
447
447
448 except ArchiveCacheGenerationLock:
448 except ArchiveCacheGenerationLock:
449 retry_after = round(random.uniform(0.3, 3.0), 1)
449 retry_after = round(random.uniform(0.3, 3.0), 1)
450 time.sleep(retry_after)
450 time.sleep(retry_after)
451
451
452 location = self.request.url
452 location = self.request.url
453 response = Response(
453 response = Response(
454 f"archive {archive_name_key} generation in progress, Retry-After={retry_after}, Location={location}"
454 f"archive {archive_name_key} generation in progress, Retry-After={retry_after}, Location={location}"
455 )
455 )
456 response.headers["Retry-After"] = str(retry_after)
456 response.headers["Retry-After"] = str(retry_after)
457 response.status_code = 307 # temporary redirect
457 response.status_code = 307 # temporary redirect
458
458
459 response.location = location
459 response.location = location
460 return response
460 return response
461
461
462 reader, metadata = d_cache.fetch(archive_name_key, retry=True, retry_attempts=30)
462 reader, metadata = d_cache.fetch(archive_name_key, retry=True, retry_attempts=30)
463
463
464 response = Response(app_iter=archive_iterator(reader))
464 response = Response(app_iter=archive_iterator(reader))
465 response.content_disposition = f'attachment; filename={response_archive_name}'
465 response.content_disposition = f'attachment; filename={response_archive_name}'
466 response.content_type = str(content_type)
466 response.content_type = str(content_type)
467
467
468 try:
468 try:
469 return response
469 return response
470 finally:
470 finally:
471 # store download action
471 # store download action
472 audit_logger.store_web(
472 audit_logger.store_web(
473 'repo.archive.download', action_data={
473 'repo.archive.download', action_data={
474 'user_agent': self.request.user_agent,
474 'user_agent': self.request.user_agent,
475 'archive_name': archive_name_key,
475 'archive_name': archive_name_key,
476 'archive_spec': fname,
476 'archive_spec': fname,
477 'archive_cached': use_cached_archive},
477 'archive_cached': use_cached_archive},
478 user=self._rhodecode_user,
478 user=self._rhodecode_user,
479 repo=self.db_repo,
479 repo=self.db_repo,
480 commit=True
480 commit=True
481 )
481 )
482
482
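    # Illustrative sketch (not part of this view, added for clarity): a client
    # that honours the 307 + Retry-After contract used above while the archive
    # is still being generated could poll like this. The `archive_url` name and
    # the use of `requests` are assumptions for the example only.
    #
    #   import time
    #   import requests
    #
    #   resp = requests.get(archive_url, allow_redirects=False)
    #   while resp.status_code == 307:
    #       time.sleep(float(resp.headers.get('Retry-After', 1)))
    #       resp = requests.get(resp.headers['Location'], allow_redirects=False)
    #   # resp now streams the finished archive (or a cached copy)
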
    def _get_file_node(self, commit_id, f_path):
        if commit_id not in ['', None, 'None', '0' * 12, '0' * 40]:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
            try:
                node = commit.get_node(f_path)
                if node.is_dir():
                    raise NodeError(f'{node} path is a {type(node)} not a file')
            except NodeDoesNotExistError:
                commit = EmptyCommit(
                    commit_id=commit_id,
                    idx=commit.idx,
                    repo=commit.repository,
                    alias=commit.repository.alias,
                    message=commit.message,
                    author=commit.author,
                    date=commit.date)
                node = FileNode(safe_bytes(f_path), b'', commit=commit)
        else:
            commit = EmptyCommit(
                repo=self.rhodecode_vcs_repo,
                alias=self.rhodecode_vcs_repo.alias)
            node = FileNode(safe_bytes(f_path), b'', commit=commit)
        return node

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff(self):
        c = self.load_default_context()
        f_path = self._get_f_path(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        path1, diff1 = parse_path_ref(diff1, default_path=f_path)

        ignore_whitespace = str2bool(self.request.GET.get('ignorews'))
        line_context = self.request.GET.get('context', 3)

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        c.action = self.request.GET.get('diff')
        if c.action not in ['download', 'raw']:
            compare_url = h.route_path(
                'repo_compare',
                repo_name=self.db_repo_name,
                source_ref_type='rev',
                source_ref=diff1,
                target_repo=self.db_repo_name,
                target_ref_type='rev',
                target_ref=diff2,
                _query=dict(f_path=f_path))
            # redirect to new view if we render diff
            raise HTTPFound(compare_url)

        try:
            node1 = self._get_file_node(diff1, path1)
            node2 = self._get_file_node(diff2, f_path)
        except (RepositoryError, NodeError):
            log.exception("Exception while trying to get node from repository")
            raise HTTPFound(
                h.route_path('repo_files', repo_name=self.db_repo_name,
                             commit_id='tip', f_path=f_path))

        if all(isinstance(node.commit, EmptyCommit)
               for node in (node1, node2)):
            raise HTTPNotFound()

        c.commit_1 = node1.commit
        c.commit_2 = node2.commit

        if c.action == 'download':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            # NOTE: this was using diff_format='gitdiff'
            diff = diffs.DiffProcessor(_diff, diff_format='newdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            response.content_disposition = (
                f'attachment; filename={f_path}_{diff1}_vs_{diff2}.diff'
            )
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        elif c.action == 'raw':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            # NOTE: this was using diff_format='gitdiff'
            diff = diffs.DiffProcessor(_diff, diff_format='newdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        # in case we ever end up here
        raise HTTPNotFound()

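    # Usage note (URL layout assumed for illustration): `repo_files_diff` is
    # driven purely by GET parameters, e.g.
    #
    #   /{repo_name}/diff/{f_path}?diff1=REF_A&diff2=REF_B&diff=download
    #
    # `diff=download` or `diff=raw` returns the patch directly; any other value
    # redirects to the `repo_compare` view as shown above.
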
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff_2way_redirect(self):
        """
        Kept only to make OLD links work
        """
        f_path = self._get_f_path_unchecked(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        compare_url = h.route_path(
            'repo_compare',
            repo_name=self.db_repo_name,
            source_ref_type='rev',
            source_ref=diff1,
            target_ref_type='rev',
            target_ref=diff2,
            _query=dict(f_path=f_path, diffmode='sideside',
                        target_repo=self.db_repo_name,))
        raise HTTPFound(compare_url)

    @LoginRequired()
    def repo_files_default_commit_redirect(self):
        """
        Special page that redirects to the landing page of files based on the default
        commit for the repository
        """
        c = self.load_default_context()
        ref_name = c.rhodecode_db_repo.landing_ref_name
        landing_url = h.repo_files_by_ref_url(
            c.rhodecode_db_repo.repo_name,
            c.rhodecode_db_repo.repo_type,
            f_path='',
            ref_name=ref_name,
            commit_id='tip',
            query=dict(at=ref_name)
        )

        raise HTTPFound(landing_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files(self):
        c = self.load_default_context()

        view_name = getattr(self.request.matched_route, 'name', None)

        c.annotate = view_name == 'repo_files:annotated'
        # default is false, but .rst/.md files later are auto rendered, we can
        # overwrite auto rendering by setting this GET flag
        c.renderer = view_name == 'repo_files:rendered' or not self.request.GET.get('no-render', False)

        commit_id, f_path = self._get_commit_and_path()

        c.commit = self._get_commit_or_redirect(commit_id)
        c.branch = self.request.GET.get('branch', None)
        c.f_path = f_path
        at_rev = self.request.GET.get('at')

        # files or dirs
        try:
            c.file = c.commit.get_node(f_path, pre_load=['is_binary', 'size', 'data'])

            c.file_author = True
            c.file_tree = ''

            # prev link
            try:
                prev_commit = c.commit.prev(c.branch)
                c.prev_commit = prev_commit
                c.url_prev = h.route_path(
                    'repo_files', repo_name=self.db_repo_name,
                    commit_id=prev_commit.raw_id, f_path=f_path)
                if c.branch:
                    c.url_prev += '?branch=%s' % c.branch
            except (CommitDoesNotExistError, VCSError):
                c.url_prev = '#'
                c.prev_commit = EmptyCommit()

            # next link
            try:
                next_commit = c.commit.next(c.branch)
                c.next_commit = next_commit
                c.url_next = h.route_path(
                    'repo_files', repo_name=self.db_repo_name,
                    commit_id=next_commit.raw_id, f_path=f_path)
                if c.branch:
                    c.url_next += '?branch=%s' % c.branch
            except (CommitDoesNotExistError, VCSError):
                c.url_next = '#'
                c.next_commit = EmptyCommit()

            # load file content
            if c.file.is_file():

                c.lf_node = {}

                has_lf_enabled = self._is_lf_enabled(self.db_repo)
                if has_lf_enabled:
                    c.lf_node = c.file.get_largefile_node()

                c.file_source_page = 'true'
                c.file_last_commit = c.file.last_commit

                c.file_size_too_big = c.file.size > c.visual.cut_off_limit_file

                if not (c.file_size_too_big or c.file.is_binary):
                    if c.annotate:  # annotation has precedence over renderer
                        c.annotated_lines = filenode_as_annotated_lines_tokens(
                            c.file
                        )
                    else:
                        c.renderer = (
                            c.renderer and h.renderer_from_filename(c.file.path)
                        )
                        if not c.renderer:
                            c.lines = filenode_as_lines_tokens(c.file)

                _branch_name, _sha_commit_id, is_head = \
                    self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                        landing_ref=self.db_repo.landing_ref_name)
                c.on_branch_head = is_head

                branch = c.commit.branch if (
                    c.commit.branch and '/' not in c.commit.branch) else None
                c.branch_or_raw_id = branch or c.commit.raw_id
                c.branch_name = c.commit.branch or h.short_id(c.commit.raw_id)

                author = c.file_last_commit.author
                c.authors = [[
                    h.email(author),
                    h.person(author, 'username_or_name_or_email'),
                    1
                ]]

            else:  # load tree content at path
                c.file_source_page = 'false'
                c.authors = []
                # this loads a simple tree without metadata to speed things up
                # later via ajax we call repo_nodetree_full and fetch whole
                c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path, at_rev=at_rev)

            c.readme_data, c.readme_file = \
                self._get_readme_data(self.db_repo, c.visual.default_renderer,
                                      c.commit.raw_id, f_path)

        except RepositoryError as e:
            h.flash(h.escape(safe_str(e)), category='error')
            raise HTTPNotFound()

        if self.request.environ.get('HTTP_X_PJAX'):
            html = render('rhodecode:templates/files/files_pjax.mako',
                          self._get_template_context(c), self.request)
        else:
            html = render('rhodecode:templates/files/files.mako',
                          self._get_template_context(c), self.request)
        return Response(html)

    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_annotated_previous(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        prev_commit_id = commit.raw_id
        line_anchor = self.request.GET.get('line_anchor')
        is_file = False
        try:
            _file = commit.get_node(f_path)
            is_file = _file.is_file()
        except (NodeDoesNotExistError, CommitDoesNotExistError, VCSError):
            pass

        if is_file:
            history = commit.get_path_history(f_path)
            prev_commit_id = history[1].raw_id \
                if len(history) > 1 else prev_commit_id
        prev_url = h.route_path(
            'repo_files:annotated', repo_name=self.db_repo_name,
            commit_id=prev_commit_id, f_path=f_path,
            _anchor=f'L{line_anchor}')

        raise HTTPFound(prev_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodetree_full(self):
        """
        Returns rendered html of file tree that contains commit date,
        author, commit_id for the specified combination of
        repo, commit_id and file path
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        try:
            dir_node = commit.get_node(f_path)
        except RepositoryError as e:
            return Response(f'error: {h.escape(safe_str(e))}')

        if dir_node.is_file():
            return Response('')

        c.file = dir_node
        c.commit = commit
        at_rev = self.request.GET.get('at')

        html = self._get_tree_at_commit(
            c, commit.raw_id, dir_node.path, full_load=True, at_rev=at_rev)

        return Response(html)

    def _get_attachement_headers(self, f_path):
        f_name = safe_str(f_path.split(Repository.NAME_SEP)[-1])
        safe_path = f_name.replace('"', '\\"')
        encoded_path = urllib.parse.quote(f_name)

        headers = "attachment; " \
                  "filename=\"{}\"; " \
                  "filename*=UTF-8\'\'{}".format(safe_path, encoded_path)

        return header_safe_str(headers)

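    # A minimal sketch of what `_get_attachement_headers` produces (values are
    # illustrative): both the quoted `filename` and the RFC 5987 `filename*`
    # parameters are emitted, so older and newer browsers can each pick the
    # form they understand.
    #
    #   import urllib.parse
    #   f_name = 'raport zimowy.pdf'
    #   header = 'attachment; filename="{}"; filename*=UTF-8\'\'{}'.format(
    #       f_name.replace('"', '\\"'), urllib.parse.quote(f_name))
    #   # -> attachment; filename="raport zimowy.pdf"; filename*=UTF-8''raport%20zimowy.pdf
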
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_raw(self):
        """
        Action for "show as raw"; some mimetypes (e.g. images, icons, PDFs)
        are rendered inline instead of being downloaded.
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        raw_mimetype_mapping = {
            # map original mimetype to a mimetype used for "show as raw"
            # you can also provide a content-disposition to override the
            # default "attachment" disposition.
            # orig_type: (new_type, new_dispo)

            # show images inline:
            # Do not re-add SVG: it is unsafe and permits XSS attacks. One can
            # for example render an SVG with javascript inside or even render
            # HTML.
            'image/x-icon': ('image/x-icon', 'inline'),
            'image/png': ('image/png', 'inline'),
            'image/gif': ('image/gif', 'inline'),
            'image/jpeg': ('image/jpeg', 'inline'),
            'application/pdf': ('application/pdf', 'inline'),
        }

        mimetype = file_node.mimetype
        try:
            mimetype, disposition = raw_mimetype_mapping[mimetype]
        except KeyError:
            # we don't know anything special about this, handle it safely
            if file_node.is_binary:
                # do same as download raw for binary files
                mimetype, disposition = 'application/octet-stream', 'attachment'
            else:
                # do not just use the original mimetype, but force text/plain,
                # otherwise it would serve text/html and that might be unsafe.
                # Note: underlying vcs library fakes text/plain mimetype if the
                # mimetype can not be determined and it thinks it is not
                # binary. This might lead to erroneous text display in some
                # cases, but helps in other cases, like with text files
                # without extension.
                mimetype, disposition = 'text/plain', 'inline'

        if disposition == 'attachment':
            disposition = self._get_attachement_headers(f_path)

        stream_content = file_node.stream_bytes()

        response = Response(app_iter=stream_content)
        response.content_disposition = disposition
        response.content_type = mimetype

        charset = self._get_default_encoding(c)
        if charset:
            response.charset = charset

        return response

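    # Illustrative sketch of the fallback logic in `repo_file_raw` above (the
    # helper name is hypothetical, only the decisions are mirrored):
    #
    #   def _raw_disposition(mimetype, is_binary):
    #       known = {'image/png': ('image/png', 'inline')}  # see mapping above
    #       if mimetype in known:
    #           return known[mimetype]
    #       if is_binary:
    #           return 'application/octet-stream', 'attachment'
    #       return 'text/plain', 'inline'  # never serve repository content as text/html
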
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_download(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if self.request.GET.get('lf'):
            # only if lf get flag is passed, we download this file
            # as LFS/Largefile
            lf_node = file_node.get_largefile_node()
            if lf_node:
                # overwrite our pointer with the REAL large-file
                file_node = lf_node

        disposition = self._get_attachement_headers(f_path)

        stream_content = file_node.stream_bytes()

        response = Response(app_iter=stream_content)
        response.content_disposition = disposition
        response.content_type = file_node.mimetype

        charset = self._get_default_encoding(c)
        if charset:
            response.charset = charset

        return response

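    # Usage note (route layout assumed for illustration): appending `?lf=1` to
    # the file download URL makes `repo_file_download` resolve the stored
    # pointer file to the real LFS/largefile object before streaming, e.g.
    #
    #   /{repo_name}/download/{commit_id}/{f_path}?lf=1
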
    def _get_nodelist_at_commit(self, repo_name, repo_id, commit_id, f_path):

        cache_seconds = safe_int(
            rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
        cache_on = cache_seconds > 0
        log.debug(
            'Computing FILE SEARCH for repo_id %s commit_id `%s` and path `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, commit_id, f_path, cache_on, cache_seconds or 0))

        cache_namespace_uid = f'repo.{repo_id}'
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
        def compute_file_search(_name_hash, _repo_id, _commit_id, _f_path):
            log.debug('Generating cached nodelist for repo_id:%s, %s, %s',
                      _repo_id, commit_id, f_path)
            try:
                _d, _f = ScmModel().get_quick_filter_nodes(repo_name, _commit_id, _f_path)
            except (RepositoryError, CommitDoesNotExistError, Exception) as e:
                log.exception(safe_str(e))
                h.flash(h.escape(safe_str(e)), category='error')
                raise HTTPFound(h.route_path(
                    'repo_files', repo_name=self.db_repo_name,
                    commit_id='tip', f_path='/'))

            return _d + _f

        result = compute_file_search(self.db_repo.repo_name_hash, self.db_repo.repo_id,
                                     commit_id, f_path)
        return filter(lambda n: self.path_filter.path_access_allowed(n['name']), result)

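    # A minimal sketch of the conditional caching pattern used in
    # `_get_nodelist_at_commit` (names are illustrative): the decorated function
    # is only memoised when a positive TTL is configured, and its arguments form
    # the cache key.
    #
    #   @region.conditional_cache_on_arguments(namespace=ns, condition=cache_on)
    #   def expensive(_key_part_1, _key_part_2):
    #       return compute(_key_part_1, _key_part_2)
    #
    #   value = expensive('k1', 'k2')  # cached only while cache_on is True
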
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodelist(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)

        metadata = self._get_nodelist_at_commit(
            self.db_repo_name, self.db_repo.repo_id, commit.raw_id, f_path)
        return {'nodes': [x for x in metadata]}

    def _create_references(self, branches_or_tags, symbolic_reference, f_path, ref_type):
        items = []
        for name, commit_id in branches_or_tags.items():
            sym_ref = symbolic_reference(commit_id, name, f_path, ref_type)
            items.append((sym_ref, name, ref_type))
        return items

    def _symbolic_reference(self, commit_id, name, f_path, ref_type):
        return commit_id

    def _symbolic_reference_svn(self, commit_id, name, f_path, ref_type):
        return commit_id

        # NOTE(dan): old code we used in "diff" mode compare
        new_f_path = vcspath.join(name, f_path)
        return f'{new_f_path}@{commit_id}'

    def _get_node_history(self, commit_obj, f_path, commits=None):
        """
        get commit history for given node

        :param commit_obj: commit to calculate history
        :param f_path: path for node to calculate history for
        :param commits: if passed don't calculate history and take
            commits defined in this list
        """
        _ = self.request.translate

        # calculate history based on tip
        tip = self.rhodecode_vcs_repo.get_commit()
        if commits is None:
            pre_load = ["author", "branch"]
            try:
                commits = tip.get_path_history(f_path, pre_load=pre_load)
            except (NodeDoesNotExistError, CommitError):
                # this node is not present at tip!
                commits = commit_obj.get_path_history(f_path, pre_load=pre_load)

        history = []
        commits_group = ([], _("Changesets"))
        for commit in commits:
            branch = ' (%s)' % commit.branch if commit.branch else ''
            n_desc = f'r{commit.idx}:{commit.short_id}{branch}'
            commits_group[0].append((commit.raw_id, n_desc, 'sha'))
        history.append(commits_group)

        symbolic_reference = self._symbolic_reference

        if self.rhodecode_vcs_repo.alias == 'svn':
            adjusted_f_path = RepoFilesView.adjust_file_path_for_svn(
                f_path, self.rhodecode_vcs_repo)
            if adjusted_f_path != f_path:
                log.debug(
                    'Recognized svn tag or branch in file "%s", using svn '
                    'specific symbolic references', f_path)
                f_path = adjusted_f_path
                symbolic_reference = self._symbolic_reference_svn

        branches = self._create_references(
            self.rhodecode_vcs_repo.branches, symbolic_reference, f_path, 'branch')
        branches_group = (branches, _("Branches"))

        tags = self._create_references(
            self.rhodecode_vcs_repo.tags, symbolic_reference, f_path, 'tag')
        tags_group = (tags, _("Tags"))

        history.append(branches_group)
        history.append(tags_group)

        return history, commits

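    # Shape of the value returned by `_get_node_history` (illustrative ids and
    # names): a list of (entries, label) groups plus the raw commit iterable.
    #
    #   history = [
    #       ([('9fe2a1b0c3d4...', 'r12:9fe2a1b0c3d4', 'sha'), ...], 'Changesets'),
    #       ([('9fe2a1b0c3d4...', 'default', 'branch'), ...], 'Branches'),
    #       ([('77ab44ccee00...', 'v1.0', 'tag'), ...], 'Tags'),
    #   ]
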
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_history(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if file_node.is_file():
            file_history, _hist = self._get_node_history(commit, f_path)

            res = []
            for section_items, section in file_history:
                items = []
                for obj_id, obj_text, obj_type in section_items:
                    at_rev = ''
                    if obj_type in ['branch', 'bookmark', 'tag']:
                        at_rev = obj_text
                    entry = {
                        'id': obj_id,
                        'text': obj_text,
                        'type': obj_type,
                        'at_rev': at_rev
                    }

                    items.append(entry)

                res.append({
                    'text': section,
                    'children': items
                })

            data = {
                'more': False,
                'results': res
            }
            return data

        log.warning('Cannot fetch history for directory')
        raise HTTPBadRequest()

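    # The dict built above follows a select2-style shape (`more`/`results` with
    # `children`); an illustrative JSON response:
    #
    #   {"more": false,
    #    "results": [
    #        {"text": "Changesets",
    #         "children": [{"id": "9fe2a1b0c3d4...", "text": "r12:9fe2a1b0c3d4",
    #                       "type": "sha", "at_rev": ""}]}]}
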
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_authors(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if not file_node.is_file():
            raise HTTPBadRequest()

        c.file_last_commit = file_node.last_commit
        if self.request.GET.get('annotate') == '1':
            # use _hist from annotation if annotation mode is on
            commit_ids = {x[1] for x in file_node.annotate}
            _hist = (
                self.rhodecode_vcs_repo.get_commit(commit_id)
                for commit_id in commit_ids)
        else:
            _f_history, _hist = self._get_node_history(commit, f_path)
        c.file_author = False

        unique = collections.OrderedDict()
        for commit in _hist:
            author = commit.author
            if author not in unique:
                unique[commit.author] = [
                    h.email(author),
                    h.person(author, 'username_or_name_or_email'),
                    1  # counter
                ]

            else:
                # increase counter
                unique[commit.author][2] += 1

        c.authors = [val for val in unique.values()]

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_check_head(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        new_path = self.request.POST.get('path')
        operation = self.request.POST.get('operation')
        path_exist = ''

        if new_path and operation in ['create', 'upload']:
            new_f_path = os.path.join(f_path.lstrip('/'), new_path)
            try:
                commit_obj = self.rhodecode_vcs_repo.get_commit(commit_id)
                # NOTE(dan): construct whole path without leading /
                file_node = commit_obj.get_node(new_f_path)
                if file_node is not None:
                    path_exist = new_f_path
            except EmptyRepositoryError:
                pass
            except Exception:
                pass

        return {
            'branch': _branch_name,
            'sha': _sha_commit_id,
            'is_head': is_head,
            'path_exists': path_exist
        }

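    # Illustrative request/response for the head-check endpoint above (field
    # values are examples): POSTing `path=new_dir/file.txt` and
    # `operation=create` returns something like
    #
    #   {"branch": "default", "sha": "9fe2a1b0c3d4",
    #    "is_head": true, "path_exists": ""}
    #
    # where a non-empty `path_exists` means the proposed path already exists at
    # that commit.
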
    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_remove_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path)
        self.check_branch_permission(_branch_name)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        c.default_message = _(
            'Deleted file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_files_delete_file(self):
        _ = self.request.translate

        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path)
        self.check_branch_permission(_branch_name)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        c.default_message = _(
            'Deleted file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path
        node_path = f_path
        author = self._rhodecode_db_user.full_contact
        message = self.request.POST.get('message') or c.default_message
        try:
            nodes = {
                safe_bytes(node_path): {
                    'content': b''
                }
            }
            ScmModel().delete_nodes(
                user=self._rhodecode_db_user.user_id, repo=self.db_repo,
                message=message,
                nodes=nodes,
                parent_commit=c.commit,
                author=author,
            )

            h.flash(
                _('Successfully deleted file `{}`').format(
                    h.escape(f_path)), category='success')
        except Exception:
            log.exception('Error during commit operation')
            h.flash(_('Error occurred during commit'), category='error')
        raise HTTPFound(
            h.route_path('repo_commit', repo_name=self.db_repo_name,
                         commit_id='tip'))

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_edit_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path, commit_id=commit_id)
        self.check_branch_permission(_branch_name, commit_id=commit_id)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        if c.file.is_binary:
            files_url = h.route_path(
                'repo_files',
                repo_name=self.db_repo_name,
                commit_id=c.commit.raw_id, f_path=f_path)
            raise HTTPFound(files_url)

        c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_files_update_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        if c.file.is_binary:
            raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name,
                                         commit_id=c.commit.raw_id, f_path=f_path))

        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path, commit_id=commit_id)
        self.check_branch_permission(_branch_name, commit_id=commit_id)

        c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        old_content = c.file.str_content
        sl = old_content.splitlines(1)
        first_line = sl[0] if sl else ''

        r_post = self.request.POST
        # line endings: 0 - Unix, 1 - Mac, 2 - DOS
        line_ending_mode = detect_mode(first_line, 0)
        content = convert_line_endings(r_post.get('content', ''), line_ending_mode)

        message = r_post.get('message') or c.default_message

        org_node_path = c.file.str_path
        filename = r_post['filename']

        root_path = c.file.dir_path
        pure_path = self.create_pure_path(root_path, filename)
        node_path = pure_path.as_posix()

        default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name,
                                            commit_id=commit_id)
        if content == old_content and node_path == org_node_path:
            h.flash(_('No changes detected on {}').format(h.escape(org_node_path)),
                    category='warning')
            raise HTTPFound(default_redirect_url)

        try:
            mapping = {
                c.file.bytes_path: {
                    'org_filename': org_node_path,
                    'filename': safe_bytes(node_path),
                    'content': safe_bytes(content),
                    'lexer': '',
                    'op': 'mod',
                    'mode': c.file.mode
                }
            }

            commit = ScmModel().update_nodes(
                user=self._rhodecode_db_user.user_id,
                repo=self.db_repo,
                message=message,
                nodes=mapping,
                parent_commit=c.commit,
            )

            h.flash(_('Successfully committed changes to file `{}`').format(
                h.escape(f_path)), category='success')
            default_redirect_url = h.route_path(
                'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)

        except Exception:
            log.exception('Error occurred during commit')
            h.flash(_('Error occurred during commit'), category='error')

        raise HTTPFound(default_redirect_url)

1343 @LoginRequired()
1343 @LoginRequired()
1344 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1344 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1345 def repo_files_add_file(self):
1345 def repo_files_add_file(self):
1346 _ = self.request.translate
1346 _ = self.request.translate
1347 c = self.load_default_context()
1347 c = self.load_default_context()
1348 commit_id, f_path = self._get_commit_and_path()
1348 commit_id, f_path = self._get_commit_and_path()
1349
1349
1350 self._ensure_not_locked()
1350 self._ensure_not_locked()
1351
1351
1352 # Check if we need to use this page to upload binary
1352 # Check if we need to use this page to upload binary
1353 upload_binary = str2bool(self.request.params.get('upload_binary', False))
1353 upload_binary = str2bool(self.request.params.get('upload_binary', False))
1354
1354
1355 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1355 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1356 if c.commit is None:
1356 if c.commit is None:
1357 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1357 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1358
1358
1359 if self.rhodecode_vcs_repo.is_empty():
1359 if self.rhodecode_vcs_repo.is_empty():
1360 # for empty repository we cannot check for current branch, we rely on
1360 # for empty repository we cannot check for current branch, we rely on
1361 # c.commit.branch instead
1361 # c.commit.branch instead
1362 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1362 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1363 else:
1363 else:
1364 _branch_name, _sha_commit_id, is_head = \
1364 _branch_name, _sha_commit_id, is_head = \
1365 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1365 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1366 landing_ref=self.db_repo.landing_ref_name)
1366 landing_ref=self.db_repo.landing_ref_name)
1367
1367
1368 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1368 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1369 self.check_branch_permission(_branch_name, commit_id=commit_id)
1369 self.check_branch_permission(_branch_name, commit_id=commit_id)
1370
1370
1371 c.default_message = (_('Added file via RhodeCode Enterprise')) \
1371 c.default_message = (_('Added file via RhodeCode Enterprise')) \
1372 if not upload_binary else (_('Edited file {} via RhodeCode Enterprise').format(f_path))
1372 if not upload_binary else (_('Edited file {} via RhodeCode Enterprise').format(f_path))
1373 c.f_path = f_path.lstrip('/') # ensure not relative path
1373 c.f_path = f_path.lstrip('/') # ensure not relative path
1374 c.replace_binary = upload_binary
1374 c.replace_binary = upload_binary
1375
1375
1376 return self._get_template_context(c)
1376 return self._get_template_context(c)
1377
1377
1378 @LoginRequired()
1378 @LoginRequired()
1379 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1379 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1380 @CSRFRequired()
1380 @CSRFRequired()
1381 def repo_files_create_file(self):
1381 def repo_files_create_file(self):
1382 _ = self.request.translate
1382 _ = self.request.translate
1383 c = self.load_default_context()
1383 c = self.load_default_context()
1384 commit_id, f_path = self._get_commit_and_path()
1384 commit_id, f_path = self._get_commit_and_path()
1385
1385
1386 self._ensure_not_locked()
1386 self._ensure_not_locked()
1387
1387
1388 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1388 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1389 if c.commit is None:
1389 if c.commit is None:
1390 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1390 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1391
1391
1392 # calculate redirect URL
1392 # calculate redirect URL
1393 if self.rhodecode_vcs_repo.is_empty():
1393 if self.rhodecode_vcs_repo.is_empty():
1394 default_redirect_url = h.route_path(
1394 default_redirect_url = h.route_path(
1395 'repo_summary', repo_name=self.db_repo_name)
1395 'repo_summary', repo_name=self.db_repo_name)
1396 else:
1396 else:
1397 default_redirect_url = h.route_path(
1397 default_redirect_url = h.route_path(
1398 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1398 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1399
1399
1400 if self.rhodecode_vcs_repo.is_empty():
1400 if self.rhodecode_vcs_repo.is_empty():
1401 # for empty repository we cannot check for current branch, we rely on
1401 # for empty repository we cannot check for current branch, we rely on
1402 # c.commit.branch instead
1402 # c.commit.branch instead
1403 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1403 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1404 else:
1404 else:
1405 _branch_name, _sha_commit_id, is_head = \
1405 _branch_name, _sha_commit_id, is_head = \
1406 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1406 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1407 landing_ref=self.db_repo.landing_ref_name)
1407 landing_ref=self.db_repo.landing_ref_name)
1408
1408
1409 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1409 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1410 self.check_branch_permission(_branch_name, commit_id=commit_id)
1410 self.check_branch_permission(_branch_name, commit_id=commit_id)
1411
1411
1412 c.default_message = (_('Added file via RhodeCode Enterprise'))
1412 c.default_message = (_('Added file via RhodeCode Enterprise'))
1413 c.f_path = f_path
1413 c.f_path = f_path
1414
1414
1415 r_post = self.request.POST
1415 r_post = self.request.POST
1416 message = r_post.get('message') or c.default_message
1416 message = r_post.get('message') or c.default_message
1417 filename = r_post.get('filename')
1417 filename = r_post.get('filename')
1418 unix_mode = 0
1418 unix_mode = 0
1419
1419
1420 if not filename:
1420 if not filename:
1421 # If there's no commit, redirect to repo summary
1421 # If there's no commit, redirect to repo summary
1422 if type(c.commit) is EmptyCommit:
1422 if type(c.commit) is EmptyCommit:
1423 redirect_url = h.route_path(
1423 redirect_url = h.route_path(
1424 'repo_summary', repo_name=self.db_repo_name)
1424 'repo_summary', repo_name=self.db_repo_name)
1425 else:
1425 else:
1426 redirect_url = default_redirect_url
1426 redirect_url = default_redirect_url
1427 h.flash(_('No filename specified'), category='warning')
1427 h.flash(_('No filename specified'), category='warning')
1428 raise HTTPFound(redirect_url)
1428 raise HTTPFound(redirect_url)
1429
1429
1430 root_path = f_path
1430 root_path = f_path
1431 pure_path = self.create_pure_path(root_path, filename)
1431 pure_path = self.create_pure_path(root_path, filename)
1432 node_path = pure_path.as_posix().lstrip('/')
1432 node_path = pure_path.as_posix().lstrip('/')
1433
1433
1434 author = self._rhodecode_db_user.full_contact
1434 author = self._rhodecode_db_user.full_contact
1435 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1435 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1436 nodes = {
1436 nodes = {
1437 safe_bytes(node_path): {
1437 safe_bytes(node_path): {
1438 'content': safe_bytes(content)
1438 'content': safe_bytes(content)
1439 }
1439 }
1440 }
1440 }
1441
1441
1442 try:
1442 try:
1443
1443
1444 commit = ScmModel().create_nodes(
1444 commit = ScmModel().create_nodes(
1445 user=self._rhodecode_db_user.user_id,
1445 user=self._rhodecode_db_user.user_id,
1446 repo=self.db_repo,
1446 repo=self.db_repo,
1447 message=message,
1447 message=message,
1448 nodes=nodes,
1448 nodes=nodes,
1449 parent_commit=c.commit,
1449 parent_commit=c.commit,
1450 author=author,
1450 author=author,
1451 )
1451 )
1452
1452
1453 h.flash(_('Successfully committed new file `{}`').format(
1453 h.flash(_('Successfully committed new file `{}`').format(
1454 h.escape(node_path)), category='success')
1454 h.escape(node_path)), category='success')
1455
1455
1456 default_redirect_url = h.route_path(
1456 default_redirect_url = h.route_path(
1457 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1457 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1458
1458
1459 except NonRelativePathError:
1459 except NonRelativePathError:
1460 log.exception('Non Relative path found')
1460 log.exception('Non Relative path found')
1461 h.flash(_('The location specified must be a relative path and must not '
1461 h.flash(_('The location specified must be a relative path and must not '
1462 'contain .. in the path'), category='warning')
1462 'contain .. in the path'), category='warning')
1463 raise HTTPFound(default_redirect_url)
1463 raise HTTPFound(default_redirect_url)
1464 except (NodeError, NodeAlreadyExistsError) as e:
1464 except (NodeError, NodeAlreadyExistsError) as e:
1465 h.flash(h.escape(safe_str(e)), category='error')
1465 h.flash(h.escape(safe_str(e)), category='error')
1466 except Exception:
1466 except Exception:
1467 log.exception('Error occurred during commit')
1467 log.exception('Error occurred during commit')
1468 h.flash(_('Error occurred during commit'), category='error')
1468 h.flash(_('Error occurred during commit'), category='error')
1469
1469
1470 raise HTTPFound(default_redirect_url)
1470 raise HTTPFound(default_redirect_url)
1471
1471
1472 @LoginRequired()
1472 @LoginRequired()
1473 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1473 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1474 @CSRFRequired()
1474 @CSRFRequired()
1475 def repo_files_upload_file(self):
1475 def repo_files_upload_file(self):
1476 _ = self.request.translate
1476 _ = self.request.translate
1477 c = self.load_default_context()
1477 c = self.load_default_context()
1478 commit_id, f_path = self._get_commit_and_path()
1478 commit_id, f_path = self._get_commit_and_path()
1479
1479
1480 self._ensure_not_locked()
1480 self._ensure_not_locked()
1481
1481
1482 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1482 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1483 if c.commit is None:
1483 if c.commit is None:
1484 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1484 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1485
1485
1486 # calculate redirect URL
1486 # calculate redirect URL
1487 if self.rhodecode_vcs_repo.is_empty():
1487 if self.rhodecode_vcs_repo.is_empty():
1488 default_redirect_url = h.route_path(
1488 default_redirect_url = h.route_path(
1489 'repo_summary', repo_name=self.db_repo_name)
1489 'repo_summary', repo_name=self.db_repo_name)
1490 else:
1490 else:
1491 default_redirect_url = h.route_path(
1491 default_redirect_url = h.route_path(
1492 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1492 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1493
1493
1494 if self.rhodecode_vcs_repo.is_empty():
1494 if self.rhodecode_vcs_repo.is_empty():
1495 # for empty repository we cannot check for current branch, we rely on
1495 # for empty repository we cannot check for current branch, we rely on
1496 # c.commit.branch instead
1496 # c.commit.branch instead
1497 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1497 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1498 else:
1498 else:
1499 _branch_name, _sha_commit_id, is_head = \
1499 _branch_name, _sha_commit_id, is_head = \
1500 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1500 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1501 landing_ref=self.db_repo.landing_ref_name)
1501 landing_ref=self.db_repo.landing_ref_name)
1502
1502
1503 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1503 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1504 if error:
1504 if error:
1505 return {
1505 return {
1506 'error': error,
1506 'error': error,
1507 'redirect_url': default_redirect_url
1507 'redirect_url': default_redirect_url
1508 }
1508 }
1509 error = self.check_branch_permission(_branch_name, json_mode=True)
1509 error = self.check_branch_permission(_branch_name, json_mode=True)
1510 if error:
1510 if error:
1511 return {
1511 return {
1512 'error': error,
1512 'error': error,
1513 'redirect_url': default_redirect_url
1513 'redirect_url': default_redirect_url
1514 }
1514 }
1515
1515
1516 c.default_message = (_('Added file via RhodeCode Enterprise'))
1516 c.default_message = (_('Added file via RhodeCode Enterprise'))
1517 c.f_path = f_path
1517 c.f_path = f_path
1518
1518
1519 r_post = self.request.POST
1519 r_post = self.request.POST
1520
1520
1521 message = c.default_message
1521 message = c.default_message
1522 user_message = r_post.getall('message')
1522 user_message = r_post.getall('message')
1523 if isinstance(user_message, list) and user_message:
1523 if isinstance(user_message, list) and user_message:
1524 # we take the first from duplicated results if it's not empty
1524 # we take the first from duplicated results if it's not empty
1525 message = user_message[0] if user_message[0] else message
1525 message = user_message[0] if user_message[0] else message
1526
1526
1527 nodes = {}
1527 nodes = {}
1528
1528
1529 for file_obj in r_post.getall('files_upload') or []:
1529 for file_obj in r_post.getall('files_upload') or []:
1530 content = file_obj.file
1530 content = file_obj.file
1531 filename = file_obj.filename
1531 filename = file_obj.filename
1532
1532
1533 root_path = f_path
1533 root_path = f_path
1534 pure_path = self.create_pure_path(root_path, filename)
1534 pure_path = self.create_pure_path(root_path, filename)
1535 node_path = pure_path.as_posix().lstrip('/')
1535 node_path = pure_path.as_posix().lstrip('/')
1536
1536
1537 nodes[safe_bytes(node_path)] = {
1537 nodes[safe_bytes(node_path)] = {
1538 'content': content
1538 'content': content
1539 }
1539 }
1540
1540
1541 if not nodes:
1541 if not nodes:
1542 error = 'missing files'
1542 error = 'missing files'
1543 return {
1543 return {
1544 'error': error,
1544 'error': error,
1545 'redirect_url': default_redirect_url
1545 'redirect_url': default_redirect_url
1546 }
1546 }
1547
1547
1548 author = self._rhodecode_db_user.full_contact
1548 author = self._rhodecode_db_user.full_contact
1549
1549
1550 try:
1550 try:
1551 commit = ScmModel().create_nodes(
1551 commit = ScmModel().create_nodes(
1552 user=self._rhodecode_db_user.user_id,
1552 user=self._rhodecode_db_user.user_id,
1553 repo=self.db_repo,
1553 repo=self.db_repo,
1554 message=message,
1554 message=message,
1555 nodes=nodes,
1555 nodes=nodes,
1556 parent_commit=c.commit,
1556 parent_commit=c.commit,
1557 author=author,
1557 author=author,
1558 )
1558 )
1559 if len(nodes) == 1:
1559 if len(nodes) == 1:
1560 flash_message = _('Successfully committed 1 new file')
1560 flash_message = _('Successfully committed 1 new file')
1561 else:
1561 else:
1562 flash_message = _('Successfully committed {} new files').format(len(nodes))
1562 flash_message = _('Successfully committed {} new files').format(len(nodes))
1563
1563
1564 h.flash(flash_message, category='success')
1564 h.flash(flash_message, category='success')
1565
1565
1566 default_redirect_url = h.route_path(
1566 default_redirect_url = h.route_path(
1567 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1567 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1568
1568
1569 except NonRelativePathError:
1569 except NonRelativePathError:
1570 log.exception('Non Relative path found')
1570 log.exception('Non Relative path found')
1571 error = _('The location specified must be a relative path and must not '
1571 error = _('The location specified must be a relative path and must not '
1572 'contain .. in the path')
1572 'contain .. in the path')
1573 h.flash(error, category='warning')
1573 h.flash(error, category='warning')
1574
1574
1575 return {
1575 return {
1576 'error': error,
1576 'error': error,
1577 'redirect_url': default_redirect_url
1577 'redirect_url': default_redirect_url
1578 }
1578 }
1579 except (NodeError, NodeAlreadyExistsError) as e:
1579 except (NodeError, NodeAlreadyExistsError) as e:
1580 error = h.escape(e)
1580 error = h.escape(e)
1581 h.flash(error, category='error')
1581 h.flash(error, category='error')
1582
1582
1583 return {
1583 return {
1584 'error': error,
1584 'error': error,
1585 'redirect_url': default_redirect_url
1585 'redirect_url': default_redirect_url
1586 }
1586 }
1587 except Exception:
1587 except Exception:
1588 log.exception('Error occurred during commit')
1588 log.exception('Error occurred during commit')
1589 error = _('Error occurred during commit')
1589 error = _('Error occurred during commit')
1590 h.flash(error, category='error')
1590 h.flash(error, category='error')
1591 return {
1591 return {
1592 'error': error,
1592 'error': error,
1593 'redirect_url': default_redirect_url
1593 'redirect_url': default_redirect_url
1594 }
1594 }
1595
1595
1596 return {
1596 return {
1597 'error': None,
1597 'error': None,
1598 'redirect_url': default_redirect_url
1598 'redirect_url': default_redirect_url
1599 }
1599 }
1600
1600
1601 @LoginRequired()
1601 @LoginRequired()
1602 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1602 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1603 @CSRFRequired()
1603 @CSRFRequired()
1604 def repo_files_replace_file(self):
1604 def repo_files_replace_file(self):
1605 _ = self.request.translate
1605 _ = self.request.translate
1606 c = self.load_default_context()
1606 c = self.load_default_context()
1607 commit_id, f_path = self._get_commit_and_path()
1607 commit_id, f_path = self._get_commit_and_path()
1608
1608
1609 self._ensure_not_locked()
1609 self._ensure_not_locked()
1610
1610
1611 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1611 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1612 if c.commit is None:
1612 if c.commit is None:
1613 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1613 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1614
1614
1615 if self.rhodecode_vcs_repo.is_empty():
1615 if self.rhodecode_vcs_repo.is_empty():
1616 default_redirect_url = h.route_path(
1616 default_redirect_url = h.route_path(
1617 'repo_summary', repo_name=self.db_repo_name)
1617 'repo_summary', repo_name=self.db_repo_name)
1618 else:
1618 else:
1619 default_redirect_url = h.route_path(
1619 default_redirect_url = h.route_path(
1620 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1620 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1621
1621
1622 if self.rhodecode_vcs_repo.is_empty():
1622 if self.rhodecode_vcs_repo.is_empty():
1623 # for empty repository we cannot check for current branch, we rely on
1623 # for empty repository we cannot check for current branch, we rely on
1624 # c.commit.branch instead
1624 # c.commit.branch instead
1625 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1625 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1626 else:
1626 else:
1627 _branch_name, _sha_commit_id, is_head = \
1627 _branch_name, _sha_commit_id, is_head = \
1628 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1628 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1629 landing_ref=self.db_repo.landing_ref_name)
1629 landing_ref=self.db_repo.landing_ref_name)
1630
1630
1631 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1631 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1632 if error:
1632 if error:
1633 return {
1633 return {
1634 'error': error,
1634 'error': error,
1635 'redirect_url': default_redirect_url
1635 'redirect_url': default_redirect_url
1636 }
1636 }
1637 error = self.check_branch_permission(_branch_name, json_mode=True)
1637 error = self.check_branch_permission(_branch_name, json_mode=True)
1638 if error:
1638 if error:
1639 return {
1639 return {
1640 'error': error,
1640 'error': error,
1641 'redirect_url': default_redirect_url
1641 'redirect_url': default_redirect_url
1642 }
1642 }
1643
1643
1644 c.default_message = (_('Edited file {} via RhodeCode Enterprise').format(f_path))
1644 c.default_message = (_('Edited file {} via RhodeCode Enterprise').format(f_path))
1645 c.f_path = f_path
1645 c.f_path = f_path
1646
1646
1647 r_post = self.request.POST
1647 r_post = self.request.POST
1648
1648
1649 message = c.default_message
1649 message = c.default_message
1650 user_message = r_post.getall('message')
1650 user_message = r_post.getall('message')
1651 if isinstance(user_message, list) and user_message:
1651 if isinstance(user_message, list) and user_message:
1652 # we take the first from duplicated results if it's not empty
1652 # we take the first from duplicated results if it's not empty
1653 message = user_message[0] if user_message[0] else message
1653 message = user_message[0] if user_message[0] else message
1654
1654
1655 data_for_replacement = r_post.getall('files_upload') or []
1655 data_for_replacement = r_post.getall('files_upload') or []
1656 if (objects_count := len(data_for_replacement)) > 1:
1656 if (objects_count := len(data_for_replacement)) > 1:
1657 return {
1657 return {
1658 'error': 'too many files for replacement',
1658 'error': 'too many files for replacement',
1659 'redirect_url': default_redirect_url
1659 'redirect_url': default_redirect_url
1660 }
1660 }
1661 elif not objects_count:
1661 elif not objects_count:
1662 return {
1662 return {
1663 'error': 'missing files',
1663 'error': 'missing files',
1664 'redirect_url': default_redirect_url
1664 'redirect_url': default_redirect_url
1665 }
1665 }
1666
1666
1667 content = data_for_replacement[0].file
1667 content = data_for_replacement[0].file
1668 retrieved_filename = data_for_replacement[0].filename
1668 retrieved_filename = data_for_replacement[0].filename
1669
1669
1670 if retrieved_filename.split('.')[-1] != f_path.split('.')[-1]:
1670 if retrieved_filename.split('.')[-1] != f_path.split('.')[-1]:
1671 return {
1671 return {
1672 'error': 'the file extension of the uploaded file doesn\'t match the original file\'s extension',
1672 'error': 'the file extension of the uploaded file doesn\'t match the original file\'s extension',
1673 'redirect_url': default_redirect_url
1673 'redirect_url': default_redirect_url
1674 }
1674 }
1675
1675
1676 author = self._rhodecode_db_user.full_contact
1676 author = self._rhodecode_db_user.full_contact
1677
1677
1678 try:
1678 try:
1679 commit = ScmModel().update_binary_node(
1679 commit = ScmModel().update_binary_node(
1680 user=self._rhodecode_db_user.user_id,
1680 user=self._rhodecode_db_user.user_id,
1681 repo=self.db_repo,
1681 repo=self.db_repo,
1682 message=message,
1682 message=message,
1683 node={
1683 node={
1684 'content': content,
1684 'content': content,
1685 'file_path': f_path.encode(),
1685 'file_path': f_path.encode(),
1686 },
1686 },
1687 parent_commit=c.commit,
1687 parent_commit=c.commit,
1688 author=author,
1688 author=author,
1689 )
1689 )
1690
1690
1691 h.flash(_('Successfully committed 1 new file'), category='success')
1691 h.flash(_('Successfully committed 1 new file'), category='success')
1692
1692
1693 default_redirect_url = h.route_path(
1693 default_redirect_url = h.route_path(
1694 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1694 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1695
1695
1696 except (NodeError, NodeAlreadyExistsError) as e:
1696 except (NodeError, NodeAlreadyExistsError) as e:
1697 error = h.escape(e)
1697 error = h.escape(e)
1698 h.flash(error, category='error')
1698 h.flash(error, category='error')
1699
1699
1700 return {
1700 return {
1701 'error': error,
1701 'error': error,
1702 'redirect_url': default_redirect_url
1702 'redirect_url': default_redirect_url
1703 }
1703 }
1704 except Exception:
1704 except Exception:
1705 log.exception('Error occurred during commit')
1705 log.exception('Error occurred during commit')
1706 error = _('Error occurred during commit')
1706 error = _('Error occurred during commit')
1707 h.flash(error, category='error')
1707 h.flash(error, category='error')
1708 return {
1708 return {
1709 'error': error,
1709 'error': error,
1710 'redirect_url': default_redirect_url
1710 'redirect_url': default_redirect_url
1711 }
1711 }
1712
1712
1713 return {
1713 return {
1714 'error': None,
1714 'error': None,
1715 'redirect_url': default_redirect_url
1715 'redirect_url': default_redirect_url
1716 }
1716 }
@@ -1,302 +1,309 b''
1 # Copyright (C) 2011-2023 RhodeCode GmbH
1 # Copyright (C) 2011-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import logging
19 import logging
20
20
21
21
22 from pyramid.httpexceptions import HTTPFound
22 from pyramid.httpexceptions import HTTPFound
23 from packaging.version import Version
23 from packaging.version import Version
24
24
25 from rhodecode import events
25 from rhodecode import events
26 from rhodecode.apps._base import RepoAppView
26 from rhodecode.apps._base import RepoAppView
27 from rhodecode.lib import helpers as h
27 from rhodecode.lib import helpers as h
28 from rhodecode.lib import audit_logger
28 from rhodecode.lib import audit_logger
29 from rhodecode.lib.auth import (
29 from rhodecode.lib.auth import (
30 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired,
30 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired,
31 HasRepoPermissionAny)
31 HasRepoPermissionAny)
32 from rhodecode.lib.exceptions import AttachedForksError, AttachedPullRequestsError
32 from rhodecode.lib.exceptions import AttachedForksError, AttachedPullRequestsError, AttachedArtifactsError
33 from rhodecode.lib.utils2 import safe_int
33 from rhodecode.lib.utils2 import safe_int
34 from rhodecode.lib.vcs import RepositoryError
34 from rhodecode.lib.vcs import RepositoryError
35 from rhodecode.model.db import Session, UserFollowing, User, Repository
35 from rhodecode.model.db import Session, UserFollowing, User, Repository
36 from rhodecode.model.permission import PermissionModel
36 from rhodecode.model.permission import PermissionModel
37 from rhodecode.model.repo import RepoModel
37 from rhodecode.model.repo import RepoModel
38 from rhodecode.model.scm import ScmModel
38 from rhodecode.model.scm import ScmModel
39
39
40 log = logging.getLogger(__name__)
40 log = logging.getLogger(__name__)
41
41
42
42
43 class RepoSettingsAdvancedView(RepoAppView):
43 class RepoSettingsAdvancedView(RepoAppView):
44
44
45 def load_default_context(self):
45 def load_default_context(self):
46 c = self._get_local_tmpl_context()
46 c = self._get_local_tmpl_context()
47 return c
47 return c
48
48
49 def _get_users_with_permissions(self):
49 def _get_users_with_permissions(self):
50 user_permissions = {}
50 user_permissions = {}
51 for perm in self.db_repo.permissions():
51 for perm in self.db_repo.permissions():
52 user_permissions[perm.user_id] = perm
52 user_permissions[perm.user_id] = perm
53
53
54 return user_permissions
54 return user_permissions
55
55
56 @LoginRequired()
56 @LoginRequired()
57 @HasRepoPermissionAnyDecorator('repository.admin')
57 @HasRepoPermissionAnyDecorator('repository.admin')
58 def edit_advanced(self):
58 def edit_advanced(self):
59 _ = self.request.translate
59 _ = self.request.translate
60 c = self.load_default_context()
60 c = self.load_default_context()
61 c.active = 'advanced'
61 c.active = 'advanced'
62
62
63 c.default_user_id = User.get_default_user_id()
63 c.default_user_id = User.get_default_user_id()
64 c.in_public_journal = UserFollowing.query() \
64 c.in_public_journal = UserFollowing.query() \
65 .filter(UserFollowing.user_id == c.default_user_id) \
65 .filter(UserFollowing.user_id == c.default_user_id) \
66 .filter(UserFollowing.follows_repository == self.db_repo).scalar()
66 .filter(UserFollowing.follows_repository == self.db_repo).scalar()
67
67
68 c.ver_info_dict = self.rhodecode_vcs_repo.get_hooks_info()
68 c.ver_info_dict = self.rhodecode_vcs_repo.get_hooks_info()
69 c.hooks_outdated = False
69 c.hooks_outdated = False
70
70
71 try:
71 try:
72 if Version(c.ver_info_dict['pre_version']) < Version(c.rhodecode_version):
72 if Version(c.ver_info_dict['pre_version']) < Version(c.rhodecode_version):
73 c.hooks_outdated = True
73 c.hooks_outdated = True
74 except Exception:
74 except Exception:
75 pass
75 pass
76
76
77 # update commit cache if GET flag is present
77 # update commit cache if GET flag is present
78 if self.request.GET.get('update_commit_cache'):
78 if self.request.GET.get('update_commit_cache'):
79 self.db_repo.update_commit_cache()
79 self.db_repo.update_commit_cache()
80 h.flash(_('updated commit cache'), category='success')
80 h.flash(_('updated commit cache'), category='success')
81
81
82 return self._get_template_context(c)
82 return self._get_template_context(c)
83
83
84 @LoginRequired()
84 @LoginRequired()
85 @HasRepoPermissionAnyDecorator('repository.admin')
85 @HasRepoPermissionAnyDecorator('repository.admin')
86 @CSRFRequired()
86 @CSRFRequired()
87 def edit_advanced_archive(self):
87 def edit_advanced_archive(self):
88 """
88 """
89 Archives the repository. It will become read-only, and not visible in search
89 Archives the repository. It will become read-only, and not visible in search
90 or other queries. But still visible for super-admins.
90 or other queries. But still visible for super-admins.
91 """
91 """
92
92
93 _ = self.request.translate
93 _ = self.request.translate
94
94
95 try:
95 try:
96 old_data = self.db_repo.get_api_data()
96 old_data = self.db_repo.get_api_data()
97 RepoModel().archive(self.db_repo)
97 RepoModel().archive(self.db_repo)
98
98
99 repo = audit_logger.RepoWrap(repo_id=None, repo_name=self.db_repo.repo_name)
99 repo = audit_logger.RepoWrap(repo_id=None, repo_name=self.db_repo.repo_name)
100 audit_logger.store_web(
100 audit_logger.store_web(
101 'repo.archive', action_data={'old_data': old_data},
101 'repo.archive', action_data={'old_data': old_data},
102 user=self._rhodecode_user, repo=repo)
102 user=self._rhodecode_user, repo=repo)
103
103
104 ScmModel().mark_for_invalidation(self.db_repo_name, delete=True)
104 ScmModel().mark_for_invalidation(self.db_repo_name, delete=True)
105 h.flash(
105 h.flash(
106 _('Archived repository `%s`') % self.db_repo_name,
106 _('Archived repository `%s`') % self.db_repo_name,
107 category='success')
107 category='success')
108 Session().commit()
108 Session().commit()
109 except Exception:
109 except Exception:
110 log.exception("Exception during archiving of repository")
110 log.exception("Exception during archiving of repository")
111 h.flash(_('An error occurred during archiving of `%s`')
111 h.flash(_('An error occurred during archiving of `%s`')
112 % self.db_repo_name, category='error')
112 % self.db_repo_name, category='error')
113 # redirect to advanced for more deletion options
113 # redirect to advanced for more deletion options
114 raise HTTPFound(
114 raise HTTPFound(
115 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name,
115 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name,
116 _anchor='advanced-archive'))
116 _anchor='advanced-archive'))
117
117
118 # flush permissions for all users defined in permissions
118 # flush permissions for all users defined in permissions
119 affected_user_ids = self._get_users_with_permissions().keys()
119 affected_user_ids = self._get_users_with_permissions().keys()
120 PermissionModel().trigger_permission_flush(affected_user_ids)
120 PermissionModel().trigger_permission_flush(affected_user_ids)
121
121
122 raise HTTPFound(h.route_path('home'))
122 raise HTTPFound(h.route_path('home'))
123
123
124 @LoginRequired()
124 @LoginRequired()
125 @HasRepoPermissionAnyDecorator('repository.admin')
125 @HasRepoPermissionAnyDecorator('repository.admin')
126 @CSRFRequired()
126 @CSRFRequired()
127 def edit_advanced_delete(self):
127 def edit_advanced_delete(self):
128 """
128 """
129 Deletes the repository, or shows warnings if deletion is not possible
129 Deletes the repository, or shows warnings if deletion is not possible
130 because of attached forks or other errors.
130 because of attached forks or other errors.
131 """
131 """
132 _ = self.request.translate
132 _ = self.request.translate
133 handle_forks = self.request.POST.get('forks', None)
133 handle_forks = self.request.POST.get('forks', None)
134 if handle_forks == 'detach_forks':
134 if handle_forks == 'detach_forks':
135 handle_forks = 'detach'
135 handle_forks = 'detach'
136 elif handle_forks == 'delete_forks':
136 elif handle_forks == 'delete_forks':
137 handle_forks = 'delete'
137 handle_forks = 'delete'
138
138
139 repo_advanced_url = h.route_path(
140 'edit_repo_advanced', repo_name=self.db_repo_name,
141 _anchor='advanced-delete')
139 try:
142 try:
140 old_data = self.db_repo.get_api_data()
143 old_data = self.db_repo.get_api_data()
141 RepoModel().delete(self.db_repo, forks=handle_forks)
144 RepoModel().delete(self.db_repo, forks=handle_forks)
142
145
143 _forks = self.db_repo.forks.count()
146 _forks = self.db_repo.forks.count()
144 if _forks and handle_forks:
147 if _forks and handle_forks:
145 if handle_forks == 'detach':
148 if handle_forks == 'detach':
146 h.flash(_('Detached %s forks') % _forks, category='success')
149 h.flash(_('Detached %s forks') % _forks, category='success')
147 elif handle_forks == 'delete':
150 elif handle_forks == 'delete':
148 h.flash(_('Deleted %s forks') % _forks, category='success')
151 h.flash(_('Deleted %s forks') % _forks, category='success')
149
152
150 repo = audit_logger.RepoWrap(repo_id=None, repo_name=self.db_repo.repo_name)
153 repo = audit_logger.RepoWrap(repo_id=None, repo_name=self.db_repo.repo_name)
151 audit_logger.store_web(
154 audit_logger.store_web(
152 'repo.delete', action_data={'old_data': old_data},
155 'repo.delete', action_data={'old_data': old_data},
153 user=self._rhodecode_user, repo=repo)
156 user=self._rhodecode_user, repo=repo)
154
157
155 ScmModel().mark_for_invalidation(self.db_repo_name, delete=True)
158 ScmModel().mark_for_invalidation(self.db_repo_name, delete=True)
156 h.flash(
159 h.flash(
157 _('Deleted repository `%s`') % self.db_repo_name,
160 _('Deleted repository `%s`') % self.db_repo_name,
158 category='success')
161 category='success')
159 Session().commit()
162 Session().commit()
160 except AttachedForksError:
163 except AttachedForksError:
161 repo_advanced_url = h.route_path(
162 'edit_repo_advanced', repo_name=self.db_repo_name,
163 _anchor='advanced-delete')
164 delete_anchor = h.link_to(_('detach or delete'), repo_advanced_url)
164 delete_anchor = h.link_to(_('detach or delete'), repo_advanced_url)
165 h.flash(_('Cannot delete `{repo}`, it still contains attached forks. '
165 h.flash(_('Cannot delete `{repo}`, it still contains attached forks. '
166 'Try using the {delete_or_detach} option.')
166 'Try using the {delete_or_detach} option.')
167 .format(repo=self.db_repo_name, delete_or_detach=delete_anchor),
167 .format(repo=self.db_repo_name, delete_or_detach=delete_anchor),
168 category='warning')
168 category='warning')
169
169
170 # redirect to advanced for forks handle action ?
170 # redirect to advanced for forks handle action ?
171 raise HTTPFound(repo_advanced_url)
171 raise HTTPFound(repo_advanced_url)
172
172
173 except AttachedPullRequestsError:
173 except AttachedPullRequestsError:
174 repo_advanced_url = h.route_path(
175 'edit_repo_advanced', repo_name=self.db_repo_name,
176 _anchor='advanced-delete')
177 attached_prs = len(self.db_repo.pull_requests_source +
174 attached_prs = len(self.db_repo.pull_requests_source +
178 self.db_repo.pull_requests_target)
175 self.db_repo.pull_requests_target)
179 h.flash(
176 h.flash(
180 _('Cannot delete `{repo}`, it still contains {num} attached pull requests. '
177 _('Cannot delete `{repo}`, it still contains {num} attached pull requests. '
181 'Consider archiving the repository instead.').format(
178 'Consider archiving the repository instead.').format(
182 repo=self.db_repo_name, num=attached_prs), category='warning')
179 repo=self.db_repo_name, num=attached_prs), category='warning')
183
180
184 # redirect to advanced for forks handle action ?
181 # redirect to advanced for forks handle action ?
185 raise HTTPFound(repo_advanced_url)
182 raise HTTPFound(repo_advanced_url)
186
183
184 except AttachedArtifactsError:
185
186 attached_artifacts = len(self.db_repo.artifacts)
187 h.flash(
188 _('Cannot delete `{repo}`, it still contains {num} attached artifacts. '
189 'Consider archiving the repository instead.').format(
190 repo=self.db_repo_name, num=attached_artifacts), category='warning')
191
192 # redirect to advanced for forks handle action ?
193 raise HTTPFound(repo_advanced_url)
187 except Exception:
194 except Exception:
188 log.exception("Exception during deletion of repository")
195 log.exception("Exception during deletion of repository")
189 h.flash(_('An error occurred during deletion of `%s`')
196 h.flash(_('An error occurred during deletion of `%s`')
190 % self.db_repo_name, category='error')
197 % self.db_repo_name, category='error')
191 # redirect to advanced for more deletion options
198 # redirect to advanced for more deletion options
192 raise HTTPFound(
199 raise HTTPFound(
193 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name,
200 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name,
194 _anchor='advanced-delete'))
201 _anchor='advanced-delete'))
195
202
196 raise HTTPFound(h.route_path('home'))
203 raise HTTPFound(h.route_path('home'))
197
204
198 @LoginRequired()
205 @LoginRequired()
199 @HasRepoPermissionAnyDecorator('repository.admin')
206 @HasRepoPermissionAnyDecorator('repository.admin')
200 @CSRFRequired()
207 @CSRFRequired()
201 def edit_advanced_journal(self):
208 def edit_advanced_journal(self):
202 """
209 """
203 Sets this repository to be visible in the public journal,
210 Sets this repository to be visible in the public journal,
204 in other words making the default user follow this repo
211 in other words making the default user follow this repo
205 """
212 """
206 _ = self.request.translate
213 _ = self.request.translate
207
214
208 try:
215 try:
209 user_id = User.get_default_user_id()
216 user_id = User.get_default_user_id()
210 ScmModel().toggle_following_repo(self.db_repo.repo_id, user_id)
217 ScmModel().toggle_following_repo(self.db_repo.repo_id, user_id)
211 h.flash(_('Updated repository visibility in public journal'),
218 h.flash(_('Updated repository visibility in public journal'),
212 category='success')
219 category='success')
213 Session().commit()
220 Session().commit()
214 except Exception:
221 except Exception:
215 h.flash(_('An error occurred while setting this '
222 h.flash(_('An error occurred while setting this '
216 'repository in the public journal'),
223 'repository in the public journal'),
217 category='error')
224 category='error')
218
225
219 raise HTTPFound(
226 raise HTTPFound(
220 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
227 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
221
228
222 @LoginRequired()
229 @LoginRequired()
223 @HasRepoPermissionAnyDecorator('repository.admin')
230 @HasRepoPermissionAnyDecorator('repository.admin')
224 @CSRFRequired()
231 @CSRFRequired()
225 def edit_advanced_fork(self):
232 def edit_advanced_fork(self):
226 """
233 """
227 Mark given repository as a fork of another
234 Mark given repository as a fork of another
228 """
235 """
229 _ = self.request.translate
236 _ = self.request.translate
230
237
231 new_fork_id = safe_int(self.request.POST.get('id_fork_of'))
238 new_fork_id = safe_int(self.request.POST.get('id_fork_of'))
232
239
233 # valid repo, re-check permissions
240 # valid repo, re-check permissions
234 if new_fork_id:
241 if new_fork_id:
235 repo = Repository.get(new_fork_id)
242 repo = Repository.get(new_fork_id)
236 # ensure we have at least read access to the repo we mark
243 # ensure we have at least read access to the repo we mark
237 perm_check = HasRepoPermissionAny(
244 perm_check = HasRepoPermissionAny(
238 'repository.read', 'repository.write', 'repository.admin')
245 'repository.read', 'repository.write', 'repository.admin')
239
246
240 if repo and perm_check(repo_name=repo.repo_name):
247 if repo and perm_check(repo_name=repo.repo_name):
241 new_fork_id = repo.repo_id
248 new_fork_id = repo.repo_id
242 else:
249 else:
243 new_fork_id = None
250 new_fork_id = None
244
251
245 try:
252 try:
246 repo = ScmModel().mark_as_fork(
253 repo = ScmModel().mark_as_fork(
247 self.db_repo_name, new_fork_id, self._rhodecode_user.user_id)
254 self.db_repo_name, new_fork_id, self._rhodecode_user.user_id)
248 fork = repo.fork.repo_name if repo.fork else _('Nothing')
255 fork = repo.fork.repo_name if repo.fork else _('Nothing')
249 Session().commit()
256 Session().commit()
250 h.flash(
257 h.flash(
251 _('Marked repo %s as fork of %s') % (self.db_repo_name, fork),
258 _('Marked repo %s as fork of %s') % (self.db_repo_name, fork),
252 category='success')
259 category='success')
253 except RepositoryError as e:
260 except RepositoryError as e:
254 log.exception("Repository Error occurred")
261 log.exception("Repository Error occurred")
255 h.flash(str(e), category='error')
262 h.flash(str(e), category='error')
256 except Exception:
263 except Exception:
257 log.exception("Exception while editing fork")
264 log.exception("Exception while editing fork")
258 h.flash(_('An error occurred during this operation'),
265 h.flash(_('An error occurred during this operation'),
259 category='error')
266 category='error')
260
267
261 raise HTTPFound(
268 raise HTTPFound(
262 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
269 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
263
270
264 @LoginRequired()
271 @LoginRequired()
265 @HasRepoPermissionAnyDecorator('repository.admin')
272 @HasRepoPermissionAnyDecorator('repository.admin')
266 @CSRFRequired()
273 @CSRFRequired()
267 def edit_advanced_toggle_locking(self):
274 def edit_advanced_toggle_locking(self):
268 """
275 """
269 Toggle locking of repository
276 Toggle locking of repository
270 """
277 """
271 _ = self.request.translate
278 _ = self.request.translate
272 set_lock = self.request.POST.get('set_lock')
279 set_lock = self.request.POST.get('set_lock')
273 set_unlock = self.request.POST.get('set_unlock')
280 set_unlock = self.request.POST.get('set_unlock')
274
281
275 try:
282 try:
276 if set_lock:
283 if set_lock:
277 Repository.lock(self.db_repo, self._rhodecode_user.user_id,
284 Repository.lock(self.db_repo, self._rhodecode_user.user_id,
278 lock_reason=Repository.LOCK_WEB)
285 lock_reason=Repository.LOCK_WEB)
279 h.flash(_('Locked repository'), category='success')
286 h.flash(_('Locked repository'), category='success')
280 elif set_unlock:
287 elif set_unlock:
281 Repository.unlock(self.db_repo)
288 Repository.unlock(self.db_repo)
282 h.flash(_('Unlocked repository'), category='success')
289 h.flash(_('Unlocked repository'), category='success')
283 except Exception as e:
290 except Exception as e:
284 log.exception("Exception during locking/unlocking")
291 log.exception("Exception during locking/unlocking")
285 h.flash(_('An error occurred during the locking operation'), category='error')
292 h.flash(_('An error occurred during the locking operation'), category='error')
286
293
287 raise HTTPFound(
294 raise HTTPFound(
288 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
295 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
289
296
290 @LoginRequired()
297 @LoginRequired()
291 @HasRepoPermissionAnyDecorator('repository.admin')
298 @HasRepoPermissionAnyDecorator('repository.admin')
292 def edit_advanced_install_hooks(self):
299 def edit_advanced_install_hooks(self):
293 """
300 """
294 Install Hooks for repository
301 Install Hooks for repository
295 """
302 """
296 _ = self.request.translate
303 _ = self.request.translate
297 self.load_default_context()
304 self.load_default_context()
298 self.rhodecode_vcs_repo.install_hooks(force=True)
305 self.rhodecode_vcs_repo.install_hooks(force=True)
299 h.flash(_('installed updated hooks into this repository'),
306 h.flash(_('installed updated hooks into this repository'),
300 category='success')
307 category='success')
301 raise HTTPFound(
308 raise HTTPFound(
302 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
309 h.route_path('edit_repo_advanced', repo_name=self.db_repo_name))
@@ -1,60 +1,60 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
2 #
3 # This program is free software: you can redistribute it and/or modify
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
5 # (only), as published by the Free Software Foundation.
6 #
6 #
7 # This program is distributed in the hope that it will be useful,
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
10 # GNU General Public License for more details.
11 #
11 #
12 # You should have received a copy of the GNU Affero General Public License
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
14 #
15 # This program is dual-licensed. If you wish to learn more about the
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
18
19 import logging
19 import logging
20
20
21 from . import config_keys
21 from . import config_keys
22
22
23 from rhodecode.config.settings_maker import SettingsMaker
23 from rhodecode.config.settings_maker import SettingsMaker
24
24
25 log = logging.getLogger(__name__)
25 log = logging.getLogger(__name__)
26
26
27
27
28 def _sanitize_settings_and_apply_defaults(settings):
28 def _sanitize_settings_and_apply_defaults(settings):
29 """
29 """
30 Set defaults, convert to python types and validate settings.
30 Set defaults, convert to python types and validate settings.
31 """
31 """
32 settings_maker = SettingsMaker(settings)
32 settings_maker = SettingsMaker(settings)
33
33
34 settings_maker.make_setting(config_keys.generate_authorized_keyfile, False, parser='bool')
34 settings_maker.make_setting(config_keys.generate_authorized_keyfile, False, parser='bool')
35 settings_maker.make_setting(config_keys.wrapper_allow_shell, False, parser='bool')
35 settings_maker.make_setting(config_keys.wrapper_allow_shell, False, parser='bool')
36 settings_maker.make_setting(config_keys.enable_debug_logging, False, parser='bool')
36 settings_maker.make_setting(config_keys.enable_debug_logging, False, parser='bool')
37 settings_maker.make_setting(config_keys.ssh_key_generator_enabled, True, parser='bool')
37 settings_maker.make_setting(config_keys.ssh_key_generator_enabled, True, parser='bool')
38
38
39 settings_maker.make_setting(config_keys.authorized_keys_file_path, '~/.ssh/authorized_keys_rhodecode')
39 settings_maker.make_setting(config_keys.authorized_keys_file_path, '~/.ssh/authorized_keys_rhodecode')
40 settings_maker.make_setting(config_keys.wrapper_cmd, '')
40 settings_maker.make_setting(config_keys.wrapper_cmd, '/usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2')
41 settings_maker.make_setting(config_keys.authorized_keys_line_ssh_opts, '')
41 settings_maker.make_setting(config_keys.authorized_keys_line_ssh_opts, '')
42
42
43 settings_maker.make_setting(config_keys.ssh_hg_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/hg')
43 settings_maker.make_setting(config_keys.ssh_hg_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/hg')
44 settings_maker.make_setting(config_keys.ssh_git_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/git')
44 settings_maker.make_setting(config_keys.ssh_git_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/git')
45 settings_maker.make_setting(config_keys.ssh_svn_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/svnserve')
45 settings_maker.make_setting(config_keys.ssh_svn_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/svnserve')
46
46
47 settings_maker.env_expand()
47 settings_maker.env_expand()
48
48
49
49
50 def includeme(config):
50 def includeme(config):
51 settings = config.registry.settings
51 settings = config.registry.settings
52 _sanitize_settings_and_apply_defaults(settings)
52 _sanitize_settings_and_apply_defaults(settings)
53
53
54 # if generation of the authorized_keys file is enabled, subscribe to the event
54 # if generation of the authorized_keys file is enabled, subscribe to the event
55 if settings[config_keys.generate_authorized_keyfile]:
55 if settings[config_keys.generate_authorized_keyfile]:
56 # lazy import here for faster code reading... via sshwrapper-v2 mode
56 # lazy import here for faster code reading... via sshwrapper-v2 mode
57 from .subscribers import generate_ssh_authorized_keys_file_subscriber
57 from .subscribers import generate_ssh_authorized_keys_file_subscriber
58 from .events import SshKeyFileChangeEvent
58 from .events import SshKeyFileChangeEvent
59 config.add_subscriber(
59 config.add_subscriber(
60 generate_ssh_authorized_keys_file_subscriber, SshKeyFileChangeEvent)
60 generate_ssh_authorized_keys_file_subscriber, SshKeyFileChangeEvent)
@@ -1,32 +1,32 b''
1 # Copyright (C) 2016-2023 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19
20 # Definition of setting keys used to configure this module. Defined here to
21 # avoid repetition of keys throughout the module.
22 generate_authorized_keyfile = 'ssh.generate_authorized_keyfile'
23 authorized_keys_file_path = 'ssh.authorized_keys_file_path'
24 authorized_keys_line_ssh_opts = 'ssh.authorized_keys_ssh_opts'
25 ssh_key_generator_enabled = 'ssh.enable_ui_key_generator'
26 -wrapper_cmd = 'ssh.wrapper_cmd'
26 +wrapper_cmd = 'ssh.wrapper_cmd.v2'
27 wrapper_allow_shell = 'ssh.wrapper_cmd_allow_shell'
28 enable_debug_logging = 'ssh.enable_debug_logging'
29
30 ssh_hg_bin = 'ssh.executable.hg'
31 ssh_git_bin = 'ssh.executable.git'
32 ssh_svn_bin = 'ssh.executable.svn'
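
The constants in this module are just the string names of .ini settings consumed elsewhere; the only functional change in this hunk is the rename of the wrapper command key from 'ssh.wrapper_cmd' to 'ssh.wrapper_cmd.v2'. Below is a hedged sketch of how such keys are typically looked up in a Pyramid settings dictionary; the str2bool helper, the read_ssh_settings function, and the example values are illustrative assumptions, not the actual RhodeCode code.

# Illustrative only: reads the setting keys defined above from a Pyramid
# `settings` dict (as loaded from the .ini file). The str2bool helper and
# read_ssh_settings are assumptions, not RhodeCode internals.
def str2bool(value, default=False):
    if value is None:
        return default
    return str(value).strip().lower() in ('true', 'yes', 'on', '1')

def read_ssh_settings(settings):
    return {
        'generate_keyfile': str2bool(settings.get('ssh.generate_authorized_keyfile')),
        'keyfile_path': settings.get('ssh.authorized_keys_file_path'),
        'wrapper_cmd': settings.get('ssh.wrapper_cmd.v2'),
        'allow_shell': str2bool(settings.get('ssh.wrapper_cmd_allow_shell')),
        'debug': str2bool(settings.get('ssh.enable_debug_logging')),
    }

# Example (values are placeholders):
# read_ssh_settings({'ssh.generate_authorized_keyfile': 'true',
#                    'ssh.wrapper_cmd.v2': '/path/to/ssh_wrapper'})
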
NO CONTENT: the remaining modified files and three removed files in this commit are not shown because the requested content is too big and was truncated.