@@ -0,0 +1,43 b'' | |||
|
1 | |RCE| 5.0.1 |RNS| | |
|
2 | ----------------- | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2024-05-20 | |
|
8 | ||
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | ||
|
14 | ||
|
15 | General | |
|
16 | ^^^^^^^ | |
|
17 | ||
|
18 | ||
|
19 | ||
|
20 | Security | |
|
21 | ^^^^^^^^ | |
|
22 | ||
|
23 | ||
|
24 | ||
|
25 | Performance | |
|
26 | ^^^^^^^^^^^ | |
|
27 | ||
|
28 | ||
|
29 | ||
|
30 | ||
|
31 | Fixes | |
|
32 | ^^^^^ | |
|
33 | ||
|
34 | - Fixed Celery serialization issues | |
|
35 | - Fixed Celery startup signaling problems | |
|
36 | - Fixed SVN hooks binary dir paths, which in certain scenarios resulted in empty values preventing hooks from executing | |
|
37 | - Fixed annotation bug for files without new lines or mixed newlines | |
|
38 | ||
|
39 | ||
|
40 | Upgrade notes | |
|
41 | ^^^^^^^^^^^^^ | |
|
42 | ||
|
43 | - RhodeCode 5.0.1 is an unscheduled bugfix release addressing some of the issues found during the 4.X -> 5.X migration
@@ -0,0 +1,39 b'' | |||
|
1 | |RCE| 5.0.2 |RNS| | |
|
2 | ----------------- | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2024-05-29 | |
|
8 | ||
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | ||
|
14 | ||
|
15 | General | |
|
16 | ^^^^^^^ | |
|
17 | ||
|
18 | ||
|
19 | ||
|
20 | Security | |
|
21 | ^^^^^^^^ | |
|
22 | ||
|
23 | ||
|
24 | ||
|
25 | Performance | |
|
26 | ^^^^^^^^^^^ | |
|
27 | ||
|
28 | ||
|
29 | ||
|
30 | ||
|
31 | Fixes | |
|
32 | ^^^^^ | |
|
33 | ||
|
34 | - Fixed problems with saving branch permissions | |
|
35 | ||
|
36 | Upgrade notes | |
|
37 | ^^^^^^^^^^^^^ | |
|
38 | ||
|
39 | - RhodeCode 5.0.2 is an unscheduled bugfix release addressing some of the issues found during the 4.X -> 5.X migration
@@ -0,0 +1,39 b'' | |||
|
1 | |RCE| 5.0.3 |RNS| | |
|
2 | ----------------- | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2024-06-17 | |
|
8 | ||
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | ||
|
14 | ||
|
15 | General | |
|
16 | ^^^^^^^ | |
|
17 | ||
|
18 | ||
|
19 | ||
|
20 | Security | |
|
21 | ^^^^^^^^ | |
|
22 | ||
|
23 | ||
|
24 | ||
|
25 | Performance | |
|
26 | ^^^^^^^^^^^ | |
|
27 | ||
|
28 | ||
|
29 | ||
|
30 | ||
|
31 | Fixes | |
|
32 | ^^^^^ | |
|
33 | ||
|
34 | - Fixed problems with nested LDAP groups | |
|
35 | ||
|
36 | Upgrade notes | |
|
37 | ^^^^^^^^^^^^^ | |
|
38 | ||
|
39 | - RhodeCode 5.0.3 is an unscheduled bugfix release addressing some of the issues found during the 4.X -> 5.X migration
@@ -0,0 +1,59 b'' | |||
|
1 | |RCE| 5.1.0 |RNS| | |
|
2 | ----------------- | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2024-07-18 | |
|
8 | ||
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | - We've introduced 2FA for users. Alongside external-auth 2FA support, RhodeCode now allows enabling 2FA for users; | |

14 | 2FA options are available for each user individually, or can be enforced via authentication plugins such as LDAP or the internal one. | |
|
15 | - Email-based log-in. RhodeCode now allows logging in using an email address as well as a username for the main authentication type. | |
|
16 | - Ability to replace a file using the web UI. One can now replace an existing file from the web UI. | |
|
17 | - Git LFS sync automation. Remote push/pull commands can now also sync Git LFS objects. | |
|
18 | - Added ability to remove or close branches from the web UI | |
|
19 | - Added ability to delete a branch automatically after merging PR for git repositories | |
|
20 | - Added support for an S3-based archive_cache that allows storing cached archives in an S3-compatible object store. | |
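The S3-backed archive cache above is driven by `.ini` settings. A minimal sketch, using the `archive_cache.*` keys introduced in this changeset's `config_maker.py`; the `objectstore` backend-type value is assumed from the settings prefix, and the endpoint is a placeholder:

```ini
[app:main]
; switch the archive cache from the default filesystem backend to object store
archive_cache.backend.type = objectstore
; locking coordination still goes through redis
archive_cache.locking.url = redis://redis:6379/1
; S3-compatible endpoint -- placeholder value, environment-specific
archive_cache.objectstore.url = http://s3-endpoint:9000
```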
|
21 | ||
|
22 | ||
|
23 | General | |
|
24 | ^^^^^^^ | |
|
25 | ||
|
26 | - Upgraded all dependency libraries to their latest available versions | |
|
27 | - Repository storage is no longer controlled via DB settings, but .ini file. This allows easier automated deployments. | |
|
28 | - Bumped mercurial to 6.7.4 | |
|
29 | - Mercurial: enable httppostarguments for better support of large repositories with lots of heads. | |
|
30 | - Added explicit db-migrate step to update hooks for 5.X release. | |
|
31 | ||
|
32 | ||
|
33 | Security | |
|
34 | ^^^^^^^^ | |
|
35 | ||
|
36 | ||
|
37 | ||
|
38 | Performance | |
|
39 | ^^^^^^^^^^^ | |
|
40 | ||
|
41 | - Introduced a full rewrite of the ssh backend for performance. The result is a 2-5x speed improvement for operations over ssh. | |

42 | Enable the new ssh wrapper by setting: `ssh.wrapper_cmd = /home/rhodecode/venv/bin/rc-ssh-wrapper-v2` | |
|
43 | - Introduced a new hooks subsystem that is more scalable and faster; enable it by setting: `vcs.hooks.protocol = celery` | |
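The two opt-in settings quoted above can be combined in `rhodecode.ini`; a minimal sketch (the venv path is environment-specific):

```ini
[app:main]
; new ssh wrapper rewrite (2-5x faster ssh operations)
ssh.wrapper_cmd = /home/rhodecode/venv/bin/rc-ssh-wrapper-v2
; new celery-based hooks subsystem
vcs.hooks.protocol = celery
```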
|
44 | ||
|
45 | ||
|
46 | Fixes | |
|
47 | ^^^^^ | |
|
48 | ||
|
49 | - Archives: Zip archive download breaks when a gitmodules file is present | |
|
50 | - Branch permissions: fixed a bug preventing specifying custom rules after a 4.X install | |
|
51 | - SVN: refactored svn events, fixing support for them in dockerized environments | |
|
52 | - Fixed empty server URL in the PR link after a push from the CLI | |
|
53 | ||
|
54 | ||
|
55 | Upgrade notes | |
|
56 | ^^^^^^^^^^^^^ | |
|
57 | ||
|
58 | - RhodeCode 5.1.0 is a major feature release after the big 5.0.0 Python 3 migration. We are happy to ship the first | |

59 | feature-rich release since then
@@ -0,0 +1,55 b'' | |||
|
1 | ||
|
2 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
3 | # | |
|
4 | # This program is free software: you can redistribute it and/or modify | |
|
5 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
6 | # (only), as published by the Free Software Foundation. | |
|
7 | # | |
|
8 | # This program is distributed in the hope that it will be useful, | |
|
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
11 | # GNU General Public License for more details. | |
|
12 | # | |
|
13 | # You should have received a copy of the GNU Affero General Public License | |
|
14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
15 | # | |
|
16 | # This program is dual-licensed. If you wish to learn more about the | |
|
17 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
18 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
19 | ||
|
20 | import pytest | |
|
21 | ||
|
22 | from rhodecode.api.tests.utils import ( | |
|
23 | build_data, api_call) | |
|
24 | ||
|
25 | ||
|
26 | @pytest.mark.usefixtures("app") | |
|
27 | class TestServiceApi: | |
|
28 | ||
|
29 | def test_service_api_with_wrong_secret(self): | |
|
30 | id, payload = build_data("wrong_api_key", 'service_get_repo_name_by_id') | |
|
31 | response = api_call(self.app, payload) | |
|
32 | ||
|
33 | assert 'Invalid API KEY' == response.json['error'] | |
|
34 | ||
|
35 | def test_service_api_with_legit_secret(self): | |
|
36 | id, payload = build_data(self.app.app.config.get_settings()['app.service_api.token'], | |
|
37 | 'service_get_repo_name_by_id', repo_id='1') | |
|
38 | response = api_call(self.app, payload) | |
|
39 | assert not response.json['error'] | |
|
40 | ||
|
41 | def test_service_api_not_a_part_of_public_api_suggestions(self): | |
|
42 | id, payload = build_data("secret", 'some_random_guess_method') | |
|
43 | response = api_call(self.app, payload) | |
|
44 | assert 'service_' not in response.json['error'] | |
|
45 | ||
|
46 | def test_service_get_data_for_ssh_wrapper_output(self): | |
|
47 | id, payload = build_data( | |
|
48 | self.app.app.config.get_settings()['app.service_api.token'], | |
|
49 | 'service_get_data_for_ssh_wrapper', | |
|
50 | user_id=1, | |
|
51 | repo_name='vcs_test_git') | |
|
52 | response = api_call(self.app, payload) | |
|
53 | ||
|
54 | assert ['branch_permissions', 'repo_permissions', 'repos_path', 'user_id', 'username']\ | |
|
55 | == list(response.json['result'].keys()) |
@@ -0,0 +1,125 b'' | |||
|
1 | # Copyright (C) 2011-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import logging | |
|
20 | import datetime | |
|
21 | from collections import defaultdict | |
|
22 | ||
|
23 | from sqlalchemy import Table | |
|
24 | from rhodecode.api import jsonrpc_method, SERVICE_API_IDENTIFIER | |
|
25 | ||
|
26 | ||
|
27 | log = logging.getLogger(__name__) | |
|
28 | ||
|
29 | ||
|
30 | @jsonrpc_method() | |
|
31 | def service_get_data_for_ssh_wrapper(request, apiuser, user_id, repo_name, key_id=None): | |
|
32 | from rhodecode.model.db import User | |
|
33 | from rhodecode.model.scm import ScmModel | |
|
34 | from rhodecode.model.meta import raw_query_executor, Base | |
|
35 | ||
|
36 | if key_id: | |
|
37 | table = Table('user_ssh_keys', Base.metadata, autoload=False) | |
|
38 | atime = datetime.datetime.utcnow() | |
|
39 | stmt = ( | |
|
40 | table.update() | |
|
41 | .where(table.c.ssh_key_id == key_id) | |
|
42 | .values(accessed_on=atime) | |
|
43 | ) | |
|
44 | ||
|
45 | res_count = None | |
|
46 | with raw_query_executor() as session: | |
|
47 | result = session.execute(stmt) | |
|
48 | if result.rowcount: | |
|
49 | res_count = result.rowcount | |
|
50 | ||
|
51 | if res_count: | |
|
52 | log.debug(f'Update key id:{key_id} access time') | |
|
53 | db_user = User.get(user_id) | |
|
54 | if not db_user: | |
|
55 | return None | |
|
56 | auth_user = db_user.AuthUser() | |
|
57 | ||
|
58 | return { | |
|
59 | 'user_id': db_user.user_id, | |
|
60 | 'username': db_user.username, | |
|
61 | 'repo_permissions': auth_user.permissions['repositories'], | |
|
62 | "branch_permissions": auth_user.get_branch_permissions(repo_name), | |
|
63 | "repos_path": ScmModel().repos_path | |
|
64 | } | |
|
65 | ||
|
66 | ||
|
67 | @jsonrpc_method() | |
|
68 | def service_get_repo_name_by_id(request, apiuser, repo_id): | |
|
69 | from rhodecode.model.repo import RepoModel | |
|
70 | by_id_match = RepoModel().get_repo_by_id(repo_id) | |
|
71 | if by_id_match: | |
|
72 | repo_name = by_id_match.repo_name | |
|
73 | return { | |
|
74 | 'repo_name': repo_name | |
|
75 | } | |
|
76 | return None | |
|
77 | ||
|
78 | ||
|
79 | @jsonrpc_method() | |
|
80 | def service_mark_for_invalidation(request, apiuser, repo_name): | |
|
81 | from rhodecode.model.scm import ScmModel | |
|
82 | ScmModel().mark_for_invalidation(repo_name) | |
|
83 | return {'msg': "Applied"} | |
|
84 | ||
|
85 | ||
|
86 | @jsonrpc_method() | |
|
87 | def service_config_to_hgrc(request, apiuser, cli_flags, repo_name): | |
|
88 | from rhodecode.model.db import RhodeCodeUi | |
|
89 | from rhodecode.model.settings import VcsSettingsModel | |
|
90 | ||
|
91 | ui_sections = defaultdict(list) | |
|
92 | ui = VcsSettingsModel(repo=repo_name).get_ui_settings(section=None, key=None) | |
|
93 | ||
|
94 | default_hooks = [ | |
|
95 | ('pretxnchangegroup.ssh_auth', 'python:vcsserver.hooks.pre_push_ssh_auth'), | |
|
96 | ('pretxnchangegroup.ssh', 'python:vcsserver.hooks.pre_push_ssh'), | |
|
97 | ('changegroup.ssh', 'python:vcsserver.hooks.post_push_ssh'), | |
|
98 | ||
|
99 | ('preoutgoing.ssh', 'python:vcsserver.hooks.pre_pull_ssh'), | |
|
100 | ('outgoing.ssh', 'python:vcsserver.hooks.post_pull_ssh'), | |
|
101 | ] | |
|
102 | ||
|
103 | for k, v in default_hooks: | |
|
104 | ui_sections['hooks'].append((k, v)) | |
|
105 | ||
|
106 | for entry in ui: | |
|
107 | if not entry.active: | |
|
108 | continue | |
|
109 | sec = entry.section | |
|
110 | key = entry.key | |
|
111 | ||
|
112 | if sec in cli_flags: | |
|
113 | # we want only custom hooks, so we skip builtins | |
|
114 | if sec == 'hooks' and key in RhodeCodeUi.HOOKS_BUILTIN: | |
|
115 | continue | |
|
116 | ||
|
117 | ui_sections[sec].append([key, entry.value]) | |
|
118 | ||
|
119 | flags = [] | |
|
120 | for _sec, key_val in ui_sections.items(): | |
|
121 | flags.append(' ') | |
|
122 | flags.append(f'[{_sec}]') | |
|
123 | for key, val in key_val: | |
|
124 | flags.append(f'{key}= {val}') | |
|
125 | return {'flags': flags} |
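The flattening loop at the end of `service_config_to_hgrc` turns the collected sections into a flat list interleaving section headers and key/value lines. A self-contained sketch of that shape, using one of the default hook entries from the function above:

```python
from collections import defaultdict

# Mirror of the flattening step in service_config_to_hgrc:
# sections -> [' ', '[section]', 'key= value', ...]
ui_sections = defaultdict(list)
ui_sections['hooks'].append(
    ('changegroup.ssh', 'python:vcsserver.hooks.post_push_ssh'))

flags = []
for _sec, key_val in ui_sections.items():
    flags.append(' ')            # separator line between sections
    flags.append(f'[{_sec}]')    # hgrc-style section header
    for key, val in key_val:
        flags.append(f'{key}= {val}')

# a consumer can join these lines into hgrc-style config text
```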
@@ -0,0 +1,67 b'' | |||
|
1 | import pytest | |
|
2 | import mock | |
|
3 | ||
|
4 | from rhodecode.lib.type_utils import AttributeDict | |
|
5 | from rhodecode.model.meta import Session | |
|
6 | from rhodecode.tests.fixture import Fixture | |
|
7 | from rhodecode.tests.routes import route_path | |
|
8 | from rhodecode.model.settings import SettingsModel | |
|
9 | ||
|
10 | fixture = Fixture() | |
|
11 | ||
|
12 | ||
|
13 | @pytest.mark.usefixtures('app') | |
|
14 | class Test2FA(object): | |
|
15 | @classmethod | |
|
16 | def setup_class(cls): | |
|
17 | cls.password = 'valid-one' | |
|
18 | ||
|
19 | def test_redirect_to_2fa_setup_if_enabled_for_user(self, user_util): | |
|
20 | user = user_util.create_user(password=self.password) | |
|
21 | user.has_enabled_2fa = True | |
|
22 | self.app.post( | |
|
23 | route_path('login'), | |
|
24 | {'username': user.username, | |
|
25 | 'password': self.password}) | |
|
26 | ||
|
27 | response = self.app.get('/') | |
|
28 | assert response.status_code == 302 | |
|
29 | assert response.location.endswith(route_path('setup_2fa')) | |
|
30 | ||
|
31 | def test_redirect_to_2fa_check_if_2fa_configured(self, user_util): | |
|
32 | user = user_util.create_user(password=self.password) | |
|
33 | user.has_enabled_2fa = True | |
|
34 | user.init_secret_2fa() | |
|
35 | Session().add(user) | |
|
36 | Session().commit() | |
|
37 | self.app.post( | |
|
38 | route_path('login'), | |
|
39 | {'username': user.username, | |
|
40 | 'password': self.password}) | |
|
41 | response = self.app.get('/') | |
|
42 | assert response.status_code == 302 | |
|
43 | assert response.location.endswith(route_path('check_2fa')) | |
|
44 | ||
|
45 | def test_2fa_recovery_codes_works_only_once(self, user_util): | |
|
46 | user = user_util.create_user(password=self.password) | |
|
47 | user.has_enabled_2fa = True | |
|
48 | user.init_secret_2fa() | |
|
49 | recovery_code_to_check = user.init_2fa_recovery_codes()[0] | |
|
50 | Session().add(user) | |
|
51 | Session().commit() | |
|
52 | self.app.post( | |
|
53 | route_path('login'), | |
|
54 | {'username': user.username, | |
|
55 | 'password': self.password}) | |
|
56 | response = self.app.post(route_path('check_2fa'), {'totp': recovery_code_to_check}) | |
|
57 | assert response.status_code == 302 | |
|
58 | response = self.app.post(route_path('check_2fa'), {'totp': recovery_code_to_check}) | |
|
59 | response.mustcontain('Code is invalid. Try again!') | |
|
60 | ||
|
61 | def test_2fa_state_when_forced_by_admin(self, user_util): | |
|
62 | user = user_util.create_user(password=self.password) | |
|
63 | user.has_enabled_2fa = False | |
|
64 | with mock.patch.object( | |
|
65 | SettingsModel, 'get_setting_by_name', lambda *a, **kw: AttributeDict(app_settings_value=True)): | |
|
66 | ||
|
67 | assert user.has_enabled_2fa |
@@ -0,0 +1,98 b'' | |||
|
1 | # Copyright (C) 2016-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | """ | |
|
20 | WARNING: be really carefully with changing ANY imports in this file | |
|
21 | # This script is to mean as really fast executable, doing some imports here that would yield an import chain change | |
|
22 | # can affect execution times... | |
|
23 | # This can be easily debugged using such command:: | |
|
24 | # time PYTHONPROFILEIMPORTTIME=1 rc-ssh-wrapper-v2 --debug --mode=test .dev/dev.ini | |
|
25 | """ | |
|
26 | ||
|
27 | import os | |
|
28 | import sys | |
|
29 | import time | |
|
30 | import logging | |
|
31 | ||
|
32 | import click | |
|
33 | ||
|
34 | from rhodecode.config.config_maker import sanitize_settings_and_apply_defaults | |
|
35 | from rhodecode.lib.request import Request | |
|
36 | from rhodecode.lib.utils2 import AttributeDict | |
|
37 | from rhodecode.lib.statsd_client import StatsdClient | |
|
38 | from rhodecode.lib.config_utils import get_app_config_lightweight | |
|
39 | ||
|
40 | from .utils import setup_custom_logging | |
|
41 | from .backends import SshWrapperStandalone | |
|
42 | ||
|
43 | log = logging.getLogger(__name__) | |
|
44 | ||
|
45 | ||
|
46 | @click.command() | |
|
47 | @click.argument('ini_path', type=click.Path(exists=True)) | |
|
48 | @click.option( | |
|
49 | '--mode', '-m', required=False, default='auto', | |
|
50 | type=click.Choice(['auto', 'vcs', 'git', 'hg', 'svn', 'test']), | |
|
51 | help='mode of operation') | |
|
52 | @click.option('--user', help='Username for which the command will be executed') | |
|
53 | @click.option('--user-id', help='User ID for which the command will be executed') | |
|
54 | @click.option('--key-id', help='ID of the key from the database') | |
|
55 | @click.option('--shell', '-s', is_flag=True, help='Allow Shell') | |
|
56 | @click.option('--debug', is_flag=True, help='Enable detailed output logging') | |
|
57 | def main(ini_path, mode, user, user_id, key_id, shell, debug): | |
|
58 | ||
|
59 | time_start = time.time() | |
|
60 | setup_custom_logging(ini_path, debug) | |
|
61 | ||
|
62 | command = os.environ.get('SSH_ORIGINAL_COMMAND', '') | |
|
63 | if not command and mode not in ['test']: | |
|
64 | raise ValueError( | |
|
65 | 'Unable to fetch SSH_ORIGINAL_COMMAND from environment.' | |
|
66 | 'Please make sure this is set and available during execution ' | |
|
67 | 'of this script.') | |
|
68 | ||
|
69 | # initialize settings and get defaults | |
|
70 | settings = get_app_config_lightweight(ini_path) | |
|
71 | settings = sanitize_settings_and_apply_defaults({'__file__': ini_path}, settings) | |
|
72 | ||
|
73 | # init and bootstrap StatsdClient | |
|
74 | StatsdClient.setup(settings) | |
|
75 | statsd = StatsdClient.statsd | |
|
76 | ||
|
77 | try: | |
|
78 | connection_info = os.environ.get('SSH_CONNECTION', '') | |
|
79 | request = Request.blank('/', base_url=settings['app.base_url']) | |
|
80 | request.user = AttributeDict({'username': user, | |
|
81 | 'user_id': user_id, | |
|
82 | 'ip_addr': connection_info.split(' ')[0] if connection_info else None}) | |
|
83 | env = {'RC_CMD_SSH_WRAPPER': '1', 'request': request} | |
|
84 | ssh_wrapper = SshWrapperStandalone( | |
|
85 | command, connection_info, mode, | |
|
86 | user, user_id, key_id, shell, ini_path, settings, env) | |
|
87 | except Exception: | |
|
88 | log.exception('Failed to execute SshWrapper') | |
|
89 | sys.exit(-5) | |
|
90 | ||
|
91 | return_code = ssh_wrapper.wrap() | |
|
92 | operation_took = time.time() - time_start | |
|
93 | if statsd: | |
|
94 | operation_took_ms = round(1000.0 * operation_took) | |
|
95 | statsd.timing("rhodecode_ssh_wrapper_timing.histogram", operation_took_ms, | |
|
96 | use_decimals=False) | |
|
97 | ||
|
98 | sys.exit(return_code) |
@@ -0,0 +1,34 b'' | |||
|
1 | # Copyright (C) 2016-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import logging | |
|
20 | ||
|
21 | ||
|
22 | def setup_custom_logging(ini_path, debug): | |
|
23 | if debug: | |
|
24 | from pyramid.paster import setup_logging # Lazy import | |
|
25 | # enable rhodecode.ini-controlled logging setup | |
|
26 | setup_logging(ini_path) | |
|
27 | else: | |
|
28 | # configure logging in a mode that doesn't print anything. | |
|
29 | # in case of regularly configured logging it gets printed out back | |
|
30 | # to the client doing an SSH command. | |
|
31 | logger = logging.getLogger('') | |
|
32 | null = logging.NullHandler() | |
|
33 | # add the handler to the root logger | |
|
34 | logger.handlers = [null] |
@@ -0,0 +1,224 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import os | |
|
20 | import tempfile | |
|
21 | import logging | |
|
22 | ||
|
23 | from pyramid.settings import asbool | |
|
24 | ||
|
25 | from rhodecode.config.settings_maker import SettingsMaker | |
|
26 | from rhodecode.config import utils as config_utils | |
|
27 | ||
|
28 | log = logging.getLogger(__name__) | |
|
29 | ||
|
30 | ||
|
31 | def sanitize_settings_and_apply_defaults(global_config, settings): | |
|
32 | """ | |
|
33 | Applies settings defaults and does all type conversion. | |
|
34 | ||
|
35 | We would move all settings parsing and preparation into this place, so that | |
|
36 | we have only one place left which deals with this part. The remaining parts | |
|
37 | of the application would start to rely fully on well-prepared settings. | |
|
38 | ||
|
39 | This piece would later be split up per topic to avoid a big fat monster | |
|
40 | function. | |
|
41 | """ | |
|
42 | jn = os.path.join | |
|
43 | ||
|
44 | global_settings_maker = SettingsMaker(global_config) | |
|
45 | global_settings_maker.make_setting('debug', default=False, parser='bool') | |
|
46 | debug_enabled = asbool(global_config.get('debug')) | |
|
47 | ||
|
48 | settings_maker = SettingsMaker(settings) | |
|
49 | ||
|
50 | settings_maker.make_setting( | |
|
51 | 'logging.autoconfigure', | |
|
52 | default=False, | |
|
53 | parser='bool') | |
|
54 | ||
|
55 | logging_conf = jn(os.path.dirname(global_config.get('__file__')), 'logging.ini') | |
|
56 | settings_maker.enable_logging(logging_conf, level='DEBUG' if debug_enabled else 'INFO') | |
|
57 | ||
|
58 | # Default includes, possible to change as a user | |
|
59 | pyramid_includes = settings_maker.make_setting('pyramid.includes', [], parser='list:newline') | |
|
60 | log.debug( | |
|
61 | "Using the following pyramid.includes: %s", | |
|
62 | pyramid_includes) | |
|
63 | ||
|
64 | settings_maker.make_setting('rhodecode.edition', 'Community Edition') | |
|
65 | settings_maker.make_setting('rhodecode.edition_id', 'CE') | |
|
66 | ||
|
67 | if 'mako.default_filters' not in settings: | |
|
68 | # set custom default filters if we don't have it defined | |
|
69 | settings['mako.imports'] = 'from rhodecode.lib.base import h_filter' | |
|
70 | settings['mako.default_filters'] = 'h_filter' | |
|
71 | ||
|
72 | if 'mako.directories' not in settings: | |
|
73 | mako_directories = settings.setdefault('mako.directories', [ | |
|
74 | # Base templates of the original application | |
|
75 | 'rhodecode:templates', | |
|
76 | ]) | |
|
77 | log.debug( | |
|
78 | "Using the following Mako template directories: %s", | |
|
79 | mako_directories) | |
|
80 | ||
|
81 | # NOTE(marcink): fix redis requirement for schema of connection since 3.X | |
|
82 | if 'beaker.session.type' in settings and settings['beaker.session.type'] == 'ext:redis': | |
|
83 | raw_url = settings['beaker.session.url'] | |
|
84 | if not raw_url.startswith(('redis://', 'rediss://', 'unix://')): | |
|
85 | settings['beaker.session.url'] = 'redis://' + raw_url | |
|
86 | ||
|
87 | settings_maker.make_setting('__file__', global_config.get('__file__')) | |
|
88 | ||
|
89 | # TODO: johbo: Re-think this, usually the call to config.include | |
|
90 | # should allow to pass in a prefix. | |
|
91 | settings_maker.make_setting('rhodecode.api.url', '/_admin/api') | |
|
92 | ||
|
93 | # Sanitize generic settings. | |
|
94 | settings_maker.make_setting('default_encoding', 'UTF-8', parser='list') | |
|
95 | settings_maker.make_setting('gzip_responses', False, parser='bool') | |
|
96 | settings_maker.make_setting('startup.import_repos', 'false', parser='bool') | |
|
97 | ||
|
98 | # statsd | |
|
99 | settings_maker.make_setting('statsd.enabled', False, parser='bool') | |
|
100 | settings_maker.make_setting('statsd.statsd_host', 'statsd-exporter', parser='string') | |
|
101 | settings_maker.make_setting('statsd.statsd_port', 9125, parser='int') | |
|
102 | settings_maker.make_setting('statsd.statsd_prefix', '') | |
|
103 | settings_maker.make_setting('statsd.statsd_ipv6', False, parser='bool') | |
|
104 | ||
|
105 | settings_maker.make_setting('vcs.svn.compatible_version', '') | |
|
106 | settings_maker.make_setting('vcs.svn.redis_conn', 'redis://redis:6379/0') | |
|
107 | settings_maker.make_setting('vcs.svn.proxy.enabled', True, parser='bool') | |
|
108 | settings_maker.make_setting('vcs.svn.proxy.host', 'http://svn:8090', parser='string') | |
|
109 | settings_maker.make_setting('vcs.hooks.protocol', 'http') | |
|
110 | settings_maker.make_setting('vcs.hooks.host', '*') | |
|
111 | settings_maker.make_setting('vcs.scm_app_implementation', 'http') | |
|
112 | settings_maker.make_setting('vcs.server', '') | |
|
113 | settings_maker.make_setting('vcs.server.protocol', 'http') | |
|
114 | settings_maker.make_setting('vcs.server.enable', 'true', parser='bool') | |
|
115 | settings_maker.make_setting('vcs.hooks.direct_calls', 'false', parser='bool') | |
|
116 | settings_maker.make_setting('vcs.start_server', 'false', parser='bool') | |
|
117 | settings_maker.make_setting('vcs.backends', 'hg, git, svn', parser='list') | |
|
118 | settings_maker.make_setting('vcs.connection_timeout', 3600, parser='int') | |
|
119 | ||
|
120 | settings_maker.make_setting('vcs.methods.cache', True, parser='bool') | |
|
121 | ||
|
122 | # repo_store path | |
|
123 | settings_maker.make_setting('repo_store.path', '/var/opt/rhodecode_repo_store') | |
|
124 | # Support legacy values of vcs.scm_app_implementation. Legacy | |
|
125 | # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http', or | |
|
126 | # disabled since 4.13 'vcsserver.scm_app' which is now mapped to 'http'. | |
|
127 | scm_app_impl = settings['vcs.scm_app_implementation'] | |
|
128 | if scm_app_impl in ['rhodecode.lib.middleware.utils.scm_app_http', 'vcsserver.scm_app']: | |
|
129 | settings['vcs.scm_app_implementation'] = 'http' | |
|
130 | ||
|
131 | settings_maker.make_setting('appenlight', False, parser='bool') | |
|
132 | ||
|
133 | temp_store = tempfile.gettempdir() | |
|
134 | tmp_cache_dir = jn(temp_store, 'rc_cache') | |
|
135 | ||
|
136 | # save default, cache dir, and use it for all backends later. | |
|
137 | default_cache_dir = settings_maker.make_setting( | |
|
138 | 'cache_dir', | |
|
139 | default=tmp_cache_dir, default_when_empty=True, | |
|
140 | parser='dir:ensured') | |
|
141 | ||
|
142 | # exception store cache | |
|
143 | settings_maker.make_setting( | |
|
144 | 'exception_tracker.store_path', | |
|
145 | default=jn(default_cache_dir, 'exc_store'), default_when_empty=True, | |
|
146 | parser='dir:ensured' | |
|
147 | ) | |
|
148 | ||
|
149 | settings_maker.make_setting( | |
|
150 | 'celerybeat-schedule.path', | |
|
151 | default=jn(default_cache_dir, 'celerybeat_schedule', 'celerybeat-schedule.db'), default_when_empty=True, | |
|
152 | parser='file:ensured' | |
|
153 | ) | |
|
154 | ||
|
155 | settings_maker.make_setting('exception_tracker.send_email', False, parser='bool') | |
|
156 | settings_maker.make_setting('exception_tracker.email_prefix', '[RHODECODE ERROR]', default_when_empty=True) | |
|
157 | ||
|
158 | # sessions, ensure file since no-value is memory | |
|
159 | settings_maker.make_setting('beaker.session.type', 'file') | |
|
160 | settings_maker.make_setting('beaker.session.data_dir', jn(default_cache_dir, 'session_data')) | |
|
161 | ||
|
162 | # cache_general | |
|
163 | settings_maker.make_setting('rc_cache.cache_general.backend', 'dogpile.cache.rc.file_namespace') | |
|
164 | settings_maker.make_setting('rc_cache.cache_general.expiration_time', 60 * 60 * 12, parser='int') | |
|
165 | settings_maker.make_setting('rc_cache.cache_general.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_general.db')) | |
|
166 | ||
|
167 | # cache_perms | |
|
168 | settings_maker.make_setting('rc_cache.cache_perms.backend', 'dogpile.cache.rc.file_namespace') | |
|
169 | settings_maker.make_setting('rc_cache.cache_perms.expiration_time', 60 * 60, parser='int') | |
|
170 | settings_maker.make_setting('rc_cache.cache_perms.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_perms_db')) | |
|
171 | ||
|
172 | # cache_repo | |
|
173 | settings_maker.make_setting('rc_cache.cache_repo.backend', 'dogpile.cache.rc.file_namespace') | |
|
174 | settings_maker.make_setting('rc_cache.cache_repo.expiration_time', 60 * 60 * 24 * 30, parser='int') | |
|
175 | settings_maker.make_setting('rc_cache.cache_repo.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_repo_db')) | |
|
176 | ||
|
177 | # cache_license | |
|
178 | settings_maker.make_setting('rc_cache.cache_license.backend', 'dogpile.cache.rc.file_namespace') | |
|
179 | settings_maker.make_setting('rc_cache.cache_license.expiration_time', 60 * 5, parser='int') | |
|
180 | settings_maker.make_setting('rc_cache.cache_license.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_license_db')) | |
|
181 | ||
|
182 | # cache_repo_longterm memory, 96H | |
|
183 | settings_maker.make_setting('rc_cache.cache_repo_longterm.backend', 'dogpile.cache.rc.memory_lru') | |
|
184 | settings_maker.make_setting('rc_cache.cache_repo_longterm.expiration_time', 345600, parser='int') | |
|
185 | settings_maker.make_setting('rc_cache.cache_repo_longterm.max_size', 10000, parser='int') | |
|
186 | ||
|
187 | # sql_cache_short | |
|
188 | settings_maker.make_setting('rc_cache.sql_cache_short.backend', 'dogpile.cache.rc.memory_lru') | |
|
189 | settings_maker.make_setting('rc_cache.sql_cache_short.expiration_time', 30, parser='int') | |
|
190 | settings_maker.make_setting('rc_cache.sql_cache_short.max_size', 10000, parser='int') | |
|
191 | ||
|
192 | # archive_cache | |
|
193 | settings_maker.make_setting('archive_cache.locking.url', 'redis://redis:6379/1') | |
|
194 | settings_maker.make_setting('archive_cache.backend.type', 'filesystem') | |
|
195 | ||
|
196 | settings_maker.make_setting('archive_cache.filesystem.store_dir', jn(default_cache_dir, 'archive_cache'), default_when_empty=True,) | |
|
197 | settings_maker.make_setting('archive_cache.filesystem.cache_shards', 8, parser='int') | |
|
198 | settings_maker.make_setting('archive_cache.filesystem.cache_size_gb', 10, parser='float') | |
|
199 | settings_maker.make_setting('archive_cache.filesystem.eviction_policy', 'least-recently-stored') | |
|
200 | ||
|
201 | settings_maker.make_setting('archive_cache.filesystem.retry', False, parser='bool') | |
|
202 | settings_maker.make_setting('archive_cache.filesystem.retry_backoff', 1, parser='int') | |
|
203 | settings_maker.make_setting('archive_cache.filesystem.retry_attempts', 10, parser='int') | |
|
204 | ||
|
205 | settings_maker.make_setting('archive_cache.objectstore.url', jn(default_cache_dir, 'archive_cache'), default_when_empty=True,) | |
|
206 | settings_maker.make_setting('archive_cache.objectstore.key', '') | |
|
207 | settings_maker.make_setting('archive_cache.objectstore.secret', '') | |
|
208 | settings_maker.make_setting('archive_cache.objectstore.region', 'eu-central-1') | |
|
209 | settings_maker.make_setting('archive_cache.objectstore.bucket', 'rhodecode-archive-cache', default_when_empty=True,) | |
|
210 | settings_maker.make_setting('archive_cache.objectstore.bucket_shards', 8, parser='int') | |
|
211 | ||
|
212 | settings_maker.make_setting('archive_cache.objectstore.cache_size_gb', 10, parser='float') | |
|
213 | settings_maker.make_setting('archive_cache.objectstore.eviction_policy', 'least-recently-stored') | |
|
214 | ||
|
215 | settings_maker.make_setting('archive_cache.objectstore.retry', False, parser='bool') | |
|
216 | settings_maker.make_setting('archive_cache.objectstore.retry_backoff', 1, parser='int') | |
|
217 | settings_maker.make_setting('archive_cache.objectstore.retry_attempts', 10, parser='int') | |
|
218 | ||
|
219 | settings_maker.env_expand() | |
|
220 | ||
|
221 | # configure instance id | |
|
222 | config_utils.set_instance_id(settings) | |
|
223 | ||
|
224 | return settings |
@@ -0,0 +1,47 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import urllib.parse | |
|
20 | ||
|
21 | from rhodecode.lib.vcs import CurlSession | |
|
22 | from rhodecode.lib.ext_json import json | |
|
23 | from rhodecode.lib.vcs.exceptions import ImproperlyConfiguredError | |
|
24 | ||
|
25 | ||
|
26 | def call_service_api(settings, payload): | |
|
27 | try: | |
|
28 | api_host = settings['app.service_api.host'] | |
|
29 | api_token = settings['app.service_api.token'] | |
|
30 | api_url = settings['rhodecode.api.url'] | |
|
31 | except KeyError as exc: | |
|
32 | raise ImproperlyConfiguredError( | |
|
33 | f"{str(exc)} is missing. " | |
|
34 | "Please ensure that app.service_api.host, app.service_api.token and rhodecode.api.url are " | |
|
35 | "defined in the .ini configuration file." | |
|
36 | ) | |
|
37 | payload.update({ | |
|
38 | 'id': 'service', | |
|
39 | 'auth_token': api_token | |
|
40 | }) | |
|
41 | service_api_url = urllib.parse.urljoin(api_host, api_url) | |
|
42 | response = CurlSession().post(service_api_url, json.dumps(payload)) | |
|
43 | ||
|
44 | if response.status_code != 200: | |
|
45 | raise Exception(f"Service API at {service_api_url} responded with error: {response.status_code}") | |
|
46 | ||
|
47 | return json.loads(response.content)['result'] |
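The service API URL above is built with `urllib.parse.urljoin`, whose result depends on whether the base host ends with a slash. A minimal sketch of that behavior (host and path values here are illustrative, not taken from a real configuration):

```python
from urllib.parse import urljoin

# urljoin keeps the base's scheme and netloc and resolves the relative
# path against the base path: with a trailing slash the API path is
# appended; without one, the last path segment of the base is replaced.
service_api_url = urljoin('http://rhodecode-host:10020/', '_admin/api')
print(service_api_url)  # http://rhodecode-host:10020/_admin/api
```

This is why `app.service_api.host` is typically configured with a trailing slash.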
@@ -0,0 +1,78 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import logging | |
|
20 | ||
|
21 | from .backends.fanout_cache import FileSystemFanoutCache | |
|
22 | from .backends.objectstore_cache import ObjectStoreCache | |
|
23 | ||
|
24 | from .utils import archive_iterator # noqa | |
|
25 | from .lock import ArchiveCacheGenerationLock # noqa | |
|
26 | ||
|
27 | log = logging.getLogger(__name__) | |
|
28 | ||
|
29 | ||
|
30 | cache_meta = None | |
|
31 | ||
|
32 | ||
|
33 | def includeme(config): | |
|
34 | # init our cache at start | |
|
35 | settings = config.get_settings() | |
|
36 | get_archival_cache_store(settings) | |
|
37 | ||
|
38 | ||
|
39 | def get_archival_config(config): | |
|
40 | ||
|
41 | final_config = { | |
|
42 | ||
|
43 | } | |
|
44 | ||
|
45 | for k, v in config.items(): | |
|
46 | if k.startswith('archive_cache'): | |
|
47 | final_config[k] = v | |
|
48 | ||
|
49 | return final_config | |
|
50 | ||
|
51 | ||
|
52 | def get_archival_cache_store(config, always_init=False): | |
|
53 | ||
|
54 | global cache_meta | |
|
55 | if cache_meta is not None and not always_init: | |
|
56 | return cache_meta | |
|
57 | ||
|
58 | config = get_archival_config(config) | |
|
59 | backend = config['archive_cache.backend.type'] | |
|
60 | ||
|
61 | archive_cache_locking_url = config['archive_cache.locking.url'] | |
|
62 | ||
|
63 | match backend: | |
|
64 | case 'filesystem': | |
|
65 | d_cache = FileSystemFanoutCache( | |
|
66 | locking_url=archive_cache_locking_url, | |
|
67 | **config | |
|
68 | ) | |
|
69 | case 'objectstore': | |
|
70 | d_cache = ObjectStoreCache( | |
|
71 | locking_url=archive_cache_locking_url, | |
|
72 | **config | |
|
73 | ) | |
|
74 | case _: | |
|
75 | raise ValueError(f'archive_cache.backend.type only supports "filesystem" or "objectstore", got: {backend}') | |
|
76 | ||
|
77 | cache_meta = d_cache | |
|
78 | return cache_meta |
@@ -0,0 +1,17 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
@@ -0,0 +1,372 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import os | |
|
20 | import functools | |
|
21 | import logging | |
|
22 | import typing | |
|
23 | import time | |
|
24 | import zlib | |
|
25 | ||
|
26 | from ...ext_json import json | |
|
27 | from ..utils import StatsDB, NOT_GIVEN, ShardFileReader, EVICTION_POLICY, format_size | |
|
28 | from ..lock import GenerationLock | |
|
29 | ||
|
30 | log = logging.getLogger(__name__) | |
|
31 | ||
|
32 | ||
|
33 | class BaseShard: | |
|
34 | storage_type: str = '' | |
|
35 | fs = None | |
|
36 | ||
|
37 | @classmethod | |
|
38 | def hash(cls, key): | |
|
39 | """Compute portable hash for `key`. | |
|
40 | ||
|
41 | :param key: key to hash | |
|
42 | :return: hash value | |
|
43 | ||
|
44 | """ | |
|
45 | mask = 0xFFFFFFFF | |
|
46 | return zlib.adler32(key.encode('utf-8')) & mask # noqa | |
|
47 | ||
|
48 | def _write_file(self, full_path, read_iterator, mode): | |
|
49 | raise NotImplementedError | |
|
50 | ||
|
51 | def _get_keyfile(self, key): | |
|
52 | raise NotImplementedError | |
|
53 | ||
|
54 | def random_filename(self): | |
|
55 | raise NotImplementedError | |
|
56 | ||
|
57 | def store(self, *args, **kwargs): | |
|
58 | raise NotImplementedError | |
|
59 | ||
|
60 | def _store(self, key, value_reader, metadata, mode): | |
|
61 | (filename, # hash-name | |
|
62 | full_path # full-path/hash-name | |
|
63 | ) = self.random_filename() | |
|
64 | ||
|
65 | key_file, key_file_path = self._get_keyfile(key) | |
|
66 | ||
|
67 | # STORE METADATA | |
|
68 | _metadata = { | |
|
69 | "version": "v1", | |
|
70 | ||
|
71 | "key_file": key_file, # this is the .key.json file storing meta | |
|
72 | "key_file_path": key_file_path, # full path to key_file | |
|
73 | "archive_key": key, # original name we stored the archive under, e.g. my-archive.zip | |
|
74 | "archive_filename": filename, # the actual filename we stored that file under | |
|
75 | "archive_full_path": full_path, | |
|
76 | ||
|
77 | "store_time": time.time(), | |
|
78 | "access_count": 0, | |
|
79 | "access_time": 0, | |
|
80 | ||
|
81 | "size": 0 | |
|
82 | } | |
|
83 | if metadata: | |
|
84 | _metadata.update(metadata) | |
|
85 | ||
|
86 | read_iterator = iter(functools.partial(value_reader.read, 2**22), b'') | |
|
87 | size, sha256 = self._write_file(full_path, read_iterator, mode) | |
|
88 | _metadata['size'] = size | |
|
89 | _metadata['sha256'] = sha256 | |
|
90 | ||
|
91 | # after archive is finished, we create a key to save the presence of the binary file | |
|
92 | with self.fs.open(key_file_path, 'wb') as f: | |
|
93 | f.write(json.dumps(_metadata)) | |
|
94 | ||
|
95 | return key, filename, size, _metadata | |
|
96 | ||
|
97 | def fetch(self, *args, **kwargs): | |
|
98 | raise NotImplementedError | |
|
99 | ||
|
100 | def _fetch(self, key, retry, retry_attempts, retry_backoff, | |
|
101 | presigned_url_expires: int = 0) -> tuple[ShardFileReader, dict]: | |
|
102 | if retry is NOT_GIVEN: | |
|
103 | retry = False | |
|
104 | if retry_attempts is NOT_GIVEN: | |
|
105 | retry_attempts = 0 | |
|
106 | ||
|
107 | if retry and retry_attempts > 0: | |
|
108 | for attempt in range(1, retry_attempts + 1): | |
|
109 | if key in self: | |
|
110 | break | |
|
111 | # we didn't find the key, wait retry_backoff seconds, and re-check | |
|
112 | time.sleep(retry_backoff) | |
|
113 | ||
|
114 | if key not in self: | |
|
115 | log.exception(f'requested key={key} not found in {self} retry={retry}, attempts={retry_attempts}') | |
|
116 | raise KeyError(key) | |
|
117 | ||
|
118 | key_file, key_file_path = self._get_keyfile(key) | |
|
119 | with self.fs.open(key_file_path, 'rb') as f: | |
|
120 | metadata = json.loads(f.read()) | |
|
121 | ||
|
122 | archive_path = metadata['archive_full_path'] | |
|
123 | if presigned_url_expires and presigned_url_expires > 0: | |
|
124 | metadata['url'] = self.fs.url(archive_path, expires=presigned_url_expires) | |
|
125 | ||
|
126 | try: | |
|
127 | return ShardFileReader(self.fs.open(archive_path, 'rb')), metadata | |
|
128 | finally: | |
|
129 | # update usage stats, count and accessed | |
|
130 | metadata["access_count"] = metadata.get("access_count", 0) + 1 | |
|
131 | metadata["access_time"] = time.time() | |
|
132 | log.debug('Updated %s with access snapshot, access_count=%s access_time=%s', | |
|
133 | key_file, metadata['access_count'], metadata['access_time']) | |
|
134 | with self.fs.open(key_file_path, 'wb') as f: | |
|
135 | f.write(json.dumps(metadata)) | |
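The retry logic in `_fetch` above is a simple poll-with-backoff: check for the key up to `retry_attempts` times, sleeping `retry_backoff` seconds between checks. A standalone sketch of that pattern, where the `exists` callable stands in for `key in self`:

```python
import time

def wait_for_key(exists, key, retry_attempts, retry_backoff):
    # Poll until exists(key) is true, sleeping retry_backoff seconds
    # between attempts; mirrors the retry loop in BaseShard._fetch.
    for _attempt in range(retry_attempts):
        if exists(key):
            return True
        time.sleep(retry_backoff)
    return exists(key)
```

As in `_fetch`, a key that never appears leaves the final membership check false and the caller raises `KeyError`.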
|
136 | ||
|
137 | def remove(self, *args, **kwargs): | |
|
138 | raise NotImplementedError | |
|
139 | ||
|
140 | def _remove(self, key): | |
|
141 | if key not in self: | |
|
142 | log.exception(f'requested key={key} not found in {self}') | |
|
143 | raise KeyError(key) | |
|
144 | ||
|
145 | key_file, key_file_path = self._get_keyfile(key) | |
|
146 | with self.fs.open(key_file_path, 'rb') as f: | |
|
147 | metadata = json.loads(f.read()) | |
|
148 | ||
|
149 | archive_path = metadata['archive_full_path'] | |
|
150 | self.fs.rm(archive_path) | |
|
151 | self.fs.rm(key_file_path) | |
|
152 | return 1 | |
|
153 | ||
|
154 | @property | |
|
155 | def storage_medium(self): | |
|
156 | return getattr(self, self.storage_type) | |
|
157 | ||
|
158 | @property | |
|
159 | def key_suffix(self): | |
|
160 | return 'key.json' | |
|
161 | ||
|
162 | def __contains__(self, key): | |
|
163 | """Return `True` if `key` matching item is found in cache. | |
|
164 | ||
|
165 | :param key: key matching item | |
|
166 | :return: True if key matching item | |
|
167 | ||
|
168 | """ | |
|
169 | key_file, key_file_path = self._get_keyfile(key) | |
|
170 | return self.fs.exists(key_file_path) | |
|
171 | ||
|
172 | ||
|
173 | class BaseCache: | |
|
174 | _locking_url: str = '' | |
|
175 | _storage_path: str = '' | |
|
176 | _config: dict = {} | |
|
177 | retry = False | |
|
178 | retry_attempts: int = 0 | |
|
179 | retry_backoff: int | float = 1 | |
|
180 | _shards = tuple() | |
|
181 | shard_cls = BaseShard | |
|
182 | # define the presigned url expiration, 0 == disabled | |
|
183 | presigned_url_expires: int = 0 | |
|
184 | ||
|
185 | def __contains__(self, key): | |
|
186 | """Return `True` if `key` matching item is found in cache. | |
|
187 | ||
|
188 | :param key: key matching item | |
|
189 | :return: True if key matching item | |
|
190 | ||
|
191 | """ | |
|
192 | return self.has_key(key) | |
|
193 | ||
|
194 | def __repr__(self): | |
|
195 | return f'<{self.__class__.__name__}(storage={self._storage_path})>' | |
|
196 | ||
|
197 | @classmethod | |
|
198 | def gb_to_bytes(cls, gb): | |
|
199 | return gb * (1024 ** 3) | |
|
200 | ||
|
201 | @property | |
|
202 | def storage_path(self): | |
|
203 | return self._storage_path | |
|
204 | ||
|
205 | @classmethod | |
|
206 | def get_stats_db(cls): | |
|
207 | return StatsDB() | |
|
208 | ||
|
209 | def get_conf(self, key, pop=False): | |
|
210 | if key not in self._config: | |
|
211 | raise ValueError(f"No configuration key '{key}', please make sure it exists in archive_cache config") | |
|
212 | val = self._config[key] | |
|
213 | if pop: | |
|
214 | del self._config[key] | |
|
215 | return val | |
|
216 | ||
|
217 | def _get_shard(self, key) -> shard_cls: | |
|
218 | index = self._hash(key) % self._shard_count | |
|
219 | shard = self._shards[index] | |
|
220 | return shard | |
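`_get_shard` maps each key to a shard using the portable `adler32` hash defined on `BaseShard`, reduced modulo the shard count. A minimal sketch of that mapping:

```python
import zlib

def shard_index(key: str, shard_count: int) -> int:
    # Portable 32-bit hash of the key (masked for consistency across
    # platforms), reduced to a shard index.
    mask = 0xFFFFFFFF
    return (zlib.adler32(key.encode('utf-8')) & mask) % shard_count
```

Because the hash is deterministic, the same archive key always lands on the same shard, which is what makes lookups and evictions per-shard operations.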
|
221 | ||
|
222 | def _get_size(self, shard, archive_path): | |
|
223 | raise NotImplementedError | |
|
224 | ||
|
225 | def store(self, key, value_reader, metadata=None): | |
|
226 | shard = self._get_shard(key) | |
|
227 | return shard.store(key, value_reader, metadata) | |
|
228 | ||
|
229 | def fetch(self, key, retry=NOT_GIVEN, retry_attempts=NOT_GIVEN) -> tuple[typing.BinaryIO, dict]: | |
|
230 | """ | |
|
231 | Return file handle corresponding to `key` from specific shard cache. | |
|
232 | """ | |
|
233 | if retry is NOT_GIVEN: | |
|
234 | retry = self.retry | |
|
235 | if retry_attempts is NOT_GIVEN: | |
|
236 | retry_attempts = self.retry_attempts | |
|
237 | retry_backoff = self.retry_backoff | |
|
238 | presigned_url_expires = self.presigned_url_expires | |
|
239 | ||
|
240 | shard = self._get_shard(key) | |
|
241 | return shard.fetch(key, retry=retry, | |
|
242 | retry_attempts=retry_attempts, | |
|
243 | retry_backoff=retry_backoff, | |
|
244 | presigned_url_expires=presigned_url_expires) | |
|
245 | ||
|
246 | def remove(self, key): | |
|
247 | shard = self._get_shard(key) | |
|
248 | return shard.remove(key) | |
|
249 | ||
|
250 | def has_key(self, archive_key): | |
|
251 | """Return `True` if `key` matching item is found in cache. | |
|
252 | ||
|
253 | :param archive_key: key for the item; a unique archive name we want to store data under, e.g. my-archive-svn.zip | |
|
254 | :return: True if key is found | |
|
255 | ||
|
256 | """ | |
|
257 | shard = self._get_shard(archive_key) | |
|
258 | return archive_key in shard | |
|
259 | ||
|
260 | def iter_keys(self): | |
|
261 | for shard in self._shards: | |
|
262 | if shard.fs.exists(shard.storage_medium): | |
|
263 | for path, _dirs, _files in shard.fs.walk(shard.storage_medium): | |
|
264 | for key_file_path in _files: | |
|
265 | if key_file_path.endswith(shard.key_suffix): | |
|
266 | yield shard, key_file_path | |
|
267 | ||
|
268 | def get_lock(self, lock_key): | |
|
269 | return GenerationLock(lock_key, self._locking_url) | |
|
270 | ||
|
271 | def evict(self, policy=None, size_limit=None) -> dict: | |
|
272 | """ | |
|
273 | Remove old items based on the conditions | |
|
274 | ||
|
275 | ||
|
276 | Explanation of the algorithm: | |
|
277 | Iterate over each shard, then for each shard iterate over its .key files, | |
|
278 | reading the metadata stored in each. This gives us a full list of keys, cached archives, their sizes, | |
|
279 | access data, creation times, and access counts. | |
|
280 | ||
|
281 | Store that in an in-memory DB so that we can easily run different sorting strategies. | |
|
282 | Summing the sizes is a single SQL SUM query. | |
|
283 | ||
|
284 | Then we run a sorting strategy based on the eviction policy, iterate over the sorted keys, | |
|
285 | and remove each one while checking whether we have hit the overall size limit. | |
|
286 | """ | |
|
287 | removal_info = { | |
|
288 | "removed_items": 0, | |
|
289 | "removed_size": 0 | |
|
290 | } | |
|
291 | policy = policy or self._eviction_policy | |
|
292 | size_limit = size_limit or self._cache_size_limit | |
|
293 | ||
|
294 | select_policy = EVICTION_POLICY[policy]['evict'] | |
|
295 | ||
|
296 | log.debug('Running eviction policy \'%s\', and checking for size limit: %s', | |
|
297 | policy, format_size(size_limit)) | |
|
298 | ||
|
299 | if select_policy is None: | |
|
300 | return removal_info | |
|
301 | ||
|
302 | db = self.get_stats_db() | |
|
303 | ||
|
304 | data = [] | |
|
305 | cnt = 1 | |
|
306 | ||
|
307 | for shard, key_file in self.iter_keys(): | |
|
308 | with shard.fs.open(os.path.join(shard.storage_medium, key_file), 'rb') as f: | |
|
309 | metadata = json.loads(f.read()) | |
|
310 | ||
|
311 | key_file_path = os.path.join(shard.storage_medium, key_file) | |
|
312 | ||
|
313 | archive_key = metadata['archive_key'] | |
|
314 | archive_path = metadata['archive_full_path'] | |
|
315 | ||
|
316 | size = metadata.get('size') | |
|
317 | if not size: | |
|
318 | # in case we don't have size re-calc it... | |
|
319 | size = self._get_size(shard, archive_path) | |
|
320 | ||
|
321 | data.append([ | |
|
322 | cnt, | |
|
323 | key_file, | |
|
324 | key_file_path, | |
|
325 | archive_key, | |
|
326 | archive_path, | |
|
327 | metadata.get('store_time', 0), | |
|
328 | metadata.get('access_time', 0), | |
|
329 | metadata.get('access_count', 0), | |
|
330 | size, | |
|
331 | ]) | |
|
332 | cnt += 1 | |
|
333 | ||
|
334 | # Insert bulk data using executemany | |
|
335 | db.bulk_insert(data) | |
|
336 | ||
|
337 | total_size = db.get_total_size() | |
|
338 | log.debug('Analyzed %s keys, occupying: %s, running eviction to match %s', | |
|
339 | len(data), format_size(total_size), format_size(size_limit)) | |
|
340 | ||
|
341 | removed_items = 0 | |
|
342 | removed_size = 0 | |
|
343 | for key_file, archive_key, size in db.get_sorted_keys(select_policy): | |
|
344 | # simulate removal impact BEFORE removal | |
|
345 | total_size -= size | |
|
346 | ||
|
347 | if total_size <= size_limit: | |
|
348 | # we obtained what we wanted... | |
|
349 | break | |
|
350 | ||
|
351 | self.remove(archive_key) | |
|
352 | removed_items += 1 | |
|
353 | removed_size += size | |
|
354 | removal_info['removed_items'] = removed_items | |
|
355 | removal_info['removed_size'] = removed_size | |
|
356 | log.debug('Removed %s cache archives, and reduced size by: %s', | |
|
357 | removed_items, format_size(removed_size)) | |
|
358 | return removal_info | |
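The accounting in `evict` can be reduced to: given entries already sorted by the eviction policy (least-wanted first), remove entries until the remaining total fits under the size limit. A simplified, self-contained sketch of that planning step:

```python
def plan_eviction(entries, size_limit):
    # entries: (key, size) pairs pre-sorted by the eviction policy,
    # least-wanted first. Returns the keys to remove so that the
    # remaining total size fits under size_limit.
    total = sum(size for _key, size in entries)
    removed = []
    for key, size in entries:
        if total <= size_limit:
            # we obtained what we wanted, stop removing
            break
        total -= size
        removed.append(key)
    return removed
```

The real method additionally loads the per-key metadata into the stats DB and performs the actual filesystem/objectstore deletions via `self.remove`.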
|
359 | ||
|
360 | def get_statistics(self): | |
|
361 | total_files = 0 | |
|
362 | total_size = 0 | |
|
363 | meta = {} | |
|
364 | ||
|
365 | for shard, key_file in self.iter_keys(): | |
|
366 | json_key = f"{shard.storage_medium}/{key_file}" | |
|
367 | with shard.fs.open(json_key, 'rb') as f: | |
|
368 | total_files += 1 | |
|
369 | metadata = json.loads(f.read()) | |
|
370 | total_size += metadata['size'] | |
|
371 | ||
|
372 | return total_files, total_size, meta |
@@ -0,0 +1,173 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import codecs | |
|
20 | import hashlib | |
|
21 | import logging | |
|
22 | import os | |
|
23 | import typing | |
|
24 | ||
|
25 | import fsspec | |
|
26 | ||
|
27 | from .base import BaseCache, BaseShard | |
|
28 | from ..utils import ShardFileReader, NOT_GIVEN | |
|
29 | from ...type_utils import str2bool | |
|
30 | ||
|
31 | log = logging.getLogger(__name__) | |
|
32 | ||
|
33 | ||
|
34 | class S3Shard(BaseShard): | |
|
35 | ||
|
36 | def __init__(self, index, bucket, bucket_folder, fs, **settings): | |
|
37 | self._index: int = index | |
|
38 | self._bucket_folder: str = bucket_folder | |
|
39 | self.storage_type: str = 'bucket' | |
|
40 | self._bucket_main: str = bucket | |
|
41 | ||
|
42 | self.fs = fs | |
|
43 | ||
|
44 | @property | |
|
45 | def bucket(self) -> str: | |
|
46 | """Cache bucket final path.""" | |
|
47 | return os.path.join(self._bucket_main, self._bucket_folder) | |
|
48 | ||
|
49 | def _get_keyfile(self, archive_key) -> tuple[str, str]: | |
|
50 | key_file: str = f'{archive_key}-{self.key_suffix}' | |
|
51 | return key_file, os.path.join(self.bucket, key_file) | |
|
52 | ||
|
53 | def _get_writer(self, path, mode): | |
|
54 | return self.fs.open(path, 'wb') | |
|
55 | ||
|
56 | def _write_file(self, full_path, iterator, mode): | |
|
57 | ||
|
58 | # ensure folder in bucket exists | |
|
59 | destination = self.bucket | |
|
60 | if not self.fs.exists(destination): | |
|
61 | self.fs.mkdir(destination, s3_additional_kwargs={}) | |
|
62 | ||
|
63 | writer = self._get_writer(full_path, mode) | |
|
64 | ||
|
65 | digest = hashlib.sha256() | |
|
66 | with writer: | |
|
67 | size = 0 | |
|
68 | for chunk in iterator: | |
|
69 | size += len(chunk) | |
|
70 | digest.update(chunk) | |
|
71 | writer.write(chunk) | |
|
72 | ||
|
73 | sha256 = digest.hexdigest() | |
|
74 | log.debug('written new archive cache under %s, sha256: %s', full_path, sha256) | |
|
75 | return size, sha256 | |
|
76 | ||
|
77 | def store(self, key, value_reader, metadata: dict | None = None): | |
|
78 | return self._store(key, value_reader, metadata, mode='wb') | |
|
79 | ||
|
80 | def fetch(self, key, retry=NOT_GIVEN, | |
|
81 | retry_attempts=NOT_GIVEN, retry_backoff=1, | |
|
82 | presigned_url_expires: int = 0) -> tuple[ShardFileReader, dict]: | |
|
83 | return self._fetch(key, retry, retry_attempts, retry_backoff, presigned_url_expires=presigned_url_expires) | |
|
84 | ||
|
85 | def remove(self, key): | |
|
86 | return self._remove(key) | |
|
87 | ||
|
88 | def random_filename(self): | |
|
89 | """Return filename and full-path tuple for file storage. | |
|
90 | ||
|
91 | Filename will be a randomly generated 28 character hexadecimal string | |
|
92 | with ".archive_cache" suffixed. Two levels of sub-directories will be used to | |
|
93 | reduce the size of directories. On older filesystems, lookups in | |
|
94 | directories with many files may be slow. | |
|
95 | """ | |
|
96 | ||
|
97 | hex_name = codecs.encode(os.urandom(16), 'hex').decode('utf-8') | |
|
98 | ||
|
99 | archive_name = hex_name[4:] + '.archive_cache' | |
|
100 | filename = f"{hex_name[:2]}-{hex_name[2:4]}-{archive_name}" | |
|
101 | ||
|
102 | full_path = os.path.join(self.bucket, filename) | |
|
103 | return archive_name, full_path | |
|
104 | ||
|
105 | def __repr__(self): | |
|
106 | return f'{self.__class__.__name__}(index={self._index}, bucket={self.bucket})' | |
|
107 | ||
|
108 | ||
|
109 | class ObjectStoreCache(BaseCache): | |
|
110 | shard_name: str = 'shard-{:03d}' | |
|
111 | shard_cls = S3Shard | |
|
112 | ||
|
113 | def __init__(self, locking_url, **settings): | |
|
114 | """ | |
|
115 | Initialize objectstore cache instance. | |
|
116 | ||
|
117 | :param str locking_url: redis url for a lock | |
|
118 | :param settings: settings dict | |
|
119 | ||
|
120 | """ | |
|
121 | self._locking_url = locking_url | |
|
122 | self._config = settings | |
|
123 | ||
|
124 | objectstore_url = self.get_conf('archive_cache.objectstore.url') | |
|
125 | self._storage_path = objectstore_url # common path for all from BaseCache | |
|
126 | ||
|
127 | self._shard_count = int(self.get_conf('archive_cache.objectstore.bucket_shards', pop=True)) | |
|
128 | if self._shard_count < 1: | |
|
129 | raise ValueError('cache_shards must be 1 or more') | |
|
130 | ||
|
131 | self._bucket = settings.pop('archive_cache.objectstore.bucket') | |
|
132 | if not self._bucket: | |
|
133 | raise ValueError('archive_cache.objectstore.bucket needs to have a value') | |
|
134 | ||
|
135 | self._eviction_policy = self.get_conf('archive_cache.objectstore.eviction_policy', pop=True) | |
|
136 | self._cache_size_limit = self.gb_to_bytes(int(self.get_conf('archive_cache.objectstore.cache_size_gb'))) | |
|
137 | ||
|
138 | self.retry = str2bool(self.get_conf('archive_cache.objectstore.retry', pop=True)) | |
|
139 | self.retry_attempts = int(self.get_conf('archive_cache.objectstore.retry_attempts', pop=True)) | |
|
140 | self.retry_backoff = int(self.get_conf('archive_cache.objectstore.retry_backoff', pop=True)) | |
|
141 | ||
|
142 | endpoint_url = settings.pop('archive_cache.objectstore.url') | |
|
143 | key = settings.pop('archive_cache.objectstore.key') | |
|
144 | secret = settings.pop('archive_cache.objectstore.secret') | |
|
145 | region = settings.pop('archive_cache.objectstore.region') | |
|
146 | ||
|
147 | log.debug('Initializing %s archival cache instance', self) | |
|
148 | ||
|
149 | fs = fsspec.filesystem( | |
|
150 | 's3', anon=False, endpoint_url=endpoint_url, key=key, secret=secret, client_kwargs={'region_name': region} | |
|
151 | ) | |
|
152 | ||
|
153 | # init main bucket | |
|
154 | if not fs.exists(self._bucket): | |
|
155 | fs.mkdir(self._bucket) | |
|
156 | ||
|
157 | self._shards = tuple( | |
|
158 | self.shard_cls( | |
|
159 | index=num, | |
|
160 | bucket=self._bucket, | |
|
161 | bucket_folder=self.shard_name.format(num), | |
|
162 | fs=fs, | |
|
163 | **settings, | |
|
164 | ) | |
|
165 | for num in range(self._shard_count) | |
|
166 | ) | |
|
167 | self._hash = self._shards[0].hash | |
|
168 | ||
|
169 | def _get_size(self, shard, archive_path): | |
|
170 | return shard.fs.info(archive_path)['size'] | |
|
171 | ||
|
172 | def set_presigned_url_expiry(self, val: int) -> None: | |
|
173 | self.presigned_url_expires = val |
@@ -0,0 +1,62 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import redis | |
|
20 | from .._vendor import redis_lock | |
|
21 | ||
|
22 | ||
|
23 | class ArchiveCacheGenerationLock(Exception): | |
|
24 | pass | |
|
25 | ||
|
26 | ||
|
27 | class GenerationLock: | |
|
28 | """ | |
|
29 | Locking mechanism that detects if a lock is acquired | |
|
30 | ||
|
31 | with GenerationLock(lock_key): | |
|
32 | compute_archive() | |
|
33 | """ | |
|
34 | lock_timeout = 7200 | |
|
35 | ||
|
36 | def __init__(self, lock_key, url): | |
|
37 | self.lock_key = lock_key | |
|
38 | self._create_client(url) | |
|
39 | self.lock = self.get_lock() | |
|
40 | ||
|
41 | def _create_client(self, url): | |
|
42 | connection_pool = redis.ConnectionPool.from_url(url) | |
|
43 | self.writer_client = redis.StrictRedis( | |
|
44 | connection_pool=connection_pool | |
|
45 | ) | |
|
46 | self.reader_client = self.writer_client | |
|
47 | ||
|
48 | def get_lock(self): | |
|
49 | return redis_lock.Lock( | |
|
50 | redis_client=self.writer_client, | |
|
51 | name=self.lock_key, | |
|
52 | expire=self.lock_timeout, | |
|
53 | strict=True | |
|
54 | ) | |
|
55 | ||
|
56 | def __enter__(self): | |
|
57 | acquired = self.lock.acquire(blocking=False) | |
|
58 | if not acquired: | |
|
59 | raise ArchiveCacheGenerationLock('Failed to create a lock') | |
|
60 | ||
|
61 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
62 | self.lock.release() |
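The `GenerationLock` above acquires with `blocking=False` and raises instead of waiting when another worker already holds the lock. That fail-fast pattern can be sketched with only the stdlib, no Redis server required; the names here (`NonBlockingLock`, `GenerationLockBusy`) are illustrative stand-ins, not part of the RhodeCode API:

```python
import threading


class GenerationLockBusy(Exception):
    pass


class NonBlockingLock:
    """Context manager that fails fast instead of waiting for the lock."""

    def __init__(self, lock):
        self._lock = lock

    def __enter__(self):
        # mirrors self.lock.acquire(blocking=False) in GenerationLock.__enter__
        if not self._lock.acquire(blocking=False):
            raise GenerationLockBusy('Failed to create a lock')
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._lock.release()


shared = threading.Lock()

busy = False
with NonBlockingLock(shared):
    held_inside = shared.locked()  # True while "archive generation" runs
    try:
        # a second acquirer does not block; it errors out immediately
        with NonBlockingLock(shared):
            pass
    except GenerationLockBusy:
        busy = True
held_after = shared.locked()       # False: released on exit
```

The caller is expected to catch the exception and either retry or report that generation is already in progress, rather than stacking up blocked workers.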
@@ -0,0 +1,134 b'' | |||
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import sqlite3 | |
|
20 | import s3fs.core | |
|
21 | ||
|
22 | NOT_GIVEN = -917 | |
|
23 | ||
|
24 | ||
|
25 | EVICTION_POLICY = { | |
|
26 | 'none': { | |
|
27 | 'evict': None, | |
|
28 | }, | |
|
29 | 'least-recently-stored': { | |
|
30 | 'evict': 'SELECT {fields} FROM archive_cache ORDER BY store_time', | |
|
31 | }, | |
|
32 | 'least-recently-used': { | |
|
33 | 'evict': 'SELECT {fields} FROM archive_cache ORDER BY access_time', | |
|
34 | }, | |
|
35 | 'least-frequently-used': { | |
|
36 | 'evict': 'SELECT {fields} FROM archive_cache ORDER BY access_count', | |
|
37 | }, | |
|
38 | } | |
|
39 | ||
|
40 | ||
|
41 | def archive_iterator(_reader, block_size: int = 4096 * 512): | |
|
42 | # 4096 * 512 = 2MB | |
|
43 | while 1: | |
|
44 | data = _reader.read(block_size) | |
|
45 | if not data: | |
|
46 | break | |
|
47 | yield data | |
|
48 | ||
|
49 | ||
|
50 | def format_size(size): | |
|
51 | # Convert size in bytes to a human-readable format (e.g., KB, MB, GB) | |
|
52 | for unit in ['B', 'KB', 'MB', 'GB', 'TB']: | |
|
53 | if size < 1024: | |
|
54 | return f"{size:.2f} {unit}" | |
|
55 | size /= 1024 | |
|
56 | return f"{size:.2f} PB"  # sizes >= 1024 TB exhaust the loop | |
|
57 | ||
|
58 | class StatsDB: | |
|
59 | ||
|
60 | def __init__(self): | |
|
61 | self.connection = sqlite3.connect(':memory:') | |
|
62 | self._init_db() | |
|
63 | ||
|
64 | def _init_db(self): | |
|
65 | qry = ''' | |
|
66 | CREATE TABLE IF NOT EXISTS archive_cache ( | |
|
67 | rowid INTEGER PRIMARY KEY, | |
|
68 | key_file TEXT, | |
|
69 | key_file_path TEXT, | |
|
70 | archive_key TEXT, | |
|
71 | archive_path TEXT, | |
|
72 | store_time REAL, | |
|
73 | access_time REAL, | |
|
74 | access_count INTEGER DEFAULT 0, | |
|
75 | size INTEGER DEFAULT 0 | |
|
76 | ) | |
|
77 | ''' | |
|
78 | ||
|
79 | self.sql(qry) | |
|
80 | self.connection.commit() | |
|
81 | ||
|
82 | @property | |
|
83 | def sql(self): | |
|
84 | return self.connection.execute | |
|
85 | ||
|
86 | def bulk_insert(self, rows): | |
|
87 | qry = ''' | |
|
88 | INSERT INTO archive_cache ( | |
|
89 | rowid, | |
|
90 | key_file, | |
|
91 | key_file_path, | |
|
92 | archive_key, | |
|
93 | archive_path, | |
|
94 | store_time, | |
|
95 | access_time, | |
|
96 | access_count, | |
|
97 | size | |
|
98 | ) | |
|
99 | VALUES ( | |
|
100 | ?, ?, ?, ?, ?, ?, ?, ?, ? | |
|
101 | ) | |
|
102 | ''' | |
|
103 | cursor = self.connection.cursor() | |
|
104 | cursor.executemany(qry, rows) | |
|
105 | self.connection.commit() | |
|
106 | ||
|
107 | def get_total_size(self): | |
|
108 | qry = 'SELECT COALESCE(SUM(size), 0) FROM archive_cache' | |
|
109 | ((total_size,),) = self.sql(qry).fetchall() | |
|
110 | return total_size | |
|
111 | ||
|
112 | def get_sorted_keys(self, select_policy): | |
|
113 | select_policy_qry = select_policy.format(fields='key_file, archive_key, size') | |
|
114 | return self.sql(select_policy_qry).fetchall() | |
|
115 | ||
|
116 | ||
|
117 | class ShardFileReader: | |
|
118 | ||
|
119 | def __init__(self, file_like_reader): | |
|
120 | self._file_like_reader = file_like_reader | |
|
121 | ||
|
122 | def __getattr__(self, item): | |
|
123 | if isinstance(self._file_like_reader, s3fs.core.S3File): | |
|
124 | match item: | |
|
125 | case 'name': | |
|
126 | # S3File doesn't expose a 'name' attribute, which callers rely on | |
|
127 | return self._file_like_reader.full_name | |
|
128 | case _: | |
|
129 | return getattr(self._file_like_reader, item) | |
|
130 | else: | |
|
131 | return getattr(self._file_like_reader, item) | |
|
132 | ||
|
133 | def __repr__(self): | |
|
134 | return f'<{self.__class__.__name__}={self._file_like_reader}>' |
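The eviction flow pairs the `EVICTION_POLICY` templates with `StatsDB.get_sorted_keys()`: the policy query gets its `{fields}` placeholder substituted and is run against the in-memory SQLite table, yielding cache entries in eviction order. A minimal self-contained sketch of that flow, with an abridged schema and made-up rows:

```python
import sqlite3

# in-memory table mirroring (a subset of) the archive_cache schema
conn = sqlite3.connect(':memory:')
conn.execute('''
    CREATE TABLE archive_cache (
        rowid INTEGER PRIMARY KEY,
        key_file TEXT,
        archive_key TEXT,
        store_time REAL,
        access_time REAL,
        access_count INTEGER DEFAULT 0,
        size INTEGER DEFAULT 0
    )
''')
rows = [
    (1, 'b.key', 'b', 200.0, 300.0, 5, 10),  # stored later
    (2, 'a.key', 'a', 100.0, 400.0, 1, 20),  # stored earlier
]
conn.executemany(
    'INSERT INTO archive_cache VALUES (?, ?, ?, ?, ?, ?, ?)', rows)

# 'least-recently-stored' policy template, fields substituted as in
# get_sorted_keys()
evict_qry = 'SELECT {fields} FROM archive_cache ORDER BY store_time'
oldest_first = conn.execute(
    evict_qry.format(fields='key_file, archive_key, size')).fetchall()
# 'a' has the smallest store_time, so it is the first eviction candidate
```

Eviction then walks this list, deleting entries until the cache is back under its size limit.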
@@ -0,0 +1,40 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | import os | |
|
19 | ||
|
20 | ||
|
21 | def get_config(ini_path, **kwargs): | |
|
22 | import configparser | |
|
23 | parser = configparser.ConfigParser(**kwargs) | |
|
24 | parser.read(ini_path) | |
|
25 | return parser | |
|
26 | ||
|
27 | ||
|
28 | def get_app_config_lightweight(ini_path): | |
|
29 | parser = get_config(ini_path) | |
|
30 | parser.set('app:main', 'here', os.getcwd()) | |
|
31 | parser.set('app:main', '__file__', ini_path) | |
|
32 | return dict(parser.items('app:main')) | |
|
33 | ||
|
34 | ||
|
35 | def get_app_config(ini_path): | |
|
36 | """ | |
|
37 | This loads the app context and provides a heavy type iniliaziation of config | |
|
38 | """ | |
|
39 | from paste.deploy.loadwsgi import appconfig | |
|
40 | return appconfig(f'config:{ini_path}', relative_to=os.getcwd()) |
@@ -0,0 +1,50 b'' | |||
|
1 | ||
|
2 | ||
|
3 | import logging | |
|
4 | from sqlalchemy import * | |
|
5 | from sqlalchemy.engine import reflection | |
|
6 | ||
|
7 | from alembic.migration import MigrationContext | |
|
8 | from alembic.operations import Operations | |
|
9 | ||
|
10 | from rhodecode.lib.dbmigrate.versions import _reset_base | |
|
11 | from rhodecode.model import meta, init_model_encryption | |
|
12 | ||
|
13 | ||
|
14 | log = logging.getLogger(__name__) | |
|
15 | ||
|
16 | ||
|
17 | def _get_indexes_list(migrate_engine, table_name): | |
|
18 | inspector = reflection.Inspector.from_engine(migrate_engine) | |
|
19 | return inspector.get_indexes(table_name) | |
|
20 | ||
|
21 | ||
|
22 | def upgrade(migrate_engine): | |
|
23 | """ | |
|
24 | Upgrade operations go here. | |
|
25 | Don't create your own engine; bind migrate_engine to your metadata | |
|
26 | """ | |
|
27 | from rhodecode.model import db as db_5_1_0_0 | |
|
28 | ||
|
29 | # issue fixups | |
|
30 | fixups(db_5_1_0_0, meta.Session) | |
|
31 | ||
|
32 | ||
|
33 | def downgrade(migrate_engine): | |
|
34 | pass | |
|
35 | ||
|
36 | ||
|
37 | def fixups(models, _SESSION): | |
|
38 | for db_repo in _SESSION.query(models.Repository).all(): | |
|
39 | ||
|
40 | config = db_repo._config | |
|
41 | config.set('extensions', 'largefiles', '') | |
|
42 | ||
|
43 | try: | |
|
44 | scm = db_repo.scm_instance(cache=False, config=config) | |
|
45 | if scm: | |
|
46 | print(f'installing hook for repo: {db_repo}') | |
|
47 | scm.install_hooks(force=True) | |
|
48 | except Exception as e: | |
|
49 | print(e) | |
|
50 | print('continue...') |
@@ -0,0 +1,17 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
@@ -0,0 +1,89 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import os | |
|
20 | import time | |
|
21 | import logging | |
|
22 | ||
|
23 | from rhodecode.lib.config_utils import get_config | |
|
24 | ||
|
25 | from rhodecode.lib.svn_txn_utils import get_txn_id_from_store | |
|
26 | ||
|
27 | log = logging.getLogger(__name__) | |
|
28 | ||
|
29 | ||
|
30 | class BaseHooksCallbackDaemon: | |
|
31 | """ | |
|
32 | Basic context manager for actions that don't require any extra setup or teardown | |
|
33 | """ | |
|
34 | def __init__(self): | |
|
35 | pass | |
|
36 | ||
|
37 | def __enter__(self): | |
|
38 | log.debug('Running `%s` callback daemon', self.__class__.__name__) | |
|
39 | return self | |
|
40 | ||
|
41 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
42 | log.debug('Exiting `%s` callback daemon', self.__class__.__name__) | |
|
43 | ||
|
44 | ||
|
45 | class HooksModuleCallbackDaemon(BaseHooksCallbackDaemon): | |
|
46 | ||
|
47 | def __init__(self, module): | |
|
48 | super().__init__() | |
|
49 | self.hooks_module = module | |
|
50 | ||
|
51 | def __repr__(self): | |
|
52 | return f'HooksModuleCallbackDaemon(hooks_module={self.hooks_module})' | |
|
53 | ||
|
54 | ||
|
55 | def prepare_callback_daemon(extras, protocol, host, txn_id=None): | |
|
56 | ||
|
57 | match protocol: | |
|
58 | case 'http': | |
|
59 | from rhodecode.lib.hook_daemon.http_hooks_deamon import HttpHooksCallbackDaemon | |
|
60 | port = 0 | |
|
61 | if txn_id: | |
|
62 | # read txn-id to re-use the PORT for callback daemon | |
|
63 | repo_path = os.path.join(extras['repo_store'], extras['repository']) | |
|
64 | txn_details = get_txn_id_from_store(repo_path, txn_id) | |
|
65 | port = txn_details.get('port', 0) | |
|
66 | ||
|
67 | callback_daemon = HttpHooksCallbackDaemon( | |
|
68 | txn_id=txn_id, host=host, port=port) | |
|
69 | case 'celery': | |
|
70 | from rhodecode.lib.hook_daemon.celery_hooks_deamon import CeleryHooksCallbackDaemon | |
|
71 | callback_daemon = CeleryHooksCallbackDaemon(get_config(extras['config'])) | |
|
72 | case 'local': | |
|
73 | from rhodecode.lib.hook_daemon.hook_module import Hooks | |
|
74 | callback_daemon = HooksModuleCallbackDaemon(Hooks.__module__) | |
|
75 | case _: | |
|
76 | log.error('Unsupported callback daemon protocol "%s"', protocol) | |
|
77 | raise Exception('Unsupported callback daemon protocol.') | |
|
78 | ||
|
79 | extras['hooks_uri'] = getattr(callback_daemon, 'hooks_uri', '') | |
|
80 | extras['task_queue'] = getattr(callback_daemon, 'task_queue', '') | |
|
81 | extras['task_backend'] = getattr(callback_daemon, 'task_backend', '') | |
|
82 | extras['hooks_protocol'] = protocol | |
|
83 | extras['time'] = time.time() | |
|
84 | ||
|
85 | # register txn_id | |
|
86 | extras['txn_id'] = txn_id | |
|
87 | log.debug('Prepared a callback daemon: %s', | |
|
88 | callback_daemon.__class__.__name__) | |
|
89 | return callback_daemon, extras |
@@ -0,0 +1,33 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | from rhodecode.lib.hook_daemon.base import BaseHooksCallbackDaemon | |
|
20 | ||
|
21 | ||
|
22 | class CeleryHooksCallbackDaemon(BaseHooksCallbackDaemon): | |
|
23 | """ | |
|
24 | Context manger for achieving a compatibility with celery backend | |
|
25 | """ | |
|
26 | ||
|
27 | def __init__(self, config): | |
|
28 | # TODO: replace this with settings bootstrapped... | |
|
29 | self.task_queue = config.get('app:main', 'celery.broker_url') | |
|
30 | self.task_backend = config.get('app:main', 'celery.result_backend') | |
|
31 | ||
|
32 | def __repr__(self): | |
|
33 | return f'CeleryHooksCallbackDaemon(task_queue={self.task_queue}, task_backend={self.task_backend})' |
@@ -0,0 +1,104 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import logging | |
|
20 | import traceback | |
|
21 | ||
|
22 | from rhodecode.model import meta | |
|
23 | ||
|
24 | from rhodecode.lib import hooks_base | |
|
25 | from rhodecode.lib.exceptions import HTTPLockedRC, HTTPBranchProtected | |
|
26 | from rhodecode.lib.utils2 import AttributeDict | |
|
27 | ||
|
28 | log = logging.getLogger(__name__) | |
|
29 | ||
|
30 | ||
|
31 | class Hooks(object): | |
|
32 | """ | |
|
33 | Exposes the hooks for remote callbacks | |
|
34 | """ | |
|
35 | def __init__(self, request=None, log_prefix=''): | |
|
36 | self.log_prefix = log_prefix | |
|
37 | self.request = request | |
|
38 | ||
|
39 | def repo_size(self, extras): | |
|
40 | log.debug("%sCalled repo_size of %s object", self.log_prefix, self) | |
|
41 | return self._call_hook(hooks_base.repo_size, extras) | |
|
42 | ||
|
43 | def pre_pull(self, extras): | |
|
44 | log.debug("%sCalled pre_pull of %s object", self.log_prefix, self) | |
|
45 | return self._call_hook(hooks_base.pre_pull, extras) | |
|
46 | ||
|
47 | def post_pull(self, extras): | |
|
48 | log.debug("%sCalled post_pull of %s object", self.log_prefix, self) | |
|
49 | return self._call_hook(hooks_base.post_pull, extras) | |
|
50 | ||
|
51 | def pre_push(self, extras): | |
|
52 | log.debug("%sCalled pre_push of %s object", self.log_prefix, self) | |
|
53 | return self._call_hook(hooks_base.pre_push, extras) | |
|
54 | ||
|
55 | def post_push(self, extras): | |
|
56 | log.debug("%sCalled post_push of %s object", self.log_prefix, self) | |
|
57 | return self._call_hook(hooks_base.post_push, extras) | |
|
58 | ||
|
59 | def _call_hook(self, hook, extras): | |
|
60 | extras = AttributeDict(extras) | |
|
61 | _server_url = extras['server_url'] | |
|
62 | ||
|
63 | extras.request = self.request | |
|
64 | ||
|
65 | try: | |
|
66 | result = hook(extras) | |
|
67 | if result is None: | |
|
68 | raise Exception(f'Failed to obtain hook result from func: {hook}') | |
|
69 | except HTTPBranchProtected as handled_error: | |
|
70 | # This special case doesn't need error reporting; it signals | |
|
71 | # a protected branch (locked repos are handled below) | |
|
72 | result = AttributeDict({ | |
|
73 | 'status': handled_error.code, | |
|
74 | 'output': handled_error.explanation | |
|
75 | }) | |
|
76 | except (HTTPLockedRC, Exception) as error: | |
|
77 | # locked needs different handling since we need to also | |
|
78 | # handle PULL operations | |
|
79 | exc_tb = '' | |
|
80 | if not isinstance(error, HTTPLockedRC): | |
|
81 | exc_tb = traceback.format_exc() | |
|
82 | log.exception('%sException when handling hook %s', self.log_prefix, hook) | |
|
83 | error_args = error.args | |
|
84 | return { | |
|
85 | 'status': 128, | |
|
86 | 'output': '', | |
|
87 | 'exception': type(error).__name__, | |
|
88 | 'exception_traceback': exc_tb, | |
|
89 | 'exception_args': error_args, | |
|
90 | } | |
|
91 | finally: | |
|
92 | meta.Session.remove() | |
|
93 | ||
|
94 | log.debug('%sGot hook call response %s', self.log_prefix, result) | |
|
95 | return { | |
|
96 | 'status': result.status, | |
|
97 | 'output': result.output, | |
|
98 | } | |
|
99 | ||
|
100 | def __enter__(self): | |
|
101 | return self | |
|
102 | ||
|
103 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
104 | pass |
@@ -0,0 +1,287 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import os | |
|
20 | import logging | |
|
21 | import traceback | |
|
22 | import threading | |
|
23 | import socket | |
|
24 | import msgpack | |
|
25 | import gevent | |
|
26 | ||
|
27 | from http.server import BaseHTTPRequestHandler | |
|
28 | from socketserver import TCPServer | |
|
29 | ||
|
30 | from rhodecode.model import meta | |
|
31 | from rhodecode.lib.ext_json import json | |
|
32 | from rhodecode.lib import rc_cache | |
|
33 | from rhodecode.lib.svn_txn_utils import get_txn_id_data_key | |
|
34 | from rhodecode.lib.hook_daemon.hook_module import Hooks | |
|
35 | ||
|
36 | log = logging.getLogger(__name__) | |
|
37 | ||
|
38 | ||
|
39 | class HooksHttpHandler(BaseHTTPRequestHandler): | |
|
40 | ||
|
41 | JSON_HOOKS_PROTO = 'json.v1' | |
|
42 | MSGPACK_HOOKS_PROTO = 'msgpack.v1' | |
|
43 | # starting with RhodeCode 5.0.0 MsgPack is the default, prior it used json | |
|
44 | DEFAULT_HOOKS_PROTO = MSGPACK_HOOKS_PROTO | |
|
45 | ||
|
46 | @classmethod | |
|
47 | def serialize_data(cls, data, proto=DEFAULT_HOOKS_PROTO): | |
|
48 | if proto == cls.MSGPACK_HOOKS_PROTO: | |
|
49 | return msgpack.packb(data) | |
|
50 | return json.dumps(data) | |
|
51 | ||
|
52 | @classmethod | |
|
53 | def deserialize_data(cls, data, proto=DEFAULT_HOOKS_PROTO): | |
|
54 | if proto == cls.MSGPACK_HOOKS_PROTO: | |
|
55 | return msgpack.unpackb(data) | |
|
56 | return json.loads(data) | |
|
57 | ||
|
58 | def do_POST(self): | |
|
59 | hooks_proto, method, extras = self._read_request() | |
|
60 | log.debug('Handling HooksHttpHandler %s with %s proto', method, hooks_proto) | |
|
61 | ||
|
62 | txn_id = getattr(self.server, 'txn_id', None) | |
|
63 | if txn_id: | |
|
64 | log.debug('Computing TXN_ID based on `%s`:`%s`', | |
|
65 | extras['repository'], extras['txn_id']) | |
|
66 | computed_txn_id = rc_cache.utils.compute_key_from_params( | |
|
67 | extras['repository'], extras['txn_id']) | |
|
68 | if txn_id != computed_txn_id: | |
|
69 | raise Exception( | |
|
70 | 'TXN ID fail: expected {} got {} instead'.format( | |
|
71 | txn_id, computed_txn_id)) | |
|
72 | ||
|
73 | request = getattr(self.server, 'request', None) | |
|
74 | try: | |
|
75 | hooks = Hooks(request=request, log_prefix='HOOKS: {} '.format(self.server.server_address)) | |
|
76 | result = self._call_hook_method(hooks, method, extras) | |
|
77 | ||
|
78 | except Exception as e: | |
|
79 | exc_tb = traceback.format_exc() | |
|
80 | result = { | |
|
81 | 'exception': e.__class__.__name__, | |
|
82 | 'exception_traceback': exc_tb, | |
|
83 | 'exception_args': e.args | |
|
84 | } | |
|
85 | self._write_response(hooks_proto, result) | |
|
86 | ||
|
87 | def _read_request(self): | |
|
88 | length = int(self.headers['Content-Length']) | |
|
89 | # respect sent headers, fallback to OLD proto for compatability | |
|
90 | hooks_proto = self.headers.get('rc-hooks-protocol') or self.JSON_HOOKS_PROTO | |
|
91 | if hooks_proto == self.MSGPACK_HOOKS_PROTO: | |
|
92 | # support for new vcsserver msgpack based protocol hooks | |
|
93 | body = self.rfile.read(length) | |
|
94 | data = self.deserialize_data(body) | |
|
95 | else: | |
|
96 | body = self.rfile.read(length) | |
|
97 | data = self.deserialize_data(body) | |
|
98 | ||
|
99 | return hooks_proto, data['method'], data['extras'] | |
|
100 | ||
|
101 | def _write_response(self, hooks_proto, result): | |
|
102 | self.send_response(200) | |
|
103 | if hooks_proto == self.MSGPACK_HOOKS_PROTO: | |
|
104 | self.send_header("Content-type", "application/msgpack") | |
|
105 | self.end_headers() | |
|
106 | data = self.serialize_data(result) | |
|
107 | self.wfile.write(data) | |
|
108 | else: | |
|
109 | self.send_header("Content-type", "text/json") | |
|
110 | self.end_headers() | |
|
111 | data = self.serialize_data(result) | |
|
112 | self.wfile.write(data) | |
|
113 | ||
|
114 | def _call_hook_method(self, hooks, method, extras): | |
|
115 | try: | |
|
116 | result = getattr(hooks, method)(extras) | |
|
117 | finally: | |
|
118 | meta.Session.remove() | |
|
119 | return result | |
|
120 | ||
|
121 | def log_message(self, format, *args): | |
|
122 | """ | |
|
123 | This is an overridden method of BaseHTTPRequestHandler which logs using | |
|
124 | a logging library instead of writing directly to stderr. | |
|
125 | """ | |
|
126 | ||
|
127 | message = format % args | |
|
128 | ||
|
129 | log.debug( | |
|
130 | "HOOKS: client=%s - - [%s] %s", self.client_address, | |
|
131 | self.log_date_time_string(), message) | |
|
132 | ||
|
133 | ||
|
134 | class ThreadedHookCallbackDaemon(object): | |
|
135 | ||
|
136 | _callback_thread = None | |
|
137 | _daemon = None | |
|
138 | _done = False | |
|
139 | use_gevent = False | |
|
140 | ||
|
141 | def __init__(self, txn_id=None, host=None, port=None): | |
|
142 | self._prepare(txn_id=txn_id, host=host, port=port) | |
|
143 | if self.use_gevent: | |
|
144 | self._run_func = self._run_gevent | |
|
145 | self._stop_func = self._stop_gevent | |
|
146 | else: | |
|
147 | self._run_func = self._run | |
|
148 | self._stop_func = self._stop | |
|
149 | ||
|
150 | def __enter__(self): | |
|
151 | log.debug('Running `%s` callback daemon', self.__class__.__name__) | |
|
152 | self._run_func() | |
|
153 | return self | |
|
154 | ||
|
155 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
156 | log.debug('Exiting `%s` callback daemon', self.__class__.__name__) | |
|
157 | self._stop_func() | |
|
158 | ||
|
159 | def _prepare(self, txn_id=None, host=None, port=None): | |
|
160 | raise NotImplementedError() | |
|
161 | ||
|
162 | def _run(self): | |
|
163 | raise NotImplementedError() | |
|
164 | ||
|
165 | def _stop(self): | |
|
166 | raise NotImplementedError() | |
|
167 | ||
|
168 | def _run_gevent(self): | |
|
169 | raise NotImplementedError() | |
|
170 | ||
|
171 | def _stop_gevent(self): | |
|
172 | raise NotImplementedError() | |
|
173 | ||
|
174 | ||
|
175 | class HttpHooksCallbackDaemon(ThreadedHookCallbackDaemon): | |
|
176 | """ | |
|
177 | Context manager which will run a callback daemon in a background thread. | |
|
178 | """ | |
|
179 | ||
|
180 | hooks_uri = None | |
|
181 | ||
|
182 | # From Python docs: Polling reduces our responsiveness to a shutdown | |
|
183 | # request and wastes cpu at all other times. | |
|
184 | POLL_INTERVAL = 0.01 | |
|
185 | ||
|
186 | use_gevent = False | |
|
187 | ||
|
188 | def __repr__(self): | |
|
189 | return f'HttpHooksCallbackDaemon(hooks_uri={self.hooks_uri})' | |
|
190 | ||
|
191 | @property | |
|
192 | def _hook_prefix(self): | |
|
193 | return f'HOOKS: {self.hooks_uri} ' | |
|
194 | ||
|
195 | def get_hostname(self): | |
|
196 | return socket.gethostname() or '127.0.0.1' | |
|
197 | ||
|
198 | def get_available_port(self, min_port=20000, max_port=65535): | |
|
199 | from rhodecode.lib.utils2 import get_available_port as _get_port | |
|
200 | return _get_port(min_port, max_port) | |
|
201 | ||
|
202 | def _prepare(self, txn_id=None, host=None, port=None): | |
|
203 | from pyramid.threadlocal import get_current_request | |
|
204 | ||
|
205 | if not host or host == "*": | |
|
206 | host = self.get_hostname() | |
|
207 | if not port: | |
|
208 | port = self.get_available_port() | |
|
209 | ||
|
210 | server_address = (host, port) | |
|
211 | self.hooks_uri = f'{host}:{port}' | |
|
212 | self.txn_id = txn_id | |
|
213 | self._done = False | |
|
214 | ||
|
215 | log.debug( | |
|
216 | "%s Preparing HTTP callback daemon registering hook object: %s", | |
|
217 | self._hook_prefix, HooksHttpHandler) | |
|
218 | ||
|
219 | self._daemon = TCPServer(server_address, HooksHttpHandler) | |
|
220 | # inject transaction_id for later verification | |
|
221 | self._daemon.txn_id = self.txn_id | |
|
222 | ||
|
223 | # pass the WEB app request into daemon | |
|
224 | self._daemon.request = get_current_request() | |
|
225 | ||
|
226 | def _run(self): | |
|
227 | log.debug("Running thread-based loop of callback daemon in background") | |
|
228 | callback_thread = threading.Thread( | |
|
229 | target=self._daemon.serve_forever, | |
|
230 | kwargs={'poll_interval': self.POLL_INTERVAL}) | |
|
231 | callback_thread.daemon = True | |
|
232 | callback_thread.start() | |
|
233 | self._callback_thread = callback_thread | |
|
234 | ||
|
235 | def _run_gevent(self): | |
|
236 | log.debug("Running gevent-based loop of callback daemon in background") | |
|
237 | # create a new greenlet for the daemon's serve_forever method | |
|
238 | callback_greenlet = gevent.spawn( | |
|
239 | self._daemon.serve_forever, | |
|
240 | poll_interval=self.POLL_INTERVAL) | |
|
241 | ||
|
242 | # store reference to greenlet | |
|
243 | self._callback_greenlet = callback_greenlet | |
|
244 | ||
|
245 | # switch to this greenlet | |
|
246 | gevent.sleep(0.01) | |
|
247 | ||
|
248 | def _stop(self): | |
|
249 | log.debug("Waiting for background thread to finish.") | |
|
250 | self._daemon.shutdown() | |
|
251 | self._callback_thread.join() | |
|
252 | self._daemon = None | |
|
253 | self._callback_thread = None | |
|
254 | if self.txn_id: | |
|
255 | #TODO: figure out the repo_path... | |
|
256 | repo_path = '' | |
|
257 | txn_id_file = get_txn_id_data_key(repo_path, self.txn_id) | |
|
258 | log.debug('Cleaning up TXN ID %s', txn_id_file) | |
|
259 | if os.path.isfile(txn_id_file): | |
|
260 | os.remove(txn_id_file) | |
|
261 | ||
|
262 | log.debug("Background thread done.") | |
|
263 | ||
|
264 | def _stop_gevent(self): | |
|
265 | log.debug("Waiting for background greenlet to finish.") | |
|
266 | ||
|
267 | # if greenlet exists and is running | |
|
268 | if self._callback_greenlet and not self._callback_greenlet.dead: | |
|
269 | # shutdown daemon if it exists | |
|
270 | if self._daemon: | |
|
271 | self._daemon.shutdown() | |
|
272 | ||
|
273 | # kill the greenlet | |
|
274 | self._callback_greenlet.kill() | |
|
275 | ||
|
276 | self._daemon = None | |
|
277 | self._callback_greenlet = None | |
|
278 | ||
|
279 | if self.txn_id: | |
|
280 | #TODO: figure out the repo_path... | |
|
281 | repo_path = '' | |
|
282 | txn_id_file = get_txn_id_data_key(repo_path, self.txn_id) | |
|
283 | log.debug('Cleaning up TXN ID %s', txn_id_file) | |
|
284 | if os.path.isfile(txn_id_file): | |
|
285 | os.remove(txn_id_file) | |
|
286 | ||
|
287 | log.debug("Background greenlet done.") |
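The `_run`/`_stop` pair above follows the standard pattern for hosting a blocking `serve_forever()` loop: spawn it on a daemon thread, then call `shutdown()` and `join()` to stop it cleanly. A minimal stdlib-only sketch of the same pattern (the `HTTPServer` here is a stand-in for the callback daemon, not the project's actual class):

```python
import http.server
import threading

# Stand-in for the callback daemon: any server exposing
# serve_forever() / shutdown() works the same way.
server = http.server.HTTPServer(
    ('127.0.0.1', 0), http.server.BaseHTTPRequestHandler)

# As in _run(): serve_forever() blocks, so it runs on a daemon thread
# that checks the shutdown flag every poll_interval seconds.
thread = threading.Thread(
    target=server.serve_forever, kwargs={'poll_interval': 0.1})
thread.daemon = True
thread.start()

# As in _stop(): shutdown() makes serve_forever() return, and join()
# waits for the background thread to actually finish.
server.shutdown()
thread.join()
server.server_close()
```

The `poll_interval` kwarg mirrors `POLL_INTERVAL` in the code above: it bounds how long `shutdown()` can take to be noticed by the serving loop.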
@@ -0,0 +1,132 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import logging | |
|
20 | import redis | |
|
21 | ||
|
22 | from ..lib import rc_cache | |
|
23 | from ..lib.ext_json import json | |
|
24 | ||
|
25 | ||
|
26 | log = logging.getLogger(__name__) | |
|
27 | ||
|
28 | redis_client = None | |
|
29 | ||
|
30 | ||
|
31 | class RedisTxnClient: | |
|
32 | ||
|
33 | def __init__(self, url): | |
|
34 | self.url = url | |
|
35 | self._create_client(url) | |
|
36 | ||
|
37 | def _create_client(self, url): | |
|
38 | connection_pool = redis.ConnectionPool.from_url(url) | |
|
39 | self.writer_client = redis.StrictRedis( | |
|
40 | connection_pool=connection_pool | |
|
41 | ) | |
|
42 | self.reader_client = self.writer_client | |
|
43 | ||
|
44 | def set(self, key, value, expire=24 * 60000): | |
|
45 | self.writer_client.set(key, value, ex=expire) | |
|
46 | ||
|
47 | def get(self, key): | |
|
48 | return self.reader_client.get(key) | |
|
49 | ||
|
50 | def delete(self, key): | |
|
51 | self.writer_client.delete(key) | |
|
52 | ||
|
53 | ||
|
54 | def get_redis_client(url=''): | |
|
55 | ||
|
56 | global redis_client | |
|
57 | if redis_client is not None: | |
|
58 | return redis_client | |
|
59 | if not url: | |
|
60 | from rhodecode import CONFIG | |
|
61 | url = CONFIG['vcs.svn.redis_conn'] | |
|
62 | redis_client = RedisTxnClient(url) | |
|
63 | return redis_client | |
|
64 | ||
|
65 | ||
|
66 | def extract_svn_txn_id(data: bytes): | |
|
67 | """ | |
|
68 | Extract the svn txn_id from XML data submitted during | |
|
69 | POST operations. | |
|
70 | """ | |
|
71 | import re | |
|
72 | from lxml import etree | |
|
73 | ||
|
74 | try: | |
|
75 | root = etree.fromstring(data) | |
|
76 | pat = re.compile(r'/txn/(?P<txn_id>.*)') | |
|
77 | for el in root: | |
|
78 | if el.tag == '{DAV:}source': | |
|
79 | for sub_el in el: | |
|
80 | if sub_el.tag == '{DAV:}href': | |
|
81 | match = pat.search(sub_el.text) | |
|
82 | if match: | |
|
83 | svn_tx_id = match.groupdict()['txn_id'] | |
|
84 | return svn_tx_id | |
|
85 | except Exception: | |
|
86 | log.exception('Failed to extract txn_id') | |
|
87 | ||
|
88 | ||
|
89 | def get_txn_id_data_key(repo_path, svn_txn_id): | |
|
90 | log.debug('svn-txn-id: %s, obtaining data path', svn_txn_id) | |
|
91 | repo_key = rc_cache.utils.compute_key_from_params(repo_path) | |
|
92 | final_key = f'{repo_key}.{svn_txn_id}.svn_txn_id' | |
|
93 | log.debug('computed final key: %s', final_key) | |
|
94 | ||
|
95 | return final_key | |
|
96 | ||
|
97 | ||
|
98 | def store_txn_id_data(repo_path, svn_txn_id, data_dict): | |
|
99 | log.debug('svn-txn-id: %s, storing data', svn_txn_id) | |
|
100 | ||
|
101 | if not svn_txn_id: | |
|
102 | log.warning('Cannot store txn_id because it is empty') | |
|
103 | return | |
|
104 | ||
|
105 | redis_conn = get_redis_client() | |
|
106 | ||
|
107 | store_key = get_txn_id_data_key(repo_path, svn_txn_id) | |
|
108 | store_data = json.dumps(data_dict) | |
|
109 | redis_conn.set(store_key, store_data) | |
|
110 | ||
|
111 | ||
|
112 | def get_txn_id_from_store(repo_path, svn_txn_id, rm_on_read=False): | |
|
113 | """ | |
|
114 | Read the txn_id from the store and, if present, return the data for the callback manager | |
|
115 | """ | |
|
116 | log.debug('svn-txn-id: %s, retrieving data', svn_txn_id) | |
|
117 | redis_conn = get_redis_client() | |
|
118 | ||
|
119 | store_key = get_txn_id_data_key(repo_path, svn_txn_id) | |
|
120 | data = {} | |
|
121 | redis_conn.get(store_key) | |
|
122 | try: | |
|
123 | raw_data = redis_conn.get(store_key) | |
|
124 | data = json.loads(raw_data) | |
|
125 | except Exception: | |
|
126 | log.exception('Failed to get txn_id metadata') | |
|
127 | ||
|
128 | if rm_on_read: | |
|
129 | log.debug('Cleaning up txn_id at %s', store_key) | |
|
130 | redis_conn.delete(store_key) | |
|
131 | ||
|
132 | return data |
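`extract_svn_txn_id` above walks the DAV XML body of an SVN commit request looking for a `/txn/<id>` href. A stdlib-only re-implementation sketch of the same traversal (using `xml.etree` instead of `lxml`; the sample request body is invented for illustration):

```python
import re
import xml.etree.ElementTree as ET

def extract_txn_id(data: bytes):
    """Return the txn_id from a DAV:source/DAV:href element, or None."""
    root = ET.fromstring(data)
    pat = re.compile(r'/txn/(?P<txn_id>.*)')
    for el in root:
        if el.tag == '{DAV:}source':
            for sub_el in el:
                if sub_el.tag == '{DAV:}href':
                    match = pat.search(sub_el.text or '')
                    if match:
                        return match.groupdict()['txn_id']
    return None

# hypothetical MERGE request body, matching the structure the code expects
sample = b'''<?xml version="1.0" encoding="utf-8"?>
<D:merge xmlns:D="DAV:">
  <D:source><D:href>/repo/!svn/txn/120-5f</D:href></D:source>
</D:merge>'''
```

Note the `{DAV:}` prefixes: both `lxml` and `xml.etree` report namespaced tags in Clark notation, which is why the code compares against `'{DAV:}source'` rather than `'D:source'`.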
@@ -0,0 +1,134 b'' | |||
|
1 | <%namespace name="base" file="/base/base.mako"/> | |
|
2 | ||
|
3 | <div class="panel panel-default"> | |
|
4 | <div class="panel-heading"> | |
|
5 | <h3 class="panel-title">${_('Enable/Disable 2FA for your account')}</h3> | |
|
6 | </div> | |
|
7 | ${h.secure_form(h.route_path('my_account_configure_2fa_update'), request=request)} | |
|
8 | <div class="panel-body"> | |
|
9 | <div class="form"> | |
|
10 | <div class="fields"> | |
|
11 | <div class="field"> | |
|
12 | <div class="label"> | |
|
13 | <label>${_('2FA status')}:</label> | |
|
14 | </div> | |
|
15 | <div class="checkboxes"> | |
|
16 | % if c.locked_2fa: | |
|
17 | <span class="help-block">${_('2FA settings cannot be changed here, because 2FA was force-enabled by the RhodeCode Administrator.')}</span> | |
|
18 | ||
|
19 | % else: | |
|
20 | <div class="form-check"> | |
|
21 | <input type="radio" id="2faEnabled" name="2fa_status" value="1" ${'checked=1' if c.state_of_2fa else ''}/> | |
|
22 | <label for="2faEnabled">${_('Enable 2FA')}</label> | |
|
23 | ||
|
24 | <input type="radio" id="2faDisabled" name="2fa_status" value="0" ${'checked=1' if not c.state_of_2fa else ''} /> | |
|
25 | <label for="2faDisabled">${_('Disable 2FA')}</label> | |
|
26 | </div> | |
|
27 | % endif | |
|
28 | ||
|
29 | </div> | |
|
30 | </div> | |
|
31 | </div> | |
|
32 | <button id="saveBtn" class="btn btn-primary" ${'disabled' if c.locked_2fa else ''}>${_('Save')}</button> | |
|
33 | </div> | |
|
34 | </div> | |
|
35 | ${h.end_form()} | |
|
36 | </div> | |
|
37 | ||
|
38 | % if c.state_of_2fa: | |
|
39 | ||
|
40 | ||
|
41 | % if not c.user_seen_2fa_recovery_codes: | |
|
42 | ||
|
43 | <div class="panel panel-warning"> | |
|
44 | <div class="panel-heading" id="advanced-archive"> | |
|
45 | <h3 class="panel-title">${_('2FA Recovery codes')} <a class="permalink" href="#advanced-archive"> ¶</a></h3> | |
|
46 | </div> | |
|
47 | <div class="panel-body"> | |
|
48 | <p> | |
|
49 | ${_('You have not seen your 2FA recovery codes yet.')} | |
|
50 | ${_('Please save them in a safe place, or you may lose access to your account if you lose access to your authenticator app.')} | |
|
51 | </p> | |
|
52 | <br/> | |
|
53 | <a href="${request.route_path('my_account_configure_2fa', _query={'show-recovery-codes': 1})}" class="btn btn-primary">${_('Show recovery codes')}</a> | |
|
54 | </div> | |
|
55 | </div> | |
|
56 | % endif | |
|
57 | ||
|
58 | ||
|
59 | ${h.secure_form(h.route_path('my_account_regenerate_2fa_recovery_codes'), request=request)} | |
|
60 | <div class="panel panel-default"> | |
|
61 | <div class="panel-heading"> | |
|
62 | <h3 class="panel-title">${_('Regenerate 2FA recovery codes for your account')}</h3> | |
|
63 | </div> | |
|
64 | <div class="panel-body"> | |
|
65 | <form id="2faForm"> | |
|
66 | <input type="text" name="totp" placeholder="${_('Verify the code from the app')}" pattern="\d{6}" style="width: 20%"> | |
|
67 | <button type="submit" class="btn btn-primary">${_('Verify and generate new codes')}</button> | |
|
68 | </form> | |
|
69 | </div> | |
|
70 | ||
|
71 | </div> | |
|
72 | ${h.end_form()} | |
|
73 | % endif | |
|
74 | ||
|
75 | ||
|
76 | <script> | |
|
77 | ||
|
78 | function showRecoveryCodesPopup() { | |
|
79 | ||
|
80 | SwalNoAnimation.fire({ | |
|
81 | title: _gettext('2FA recovery codes'), | |
|
82 | html: '<span>Should you ever lose your phone or access to your one-time password secret, each of these recovery codes can be used once to regain access to your account. Please save them in a safe place, or you will lose access to your account.</span>', | |
|
83 | showCancelButton: false, | |
|
84 | showConfirmButton: true, | |
|
85 | showLoaderOnConfirm: true, | |
|
86 | confirmButtonText: _gettext('Show now'), | |
|
87 | allowOutsideClick: function () { | |
|
88 | return !Swal.isLoading() | |
|
89 | }, | |
|
90 | ||
|
91 | preConfirm: function () { | |
|
92 | ||
|
93 | var postData = { | |
|
94 | 'csrf_token': CSRF_TOKEN | |
|
95 | }; | |
|
96 | return new Promise(function (resolve, reject) { | |
|
97 | $.ajax({ | |
|
98 | type: 'POST', | |
|
99 | data: postData, | |
|
100 | url: pyroutes.url('my_account_show_2fa_recovery_codes'), | |
|
101 | headers: {'X-PARTIAL-XHR': true} | |
|
102 | }) | |
|
103 | .done(function (data) { | |
|
104 | resolve(data); | |
|
105 | }) | |
|
106 | .fail(function (jqXHR, textStatus, errorThrown) { | |
|
107 | var message = formatErrorMessage(jqXHR, textStatus, errorThrown); | |
|
108 | ajaxErrorSwal(message); | |
|
109 | }); | |
|
110 | }) | |
|
111 | } | |
|
112 | ||
|
113 | }) | |
|
114 | .then(function (result) { | |
|
115 | if (result.value) { | |
|
116 | let funcData = {'recoveryCodes': result.value.recovery_codes} | |
|
117 | let recoveryCodesHtml = renderTemplate('recoveryCodes', funcData); | |
|
118 | SwalNoAnimation.fire({ | |
|
119 | allowOutsideClick: false, | |
|
120 | confirmButtonText: _gettext('I Copied the codes'), | |
|
121 | title: _gettext('2FA Recovery Codes'), | |
|
122 | html: recoveryCodesHtml | |
|
123 | }).then(function (result) { | |
|
124 | if (result.isConfirmed) { | |
|
125 | window.location.reload() | |
|
126 | } | |
|
127 | }) | |
|
128 | } | |
|
129 | }) | |
|
130 | } | |
|
131 | % if request.GET.get('show-recovery-codes') == '1' and not c.user_seen_2fa_recovery_codes: | |
|
132 | showRecoveryCodesPopup(); | |
|
133 | % endif | |
|
134 | </script> |
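The recovery codes handed out by the popup above are, conceptually, a batch of independent single-use random tokens. A sketch of how such codes are commonly generated (illustrative only; `generate_recovery_codes` is a hypothetical helper, not RhodeCode's implementation):

```python
import secrets

def generate_recovery_codes(count=10, nbytes=8):
    # secrets provides cryptographically strong randomness; each code
    # should be marked as consumed after a single successful use.
    return [secrets.token_hex(nbytes) for _ in range(count)]

codes = generate_recovery_codes()
```

With `nbytes=8` each code is 16 hex characters, which keeps the codes short enough to write down while still being infeasible to guess.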
@@ -0,0 +1,90 b'' | |||
|
1 | <%inherit file="base/root.mako"/> | |
|
2 | ||
|
3 | <%def name="title()"> | |
|
4 | ${_('Setup 2FA')} | |
|
5 | %if c.rhodecode_name: | |
|
6 | · ${h.branding(c.rhodecode_name)} | |
|
7 | %endif | |
|
8 | </%def> | |
|
9 | ||
|
10 | <style>body{background-color:#eeeeee;}</style> | |
|
11 | ||
|
12 | <div class="loginbox" style="width: 600px"> | |
|
13 | ||
|
14 | <div class="header-account"> | |
|
15 | <div id="header-inner" class="title"> | |
|
16 | <div id="logo"> | |
|
17 | % if c.rhodecode_name: | |
|
18 | <div class="branding"> | |
|
19 | <a href="${h.route_path('home')}">${h.branding(c.rhodecode_name)}</a> | |
|
20 | </div> | |
|
21 | % endif | |
|
22 | </div> | |
|
23 | </div> | |
|
24 | </div> | |
|
25 | ||
|
26 | <div class="loginwrapper"> | |
|
27 | <rhodecode-toast id="notifications"></rhodecode-toast> | |
|
28 | ||
|
29 | <div class="sign-in-title"> | |
|
30 | <h1>${_('Set up the authenticator app')} - ${_('scan the QR code')}</h1> | |
|
31 | </div> | |
|
32 | <div class="inner form"> | |
|
33 | ${h.secure_form(h.route_path('setup_2fa'), request=request, id='totp_form')} | |
|
34 | <strong>${_('Use an authenticator app to scan.')}</strong><br/> | |
|
35 | ||
|
36 | ## QR CODE | |
|
37 | <code>${_('Account')}: ${totp_name}</code><br/> | |
|
38 | <div class="qr-code-container"> | |
|
39 | <img alt="qr-code" src="data:image/png;base64, ${qr}"/> | |
|
40 | </div> | |
|
41 | ||
|
42 | <div id="alternativeCode" style="margin: -10px 0 5px 0">${_('Unable to scan?')} <a id="toggleLink">${_('Click here')}</a></div> | |
|
43 | ||
|
44 | ## Secret alternative code | |
|
45 | <div id="secretDiv" style="display: none"> | |
|
46 | ||
|
47 | <div style="padding: 10px 0"> | |
|
48 | <strong style="padding: 4px 0">${_('Copy and use this code to manually set up an authenticator app')}</strong> | |
|
49 | <code>${key}</code><i class="tooltip icon-clipboard clipboard-action" data-clipboard-text="${key}" title="${_('Copy the secret key')}" ></i><br/> | |
|
50 | <code>${_('type')}: time-based</code> | |
|
51 | </div> | |
|
52 | ||
|
53 | </div> | |
|
54 | ||
|
55 | <label for="totp">${_('Verify the code from the app')}:</label> | |
|
56 | ${h.text('totp', class_='form-control', )} | |
|
57 | <div id="formErrors"> | |
|
58 | % if 'totp' in errors: | |
|
59 | <span class="error-message">${errors.get('totp')}</span> | |
|
60 | <br /> | |
|
61 | % endif | |
|
62 | % if 'secret_totp' in errors: | |
|
63 | <span class="error-message">SECRET:${errors.get('secret_totp')}</span> | |
|
64 | <br /> | |
|
65 | % endif | |
|
66 | </div> | |
|
67 | ${h.hidden('secret_totp', key)} | |
|
68 | ${h.submit('verify_2fa',_('Verify'), class_="btn sign-in")} | |
|
69 | ||
|
70 | ${h.end_form()} | |
|
71 | </div> | |
|
72 | ||
|
73 | </div> | |
|
74 | ||
|
75 | </div> | |
|
76 | ||
|
77 | <script type="text/javascript"> | |
|
78 | ||
|
79 | $(document).ready(function() { | |
|
80 | ||
|
81 | $( "#toggleLink" ).on("click", function() { | |
|
82 | $( "#secretDiv" ).toggle(); | |
|
83 | $( "#alternativeCode").hide(); | |
|
84 | $('#totp').focus(); | |
|
85 | }); | |
|
86 | ||
|
87 | $('#totp').focus(); | |
|
88 | }) | |
|
89 | ||
|
90 | </script> |
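The QR code rendered by the template above encodes a standard `otpauth://` provisioning URI, which is what authenticator apps parse when scanning. A sketch of how such a URI is built (the `issuer` value and helper name are illustrative assumptions, not taken from the project's code):

```python
from urllib.parse import quote

def provisioning_uri(secret_b32, account, issuer='RhodeCode'):
    # otpauth key URI format consumed by TOTP authenticator apps;
    # the secret is the same base32 key shown in the manual-setup fallback.
    return (f'otpauth://totp/{quote(issuer)}:{quote(account)}'
            f'?secret={secret_b32}&issuer={quote(issuer)}')

uri = provisioning_uri('JBSWY3DPEHPK3PXP', 'admin@example.com')
```

Feeding this URI to a QR encoder produces the image embedded in the page; the `<code>${key}</code>` fallback shown above exists for users who cannot scan and must type the secret in by hand.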
@@ -0,0 +1,54 b'' | |||
|
1 | <%inherit file="base/root.mako"/> | |
|
2 | ||
|
3 | <%def name="title()"> | |
|
4 | ${_('Verify 2FA')} | |
|
5 | %if c.rhodecode_name: | |
|
6 | · ${h.branding(c.rhodecode_name)} | |
|
7 | %endif | |
|
8 | </%def> | |
|
9 | <style>body{background-color:#eeeeee;}</style> | |
|
10 | ||
|
11 | <div class="loginbox" style="width: 600px"> | |
|
12 | <div class="header-account"> | |
|
13 | <div id="header-inner" class="title"> | |
|
14 | <div id="logo"> | |
|
15 | % if c.rhodecode_name: | |
|
16 | <div class="branding"> | |
|
17 | <a href="${h.route_path('home')}">${h.branding(c.rhodecode_name)}</a> | |
|
18 | </div> | |
|
19 | % endif | |
|
20 | </div> | |
|
21 | </div> | |
|
22 | </div> | |
|
23 | ||
|
24 | <div class="loginwrapper"> | |
|
25 | <rhodecode-toast id="notifications"></rhodecode-toast> | |
|
26 | ||
|
27 | <div id="register"> | |
|
28 | <div class="sign-in-title"> | |
|
29 | <h1>${_('Verify the code from the app')}</h1> | |
|
30 | </div> | |
|
31 | <div class="inner form"> | |
|
32 | ${h.secure_form(h.route_path('check_2fa'), request=request, id='totp_form')} | |
|
33 | <label for="totp">${_('Verification code')}:</label> | |
|
34 | ${h.text('totp', class_="form-control")} | |
|
35 | %if 'totp' in errors: | |
|
36 | <span class="error-message">${errors.get('totp')}</span> | |
|
37 | <br /> | |
|
38 | %endif | |
|
39 | <p class="help-block">${_('Enter the code from your two-factor authenticator app. If you\'ve lost your device, you can enter one of your recovery codes.')}</p> | |
|
40 | ||
|
41 | ${h.submit('send', _('Verify'), class_="btn sign-in")} | |
|
42 | <p class="help-block pull-right"> | |
|
43 | RhodeCode ${c.rhodecode_edition} | |
|
44 | </p> | |
|
45 | ${h.end_form()} | |
|
46 | </div> | |
|
47 | </div> | |
|
48 | ||
|
49 | </div> | |
|
50 | </div> | |
|
51 | ||
|
52 | ||
|
53 | ||
|
54 |
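The "verify the code from the app" form above checks a time-based one-time password. The check itself is standardized (TOTP, RFC 6238), so it can be sketched with only the standard library; this is the textbook algorithm, not RhodeCode's actual implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, period=30):
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    now = time.time() if for_time is None else for_time
    counter = int(now // period)                  # 30-second time step
    msg = struct.pack('>Q', counter)              # counter as big-endian uint64
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack('>I', digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890"
rfc_secret = base64.b32encode(b'12345678901234567890').decode()
```

A verifier typically also accepts the codes for the adjacent time steps to tolerate clock drift between server and phone.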
@@ -0,0 +1,105 b'' | |||
|
1 | # Copyright (C) 2016-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import time | |
|
20 | import pytest | |
|
21 | import rhodecode | |
|
22 | import os | |
|
23 | import shutil | |
|
24 | from tempfile import mkdtemp | |
|
25 | ||
|
26 | from rhodecode.lib import archive_cache | |
|
27 | ||
|
28 | ||
|
29 | def file_reader(temp_store): | |
|
30 | with open(temp_store, 'w') as f: | |
|
31 | for cnt in range(10000): | |
|
32 | f.write(str(cnt)) | |
|
33 | return open(temp_store, 'rb') | |
|
34 | ||
|
35 | ||
|
36 | @pytest.fixture() | |
|
37 | def d_cache_instance(ini_settings): | |
|
38 | config = ini_settings | |
|
39 | d_cache = archive_cache.get_archival_cache_store(config=config, always_init=True) | |
|
40 | return d_cache | |
|
41 | ||
|
42 | ||
|
43 | @pytest.mark.usefixtures('app') | |
|
44 | class TestArchiveCaches(object): | |
|
45 | ||
|
46 | def test_archivecache_empty_stats(self, d_cache_instance): | |
|
47 | d_cache = d_cache_instance | |
|
48 | shutil.rmtree(d_cache._directory) | |
|
49 | ||
|
50 | stats = d_cache.get_statistics() | |
|
51 | assert (0, 0, {}) == stats | |
|
52 | ||
|
53 | def test_archivecache_store_keys(self, d_cache_instance, tmp_path): | |
|
54 | d_cache = d_cache_instance | |
|
55 | shutil.rmtree(d_cache._directory) | |
|
56 | ||
|
57 | for n in range(100): | |
|
58 | ||
|
59 | archive_name = f'my-archive-abc-{n}.zip' | |
|
60 | temp_archive_path = os.path.join(tmp_path, archive_name) | |
|
61 | d_cache.store(archive_name, file_reader(temp_archive_path), {'foo': 'bar'}) | |
|
62 | reader, meta = d_cache.fetch(archive_name) | |
|
63 | content = reader.read() | |
|
64 | assert content == open(temp_archive_path, 'rb').read() | |
|
65 | ||
|
66 | stats = d_cache.get_statistics() | |
|
67 | assert (100, 3889000, {}) == stats | |
|
68 | ||
|
69 | def test_archivecache_remove_keys(self, d_cache_instance, tmp_path): | |
|
70 | d_cache = d_cache_instance | |
|
71 | shutil.rmtree(d_cache._directory) | |
|
72 | ||
|
73 | n = 1 | |
|
74 | archive_name = f'my-archive-abc-{n}.zip' | |
|
75 | temp_archive_path = os.path.join(tmp_path, archive_name) | |
|
76 | ||
|
77 | d_cache.store(archive_name, file_reader(temp_archive_path), {'foo': 'bar'}) | |
|
78 | stats = d_cache.get_statistics() | |
|
79 | assert (1, 38890, {}) == stats | |
|
80 | ||
|
81 | assert 1 == d_cache.remove(archive_name) | |
|
82 | ||
|
83 | stats = d_cache.get_statistics() | |
|
84 | assert (0, 0, {}) == stats | |
|
85 | ||
|
86 | def test_archivecache_evict_keys(self, d_cache_instance, tmp_path): | |
|
87 | d_cache = d_cache_instance | |
|
88 | shutil.rmtree(d_cache._directory) | |
|
89 | tries = 500 | |
|
90 | for n in range(tries): | |
|
91 | ||
|
92 | archive_name = f'my-archive-abc-{n}.zip' | |
|
93 | temp_archive_path = os.path.join(tmp_path, archive_name) | |
|
94 | d_cache.store(archive_name, file_reader(temp_archive_path), {'foo': 'bar'}) | |
|
95 | ||
|
96 | stats = d_cache.get_statistics() | |
|
97 | assert (tries, 19445000, {}) == stats | |
|
98 | evict_to = 0.005 # ~5 MB, expressed in GB | |
|
99 | evicted_items = d_cache.evict(size_limit=d_cache.gb_to_bytes(evict_to)) | |
|
100 | evicted = 361 | |
|
101 | assert {'removed_items': evicted, 'removed_size': 14039290} == evicted_items | |
|
102 | ||
|
103 | stats = d_cache.get_statistics() | |
|
104 | assert (tries - evicted, 5405710, {}) == stats | |
|
105 |
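`test_archivecache_evict_keys` above exercises size-based eviction: drop the oldest entries until the cache fits under a byte limit. The core loop can be sketched like this (a hypothetical in-memory model of the behavior the test asserts, not the cache's real on-disk implementation):

```python
def evict(entries, size_limit):
    """entries: dict name -> (size, mtime); evict oldest-first until under limit."""
    removed_items = removed_size = 0
    total = sum(size for size, _ in entries.values())
    # sorted() snapshots the items, so deleting from entries inside the loop is safe
    for name, (size, _mtime) in sorted(entries.items(), key=lambda kv: kv[1][1]):
        if total <= size_limit:
            break
        del entries[name]
        total -= size
        removed_items += 1
        removed_size += size
    return {'removed_items': removed_items, 'removed_size': removed_size}
```

This matches the shape of the assertion in the test: `evict` returns `removed_items`/`removed_size` stats, and the surviving entries account for the remaining total size.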
@@ -0,0 +1,226 b'' | |||
|
1 | ||
|
2 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
3 | # | |
|
4 | # This program is free software: you can redistribute it and/or modify | |
|
5 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
6 | # (only), as published by the Free Software Foundation. | |
|
7 | # | |
|
8 | # This program is distributed in the hope that it will be useful, | |
|
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
11 | # GNU General Public License for more details. | |
|
12 | # | |
|
13 | # You should have received a copy of the GNU Affero General Public License | |
|
14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
15 | # | |
|
16 | # This program is dual-licensed. If you wish to learn more about the | |
|
17 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
18 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
19 | ||
|
20 | """ | |
|
21 | Test suite for making push/pull operations, on specially modified INI files | |
|
22 | ||
|
23 | .. important:: | |
|
24 | ||
|
25 | You must have git >= 1.8.5 for tests to work fine. With 68b939b git started | |
|
26 | to redirect things to stderr instead of stdout. | |
|
27 | """ | |
|
28 | ||
|
29 | ||
|
30 | import time | |
|
31 | ||
|
32 | import pytest | |
|
33 | ||
|
34 | from rhodecode.lib import rc_cache | |
|
35 | from rhodecode.model.db import Repository, UserIpMap, CacheKey | |
|
36 | from rhodecode.model.meta import Session | |
|
37 | from rhodecode.model.repo import RepoModel | |
|
38 | from rhodecode.model.user import UserModel | |
|
39 | from rhodecode.tests import (GIT_REPO, HG_REPO, TEST_USER_ADMIN_LOGIN) | |
|
40 | ||
|
41 | from rhodecode.tests.vcs_operations import ( | |
|
42 | Command, _check_proper_clone, _add_files_and_push, HG_REPO_WITH_GROUP) | |
|
43 | ||
|
44 | ||
|
45 | @pytest.mark.usefixtures("disable_locking", "disable_anonymous_user") | |
|
46 | class TestVCSOperations(object): | |
|
47 | ||
|
48 | def test_clone_hg_repo_by_admin(self, rc_web_server, tmpdir): | |
|
49 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
50 | stdout, stderr = Command('/tmp').execute( | |
|
51 | 'hg clone', clone_url, tmpdir.strpath) | |
|
52 | _check_proper_clone(stdout, stderr, 'hg') | |
|
53 | ||
|
54 | def test_clone_hg_repo_by_admin_pull_protocol(self, rc_web_server, tmpdir): | |
|
55 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
56 | stdout, stderr = Command('/tmp').execute( | |
|
57 | 'hg clone --pull', clone_url, tmpdir.strpath) | |
|
58 | _check_proper_clone(stdout, stderr, 'hg') | |
|
59 | ||
|
60 | def test_clone_hg_repo_by_admin_pull_stream_protocol(self, rc_web_server, tmpdir): | |
|
61 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
62 | stdout, stderr = Command('/tmp').execute( | |
|
63 | 'hg clone --pull --stream', clone_url, tmpdir.strpath) | |
|
64 | assert 'files to transfer,' in stdout | |
|
65 | assert 'transferred 1.' in stdout | |
|
66 | assert '114 files updated,' in stdout | |
|
67 | ||
|
68 | def test_clone_hg_repo_by_id_by_admin(self, rc_web_server, tmpdir): | |
|
69 | repo_id = Repository.get_by_repo_name(HG_REPO).repo_id | |
|
70 | clone_url = rc_web_server.repo_clone_url('_%s' % repo_id) | |
|
71 | stdout, stderr = Command('/tmp').execute( | |
|
72 | 'hg clone', clone_url, tmpdir.strpath) | |
|
73 | _check_proper_clone(stdout, stderr, 'hg') | |
|
74 | ||
|
75 | def test_clone_hg_repo_with_group_by_admin(self, rc_web_server, tmpdir): | |
|
76 | clone_url = rc_web_server.repo_clone_url(HG_REPO_WITH_GROUP) | |
|
77 | stdout, stderr = Command('/tmp').execute( | |
|
78 | 'hg clone', clone_url, tmpdir.strpath) | |
|
79 | _check_proper_clone(stdout, stderr, 'hg') | |
|
80 | ||
|
81 | def test_clone_wrong_credentials_hg(self, rc_web_server, tmpdir): | |
|
82 | clone_url = rc_web_server.repo_clone_url(HG_REPO, passwd='bad!') | |
|
83 | stdout, stderr = Command('/tmp').execute( | |
|
84 | 'hg clone', clone_url, tmpdir.strpath) | |
|
85 | assert 'abort: authorization failed' in stderr | |
|
86 | ||
|
87 | def test_clone_git_dir_as_hg(self, rc_web_server, tmpdir): | |
|
88 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) | |
|
89 | stdout, stderr = Command('/tmp').execute( | |
|
90 | 'hg clone', clone_url, tmpdir.strpath) | |
|
91 | assert 'HTTP Error 404: Not Found' in stderr | |
|
92 | ||
|
93 | def test_clone_non_existing_path_hg(self, rc_web_server, tmpdir): | |
|
94 | clone_url = rc_web_server.repo_clone_url('trololo') | |
|
95 | stdout, stderr = Command('/tmp').execute( | |
|
96 | 'hg clone', clone_url, tmpdir.strpath) | |
|
97 | assert 'HTTP Error 404: Not Found' in stderr | |
|
98 | ||
|
99 | def test_clone_hg_with_slashes(self, rc_web_server, tmpdir): | |
|
100 | clone_url = rc_web_server.repo_clone_url('//' + HG_REPO) | |
|
101 | stdout, stderr = Command('/tmp').execute('hg clone', clone_url, tmpdir.strpath) | |
|
102 | assert 'HTTP Error 404: Not Found' in stderr | |
|
103 | ||
|
104 | def test_clone_existing_path_hg_not_in_database( | |
|
105 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
106 | ||
|
107 | db_name = fs_repo_only('not-in-db-hg', repo_type='hg') | |
|
108 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
109 | stdout, stderr = Command('/tmp').execute( | |
|
110 | 'hg clone', clone_url, tmpdir.strpath) | |
|
111 | assert 'HTTP Error 404: Not Found' in stderr | |
|
112 | ||
|
113 | def test_clone_existing_path_hg_not_in_database_different_scm( | |
|
114 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
115 | db_name = fs_repo_only('not-in-db-git', repo_type='git') | |
|
116 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
117 | stdout, stderr = Command('/tmp').execute( | |
|
118 | 'hg clone', clone_url, tmpdir.strpath) | |
|
119 | assert 'HTTP Error 404: Not Found' in stderr | |
|
120 | ||
|
121 | def test_clone_non_existing_store_path_hg(self, rc_web_server, tmpdir, user_util): | |
|
122 | repo = user_util.create_repo() | |
|
123 | clone_url = rc_web_server.repo_clone_url(repo.repo_name) | |
|
124 | ||
|
125 | # Damage the repo by removing its folder | |
|
126 | RepoModel()._delete_filesystem_repo(repo) | |
|
127 | ||
|
128 | stdout, stderr = Command('/tmp').execute( | |
|
129 | 'hg clone', clone_url, tmpdir.strpath) | |
|
130 | assert 'HTTP Error 404: Not Found' in stderr | |
|
131 | ||
|
132 | def test_push_new_file_hg(self, rc_web_server, tmpdir): | |
|
133 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
134 | stdout, stderr = Command('/tmp').execute( | |
|
135 | 'hg clone', clone_url, tmpdir.strpath) | |
|
136 | ||
|
137 | stdout, stderr = _add_files_and_push( | |
|
138 | 'hg', tmpdir.strpath, clone_url=clone_url) | |
|
139 | ||
|
140 | assert 'pushing to' in stdout | |
|
141 | assert 'size summary' in stdout | |
|
142 | ||
|
143 | def test_push_invalidates_cache(self, rc_web_server, tmpdir): | |
|
144 | hg_repo = Repository.get_by_repo_name(HG_REPO) | |
|
145 | ||
|
146 | # init cache objects | |
|
147 | CacheKey.delete_all_cache() | |
|
148 | ||
|
149 | repo_namespace_key = CacheKey.REPO_INVALIDATION_NAMESPACE.format(repo_id=hg_repo.repo_id) | |
|
150 | ||
|
151 | inv_context_manager = rc_cache.InvalidationContext(key=repo_namespace_key) | |
|
152 | ||
|
153 | with inv_context_manager as invalidation_context: | |
|
154 | # __enter__ will create and register cache objects | |
|
155 | pass | |
|
156 | ||
|
157 | cache_keys = hg_repo.cache_keys | |
|
158 | assert cache_keys != [] | |
|
159 | old_ids = [x.cache_state_uid for x in cache_keys] | |
|
160 | ||
|
161 | # clone to init cache | |
|
162 | clone_url = rc_web_server.repo_clone_url(hg_repo.repo_name) | |
|
163 | stdout, stderr = Command('/tmp').execute( | |
|
164 | 'hg clone', clone_url, tmpdir.strpath) | |
|
165 | ||
|
166 | cache_keys = hg_repo.cache_keys | |
|
167 | assert cache_keys != [] | |
|
168 | for key in cache_keys: | |
|
169 | assert key.cache_active is True | |
|
170 | ||
|
171 | # PUSH that should trigger invalidation cache | |
|
172 | stdout, stderr = _add_files_and_push( | |
|
173 | 'hg', tmpdir.strpath, clone_url=clone_url, files_no=1) | |
|
174 | ||
|
175 | # flush... | |
|
176 | Session().commit() | |
|
177 | hg_repo = Repository.get_by_repo_name(HG_REPO) | |
|
178 | cache_keys = hg_repo.cache_keys | |
|
179 | assert cache_keys != [] | |
|
180 | new_ids = [x.cache_state_uid for x in cache_keys] | |
|
181 | assert new_ids != old_ids | |
|
182 | ||
|
183 | def test_push_wrong_credentials_hg(self, rc_web_server, tmpdir): | |
|
184 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
185 | stdout, stderr = Command('/tmp').execute( | |
|
186 | 'hg clone', clone_url, tmpdir.strpath) | |
|
187 | ||
|
188 | push_url = rc_web_server.repo_clone_url( | |
|
189 | HG_REPO, user='bad', passwd='name') | |
|
190 | stdout, stderr = _add_files_and_push( | |
|
191 | 'hg', tmpdir.strpath, clone_url=push_url) | |
|
192 | ||
|
193 | assert 'abort: authorization failed' in stderr | |
|
194 | ||
|
195 | def test_push_back_to_wrong_url_hg(self, rc_web_server, tmpdir): | |
|
196 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
197 | stdout, stderr = Command('/tmp').execute( | |
|
198 | 'hg clone', clone_url, tmpdir.strpath) | |
|
199 | ||
|
200 | stdout, stderr = _add_files_and_push( | |
|
201 | 'hg', tmpdir.strpath, | |
|
202 | clone_url=rc_web_server.repo_clone_url('not-existing')) | |
|
203 | ||
|
204 | assert 'HTTP Error 404: Not Found' in stderr | |
|
205 | ||
|
206 | def test_ip_restriction_hg(self, rc_web_server, tmpdir): | |
|
207 | user_model = UserModel() | |
|
208 | try: | |
|
209 | user_model.add_extra_ip(TEST_USER_ADMIN_LOGIN, '10.10.10.10/32') | |
|
210 | Session().commit() | |
|
211 | time.sleep(2) | |
|
212 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
213 | stdout, stderr = Command('/tmp').execute( | |
|
214 | 'hg clone', clone_url, tmpdir.strpath) | |
|
215 | assert 'abort: HTTP Error 403: Forbidden' in stderr | |
|
216 | finally: | |
|
217 | # release IP restrictions | |
|
218 | for ip in UserIpMap.getAll(): | |
|
219 | UserIpMap.delete(ip.ip_id) | |
|
220 | Session().commit() | |
|
221 | ||
|
222 | time.sleep(2) | |
|
223 | ||
|
224 | stdout, stderr = Command('/tmp').execute( | |
|
225 | 'hg clone', clone_url, tmpdir.strpath) | |
|
226 | _check_proper_clone(stdout, stderr, 'hg') |
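The tests above lean on a small `Command` helper that runs a shell command in a working directory and captures its output. A plausible reconstruction of its behavior (hypothetical; the project's actual helper may differ in details):

```python
import subprocess

class Command:
    def __init__(self, cwd):
        self.cwd = cwd
        self.process = None

    def execute(self, cmd, *args):
        # join fragments the way the call sites do:
        #   Command('/tmp').execute('hg clone', clone_url, target_dir)
        command = ' '.join(str(a) for a in (cmd,) + args)
        self.process = subprocess.Popen(
            command, shell=True, cwd=self.cwd,
            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = self.process.communicate()
        return stdout.decode(), stderr.decode()

    def assert_returncode_success(self):
        assert self.process.returncode == 0
```

Capturing stderr separately matters here: as the module docstring notes, modern git (and hg in many cases) reports progress on stderr, which is why the tests assert against `stderr` for failures and `stdout`/`stderr` for clone progress.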
@@ -0,0 +1,224 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | """ | |
|
20 | Test suite for making push/pull operations, on specially modified INI files | |
|
21 | ||
|
22 | .. important:: | |
|
23 | ||
|
24 | You must have git >= 1.8.5 for tests to work fine. With 68b939b git started | |
|
25 | to redirect things to stderr instead of stdout. | |
|
26 | """ | |
|
27 | ||
|
28 | ||
|
29 | import time | |
|
30 | import pytest | |
|
31 | ||
|
32 | from rhodecode.model.db import Repository, UserIpMap | |
|
33 | from rhodecode.model.meta import Session | |
|
34 | from rhodecode.model.repo import RepoModel | |
|
35 | from rhodecode.model.user import UserModel | |
|
36 | from rhodecode.tests import (SVN_REPO, TEST_USER_ADMIN_LOGIN) | |
|
37 | ||
|
38 | ||
|
39 | from rhodecode.tests.vcs_operations import ( | |
|
40 | Command, _check_proper_clone, _check_proper_svn_push, | |
|
41 | _add_files_and_push, SVN_REPO_WITH_GROUP) | |
|
42 | ||
|
43 | ||
|
44 | def get_cli_flags(username, password): | |
|
45 | flags = '--no-auth-cache --non-interactive' | |
|
46 | auth = '' | |
|
47 | if username and password: | |
|
48 | auth = f'--username {username} --password {password}' | |
|
49 | return flags, auth | |
|
50 | ||
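For reference, the flag-building helper above can be exercised on its own; the credentials below are placeholders, not real test fixtures:

```python
def get_cli_flags(username, password):
    # same logic as the helper above: always non-interactive,
    # auth flags only when both credentials are present
    flags = '--no-auth-cache --non-interactive'
    auth = ''
    if username and password:
        auth = f'--username {username} --password {password}'
    return flags, auth

# placeholder credentials, as used when composing an svn checkout command
flags, auth = get_cli_flags('admin', 'secret')
cmd = f'svn checkout {flags} {auth}'
assert '--non-interactive' in cmd
assert '--username admin' in cmd

# without credentials only the base flags remain
flags, auth = get_cli_flags('', '')
assert auth == ''
```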
|
51 | ||
|
52 | @pytest.mark.usefixtures("disable_locking", "disable_anonymous_user") | |
|
53 | class TestVCSOperations(object): | |
|
54 | ||
|
55 | def test_clone_svn_repo_by_admin(self, rc_web_server, tmpdir): | |
|
56 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
57 | username, password = rc_web_server.repo_clone_credentials() | |
|
58 | ||
|
59 | cmd = Command('/tmp') | |
|
60 | ||
|
61 | flags, auth = get_cli_flags(username, password) | |
|
62 | ||
|
63 | stdout, stderr = Command('/tmp').execute( | |
|
64 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
65 | ||
|
66 | _check_proper_clone(stdout, stderr, 'svn') | |
|
67 | cmd.assert_returncode_success() | |
|
68 | ||
|
69 | def test_clone_svn_repo_by_id_by_admin(self, rc_web_server, tmpdir): | |
|
70 | repo_id = Repository.get_by_repo_name(SVN_REPO).repo_id | |
|
71 | username, password = rc_web_server.repo_clone_credentials() | |
|
72 | ||
|
73 | clone_url = rc_web_server.repo_clone_url('_%s' % repo_id) | |
|
74 | cmd = Command('/tmp') | |
|
75 | ||
|
76 | flags, auth = get_cli_flags(username, password) | |
|
77 | ||
|
78 | stdout, stderr = Command('/tmp').execute( | |
|
79 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
80 | ||
|
81 | _check_proper_clone(stdout, stderr, 'svn') | |
|
82 | cmd.assert_returncode_success() | |
|
83 | ||
|
84 | def test_clone_svn_repo_with_group_by_admin(self, rc_web_server, tmpdir): | |
|
85 | clone_url = rc_web_server.repo_clone_url(SVN_REPO_WITH_GROUP) | |
|
86 | username, password = rc_web_server.repo_clone_credentials() | |
|
87 | ||
|
88 | flags, auth = get_cli_flags(username, password) | |
|
89 | ||
|
90 | cmd = Command('/tmp') | |
|
91 | stdout, stderr = cmd.execute( | |
|
91 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
92 | ||
|
93 | _check_proper_clone(stdout, stderr, 'svn') | |
|
94 | cmd.assert_returncode_success() | |
|
95 | ||
|
96 | def test_clone_wrong_credentials_svn(self, rc_web_server, tmpdir): | |
|
97 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
98 | username, password = rc_web_server.repo_clone_credentials() | |
|
99 | password = 'bad-password' | |
|
100 | ||
|
101 | flags, auth = get_cli_flags(username, password) | |
|
102 | ||
|
103 | stdout, stderr = Command('/tmp').execute( | |
|
104 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
105 | assert 'fatal: Authentication failed' in stderr | |
|
106 | ||
|
107 | def test_clone_svn_with_slashes(self, rc_web_server, tmpdir): | |
|
108 | clone_url = rc_web_server.repo_clone_url('//' + SVN_REPO) | |
|
109 | username, password = '', '' | |
|
110 | flags, auth = get_cli_flags(username, password) | |
|
111 | ||
|
112 | stdout, stderr = Command('/tmp').execute( | |
|
113 | f'svn checkout {flags} {auth}', clone_url) | |
|
114 | ||
|
115 | assert 'not found' in stderr | |
|
116 | ||
|
117 | def test_clone_existing_path_svn_not_in_database( | |
|
118 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
119 | db_name = fs_repo_only('not-in-db-git', repo_type='git') | |
|
120 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
121 | username, password = '', '' | |
|
122 | flags, auth = get_cli_flags(username, password) | |
|
123 | ||
|
124 | stdout, stderr = Command('/tmp').execute( | |
|
125 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
126 | assert 'not found' in stderr | |
|
127 | ||
|
128 | def test_clone_existing_path_svn_not_in_database_different_scm( | |
|
129 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
130 | db_name = fs_repo_only('not-in-db-hg', repo_type='hg') | |
|
131 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
132 | ||
|
133 | username, password = '', '' | |
|
134 | flags, auth = get_cli_flags(username, password) | |
|
135 | ||
|
136 | stdout, stderr = Command('/tmp').execute( | |
|
137 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
138 | assert 'not found' in stderr | |
|
139 | ||
|
140 | def test_clone_non_existing_store_path_svn(self, rc_web_server, tmpdir, user_util): | |
|
141 | repo = user_util.create_repo(repo_type='git') | |
|
142 | clone_url = rc_web_server.repo_clone_url(repo.repo_name) | |
|
143 | ||
|
144 | # Damage the repo by removing its folder | |
|
145 | RepoModel()._delete_filesystem_repo(repo) | |
|
146 | ||
|
147 | username, password = '', '' | |
|
148 | flags, auth = get_cli_flags(username, password) | |
|
149 | ||
|
150 | stdout, stderr = Command('/tmp').execute( | |
|
151 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
152 | assert 'not found' in stderr | |
|
153 | ||
|
154 | def test_push_new_file_svn(self, rc_web_server, tmpdir): | |
|
155 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
156 | username, password = '', '' | |
|
157 | flags, auth = get_cli_flags(username, password) | |
|
158 | ||
|
159 | stdout, stderr = Command('/tmp').execute( | |
|
160 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
161 | ||
|
162 | # commit some stuff into this repo | |
|
163 | stdout, stderr = _add_files_and_push( | |
|
164 | 'svn', tmpdir.strpath, clone_url=clone_url, username=username, password=password) | |
|
165 | ||
|
166 | _check_proper_svn_push(stdout, stderr) | |
|
167 | ||
|
168 | def test_push_wrong_credentials_svn(self, rc_web_server, tmpdir): | |
|
169 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
170 | ||
|
171 | username, password = rc_web_server.repo_clone_credentials() | |
|
172 | flags, auth = get_cli_flags(username, password) | |
|
173 | ||
|
174 | stdout, stderr = Command('/tmp').execute( | |
|
175 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
176 | ||
|
177 | push_url = rc_web_server.repo_clone_url( | |
|
178 | SVN_REPO, user='bad', passwd='name') | |
|
179 | stdout, stderr = _add_files_and_push( | |
|
180 | 'svn', tmpdir.strpath, clone_url=push_url, username=username, password=password) | |
|
181 | ||
|
182 | assert 'fatal: Authentication failed' in stderr | |
|
183 | ||
|
184 | def test_push_back_to_wrong_url_svn(self, rc_web_server, tmpdir): | |
|
185 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
186 | username, password = '', '' | |
|
187 | flags, auth = get_cli_flags(username, password) | |
|
188 | ||
|
189 | stdout, stderr = Command('/tmp').execute( | |
|
190 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
191 | ||
|
192 | stdout, stderr = _add_files_and_push( | |
|
193 | 'svn', tmpdir.strpath, | |
|
194 | clone_url=rc_web_server.repo_clone_url('not-existing'), username=username, password=password) | |
|
195 | ||
|
196 | assert 'not found' in stderr | |
|
197 | ||
|
198 | def test_ip_restriction_svn(self, rc_web_server, tmpdir): | |
|
199 | user_model = UserModel() | |
|
200 | username, password = '', '' | |
|
201 | flags, auth = get_cli_flags(username, password) | |
|
202 | ||
|
203 | try: | |
|
204 | user_model.add_extra_ip(TEST_USER_ADMIN_LOGIN, '10.10.10.10/32') | |
|
205 | Session().commit() | |
|
206 | time.sleep(2) | |
|
207 | clone_url = rc_web_server.repo_clone_url(SVN_REPO) | |
|
208 | ||
|
209 | stdout, stderr = Command('/tmp').execute( | |
|
210 | f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
211 | msg = "The requested URL returned error: 403" | |
|
212 | assert msg in stderr | |
|
213 | finally: | |
|
214 | # release IP restrictions | |
|
215 | for ip in UserIpMap.getAll(): | |
|
216 | UserIpMap.delete(ip.ip_id) | |
|
217 | Session().commit() | |
|
218 | ||
|
219 | time.sleep(2) | |
|
220 | ||
|
221 | cmd = Command('/tmp') | |
|
222 | stdout, stderr = cmd.execute(f'svn checkout {flags} {auth}', clone_url, tmpdir.strpath) | |
|
223 | cmd.assert_returncode_success() | |
|
224 | _check_proper_clone(stdout, stderr, 'svn') |
@@ -1,5 +1,5 b'' | |||
|
1 | 1 | [bumpversion] |
|
2 |
current_version = 5. |
|
|
2 | current_version = 5.1.0 | |
|
3 | 3 | message = release: Bump version {current_version} to {new_version} |
|
4 | 4 | |
|
5 | 5 | [bumpversion:file:rhodecode/VERSION] |
@@ -37,31 +37,6 b' test-only:' | |||
|
37 | 37 | --cov=rhodecode rhodecode |
|
38 | 38 | |
|
39 | 39 | |
|
40 | .PHONY: test-only-mysql | |
|
41 | ## run tests against mysql | |
|
42 | test-only-mysql: | |
|
43 | PYTHONHASHSEED=random \ | |
|
44 | py.test -x -vv -r xw -p no:sugar \ | |
|
45 | --cov-report=term-missing --cov-report=html \ | |
|
46 | --ini-config-override='{"app:main": {"sqlalchemy.db1.url": "mysql://root:qweqwe@localhost/rhodecode_test?charset=utf8"}}' \ | |
|
47 | --cov=rhodecode rhodecode | |
|
48 | ||
|
49 | ||
|
50 | .PHONY: test-only-postgres | |
|
51 | ## run tests against postgres | |
|
52 | test-only-postgres: | |
|
53 | PYTHONHASHSEED=random \ | |
|
54 | py.test -x -vv -r xw -p no:sugar \ | |
|
55 | --cov-report=term-missing --cov-report=html \ | |
|
56 | --ini-config-override='{"app:main": {"sqlalchemy.db1.url": "postgresql://postgres:qweqwe@localhost/rhodecode_test"}}' \ | |
|
57 | --cov=rhodecode rhodecode | |
|
58 | ||
|
59 | .PHONY: ruff-check | |
|
60 | ## run a ruff analysis | |
|
61 | ruff-check: | |
|
62 | ruff check --ignore F401 --ignore I001 --ignore E402 --ignore E501 --ignore F841 --exclude rhodecode/lib/dbmigrate --exclude .eggs --exclude .dev . | |
|
63 | ||
|
64 | ||
|
65 | 40 | .PHONY: docs |
|
66 | 41 | ## build docs |
|
67 | 42 | docs: |
@@ -88,6 +63,10 b' web-build:' | |||
|
88 | 63 | ./rhodecode/tests/scripts/static-file-check.sh rhodecode/public/ |
|
89 | 64 | rm -rf node_modules |
|
90 | 65 | |
|
66 | .PHONY: ruff-check | |
|
67 | ## run a ruff analysis | |
|
68 | ruff-check: | |
|
69 | ruff check --ignore F401 --ignore I001 --ignore E402 --ignore E501 --ignore F841 --exclude rhodecode/lib/dbmigrate --exclude .eggs --exclude .dev . | |
|
91 | 70 | |
|
92 | 71 | .PHONY: pip-packages |
|
93 | 72 | ## Show outdated packages |
@@ -109,8 +88,9 b' dev-sh:' | |||
|
109 | 88 | sudo apt-get install -y zsh carapace-bin |
|
110 | 89 | rm -rf /home/rhodecode/.oh-my-zsh |
|
111 | 90 | curl https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh | sh |
|
112 | echo "source <(carapace _carapace)" > /home/rhodecode/.zsrc | |
|
113 | PROMPT='%(?.%F{green}√.%F{red}?%?)%f %B%F{240}%1~%f%b %# ' zsh | |
|
91 | @echo "source <(carapace _carapace)" > /home/rhodecode/.zsrc | |
|
92 | @echo "${RC_DEV_CMD_HELP}" | |
|
93 | @PROMPT='%(?.%F{green}√.%F{red}?%?)%f %B%F{240}%1~%f%b %# ' zsh | |
|
114 | 94 | |
|
115 | 95 | |
|
116 | 96 | .PHONY: dev-cleanup |
@@ -122,7 +102,9 b' dev-cleanup:' | |||
|
122 | 102 | |
|
123 | 103 | .PHONY: dev-env |
|
124 | 104 | ## make dev-env based on the requirements files and install develop of packages |
|
105 | ## Cleanup: pip freeze | grep -v "^-e" | grep -v "@" | xargs pip uninstall -y | |
|
125 | 106 | dev-env: |
|
107 | sudo -u root chown rhodecode:rhodecode /home/rhodecode/.cache/pip/ | |
|
126 | 108 |
|
|
127 | 109 | pushd ../rhodecode-vcsserver/ && make dev-env && popd |
|
128 | 110 | pip wheel --wheel-dir=/home/rhodecode/.cache/pip/wheels -r requirements.txt -r requirements_rc_tools.txt -r requirements_test.txt -r requirements_debug.txt |
@@ -137,16 +119,13 b' sh:' | |||
|
137 | 119 | make dev-sh |
|
138 | 120 | |
|
139 | 121 | |
|
140 | .PHONY: dev-srv | |
|
141 | ## run develop server instance, docker exec -it $(docker ps -q --filter 'name=dev-enterprise-ce') /bin/bash | |
|
142 | dev-srv: | |
|
143 | pserve --reload .dev/dev.ini | |
|
122 | ## Allows changing the number of workers, e.g. make dev-srv workers=2 | |
|
123 | workers?=1 | |
|
144 | 124 | |
|
145 | ||
|
146 | .PHONY: dev-srv-g | |
|
147 | ## run gunicorn multi process workers | |
|
148 | dev-srv-g: | |
|
149 | gunicorn --paste .dev/dev.ini --bind=0.0.0.0:10020 --config=.dev/gunicorn_config.py --timeout=120 --reload | |
|
125 | .PHONY: dev-srv | |
|
126 | ## run gunicorn web server with reloader, use workers=N to set multiworker mode | |
|
127 | dev-srv: | |
|
128 | gunicorn --paste=.dev/dev.ini --bind=0.0.0.0:10020 --config=.dev/gunicorn_config.py --timeout=120 --reload --workers=$(workers) | |
|
150 | 129 | |
|
151 | 130 | |
|
152 | 131 | # Default command on calling make |
@@ -31,32 +31,15 b' debug = true' | |||
|
31 | 31 | host = 127.0.0.1 |
|
32 | 32 | port = 10020 |
|
33 | 33 | |
|
34 | ; ################################################## | |
|
35 | ; WAITRESS WSGI SERVER - Recommended for Development | |
|
36 | ; ################################################## | |
|
37 | ||
|
38 | ; use server type | |
|
39 | use = egg:waitress#main | |
|
40 | ||
|
41 | ; number of worker threads | |
|
42 | threads = 5 | |
|
43 | ||
|
44 | ; MAX BODY SIZE 100GB | |
|
45 | max_request_body_size = 107374182400 | |
|
46 | ||
|
47 | ; Use poll instead of select, fixes file descriptors limits problems. | |
|
48 | ; May not work on old windows systems. | |
|
49 | asyncore_use_poll = true | |
|
50 | ||
|
51 | 34 | |
|
52 | 35 | ; ########################### |
|
53 | 36 | ; GUNICORN APPLICATION SERVER |
|
54 | 37 | ; ########################### |
|
55 | 38 | |
|
56 |
; run with gunicorn |
|
|
39 | ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini | |
|
57 | 40 | |
|
58 | 41 | ; Module to use, this setting shouldn't be changed |
|
59 |
|
|
|
42 | use = egg:gunicorn#main | |
|
60 | 43 | |
|
61 | 44 | ; Prefix middleware for RhodeCode. |
|
62 | 45 | ; recommended when using proxy setup. |
@@ -153,6 +136,12 b' startup.import_repos = false' | |||
|
153 | 136 | ; SSH calls. Set this for events to receive proper url for SSH calls. |
|
154 | 137 | app.base_url = http://rhodecode.local |
|
155 | 138 | |
|
139 | ; Host at which the Service API is running. | |
|
140 | app.service_api.host = http://rhodecode.local:10020 | |
|
141 | ||
|
142 | ; Secret for Service API authentication. | |
|
143 | app.service_api.token = | |
|
144 | ||
|
156 | 145 | ; Unique application ID. Should be a random unique string for security. |
|
157 | 146 | app_instance_uuid = rc-production |
|
158 | 147 | |
@@ -255,8 +244,8 b' auth_ret_code_detection = false' | |||
|
255 | 244 | ; codes don't break the transactions while 4XX codes do |
|
256 | 245 | lock_ret_code = 423 |
|
257 | 246 | |
|
258 | ; allows to change the repository location in settings page | |
|
259 | allow_repo_location_change = true | |
|
247 | ; Filesystem location were repositories should be stored | |
|
248 | repo_store.path = /var/opt/rhodecode_repo_store | |
|
260 | 249 | |
|
261 | 250 | ; allows to setup custom hooks in settings page |
|
262 | 251 | allow_custom_hooks_settings = true |
@@ -298,23 +287,72 b' file_store.enabled = true' | |||
|
298 | 287 | ; Storage backend, available options are: local |
|
299 | 288 | file_store.backend = local |
|
300 | 289 | |
|
301 | ; path to store the uploaded binaries | |
|
302 |
file_store.storage_path = |
|
|
290 | ; path to store the uploaded binaries and artifacts | |
|
291 | file_store.storage_path = /var/opt/rhodecode_data/file_store | |
|
292 | ||
|
293 | ||
|
294 | ; Redis url to acquire/check generation of archives locks | |
|
295 | archive_cache.locking.url = redis://redis:6379/1 | |
|
296 | ||
|
297 | ; Storage backend, only 'filesystem' and 'objectstore' are available now | |
|
298 | archive_cache.backend.type = filesystem | |
|
299 | ||
|
300 | ; url for s3 compatible storage that allows to upload artifacts | |
|
301 | ; e.g http://minio:9000 | |
|
302 | archive_cache.objectstore.url = http://s3-minio:9000 | |
|
303 | ||
|
304 | ; key for s3 auth | |
|
305 | archive_cache.objectstore.key = key | |
|
306 | ||
|
307 | ; secret for s3 auth | |
|
308 | archive_cache.objectstore.secret = secret | |
|
303 | 309 | |
|
304 | ; Uncomment and set this path to control settings for archive download cache. | |
|
310 | ;region for s3 storage | |
|
311 | archive_cache.objectstore.region = eu-central-1 | |
|
312 | ||
|
313 | ; number of sharded buckets to create to distribute archives across | |
|
314 | ; default is 8 shards | |
|
315 | archive_cache.objectstore.bucket_shards = 8 | |
|
316 | ||
|
317 | ; a top-level bucket to put all other shards in | |
|
318 | ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number | |
|
319 | archive_cache.objectstore.bucket = rhodecode-archive-cache | |
|
320 | ||
|
321 | ; if true, this cache will retry fetches up to retry_attempts times, waiting retry_backoff seconds between tries | |
|
322 | archive_cache.objectstore.retry = false | |
|
323 | ||
|
324 | ; number of seconds to wait for next try using retry | |
|
325 | archive_cache.objectstore.retry_backoff = 1 | |
|
326 | ||
|
327 | ; how many times to retry a fetch from this backend | |
|
328 | archive_cache.objectstore.retry_attempts = 10 | |
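The shard-N layout described in the comments above can be sketched as follows; the hash function and helper name are assumptions for illustration, not RhodeCode's actual implementation:

```python
import hashlib

def shard_location(archive_key: str,
                   bucket: str = 'rhodecode-archive-cache',
                   shards: int = 8) -> str:
    # a stable hash of the archive key picks one of `shards` sub-buckets,
    # matching the rhodecode-archive-cache/shard-N naming described above
    digest = hashlib.sha256(archive_key.encode('utf-8')).hexdigest()
    return f'{bucket}/shard-{int(digest, 16) % shards}'

loc = shard_location('repo-group/repo-1.tar.gz')
assert loc.startswith('rhodecode-archive-cache/shard-')
# the same key always maps to the same shard
assert loc == shard_location('repo-group/repo-1.tar.gz')
```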
|
329 | ||
|
330 | ; Default is $cache_dir/archive_cache if not set | |
|
305 | 331 | ; Generated repo archives will be cached at this location |
|
306 | 332 | ; and served from the cache during subsequent requests for the same archive of |
|
307 | 333 | ; the repository. This path is important to be shared across filesystems and with |
|
308 | 334 | ; RhodeCode and vcsserver |
|
309 | ||
|
310 | ; Default is $cache_dir/archive_cache if not set | |
|
311 | archive_cache.store_dir = %(here)s/data/archive_cache | |
|
335 | archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache | |
|
312 | 336 | |
|
313 | 337 | ; The limit in GB sets how much data we cache before recycling last used, defaults to 10 gb |
|
314 |
archive_cache.cache_size_gb = 1 |
|
|
338 | archive_cache.filesystem.cache_size_gb = 1 | |
|
339 | ||
|
340 | ; Eviction policy used to clear out after cache_size_gb limit is reached | |
|
341 | archive_cache.filesystem.eviction_policy = least-recently-stored | |
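A 'least-recently-stored' policy like the one named above can be sketched as follows; the data layout and function name are assumptions for illustration, not the cache's actual internals:

```python
def evict_least_recently_stored(entries, size_limit):
    # `entries` maps key -> (stored_at, size); drop the oldest-stored
    # entries until the total size fits under `size_limit`
    total = sum(size for _, size in entries.values())
    evicted = []
    for key, (stored_at, size) in sorted(entries.items(), key=lambda kv: kv[1][0]):
        if total <= size_limit:
            break
        total -= size
        evicted.append(key)
        del entries[key]
    return evicted

cache = {'a': (1, 5), 'b': (2, 5), 'c': (3, 5)}
# total size 15 exceeds the limit of 10, so the oldest entry 'a' is dropped
assert evict_least_recently_stored(cache, 10) == ['a']
assert cache == {'b': (2, 5), 'c': (3, 5)}
```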
|
315 | 342 | |
|
316 | 343 | ; By default cache uses sharding technique, this specifies how many shards are there |
|
317 | archive_cache.cache_shards = 10 | |
|
344 | ; default is 8 shards | |
|
345 | archive_cache.filesystem.cache_shards = 8 | |
|
346 | ||
|
347 | ; if true, this cache will retry fetches up to retry_attempts times, waiting retry_backoff seconds between tries | |
|
348 | archive_cache.filesystem.retry = false | |
|
349 | ||
|
350 | ; number of seconds to wait for next try using retry | |
|
351 | archive_cache.filesystem.retry_backoff = 1 | |
|
352 | ||
|
353 | ; how many times to retry a fetch from this backend | |
|
354 | archive_cache.filesystem.retry_attempts = 10 | |
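The retry_attempts/retry_backoff behaviour the settings above describe can be sketched as follows; the function and exception choices are assumptions for illustration:

```python
import time

def fetch_with_retry(fetch, attempts=10, backoff=1.0):
    # retry a flaky fetch up to `attempts` times, sleeping `backoff`
    # seconds between tries, mirroring the settings above
    last_exc = None
    for _ in range(attempts):
        try:
            return fetch()
        except OSError as exc:
            last_exc = exc
            time.sleep(backoff)
    raise last_exc

calls = {'n': 0}
def flaky():
    # fails twice, then succeeds on the third call
    calls['n'] += 1
    if calls['n'] < 3:
        raise OSError('transient')
    return 'archive-bytes'

assert fetch_with_retry(flaky, attempts=5, backoff=0) == 'archive-bytes'
assert calls['n'] == 3
```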
|
355 | ||
|
318 | 356 | |
|
319 | 357 | ; ############# |
|
320 | 358 | ; CELERY CONFIG |
@@ -322,7 +360,7 b' archive_cache.cache_shards = 10' | |||
|
322 | 360 | |
|
323 | 361 | ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini |
|
324 | 362 | |
|
325 |
use_celery = |
|
|
363 | use_celery = true | |
|
326 | 364 | |
|
327 | 365 | ; path to store schedule database |
|
328 | 366 | #celerybeat-schedule.path = |
@@ -348,7 +386,7 b' celery.task_always_eager = false' | |||
|
348 | 386 | |
|
349 | 387 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. |
|
350 | 388 | ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space |
|
351 |
cache_dir = |
|
|
389 | cache_dir = /var/opt/rhodecode_data | |
|
352 | 390 | |
|
353 | 391 | ; ********************************************* |
|
354 | 392 | ; `sql_cache_short` cache for heavy SQL queries |
@@ -457,12 +495,12 b' rc_cache.cache_repo.expiration_time = 25' | |||
|
457 | 495 | ; beaker.session.type is type of storage options for the logged users sessions. Current allowed |
|
458 | 496 | ; types are file, ext:redis, ext:database, ext:memcached |
|
459 | 497 | ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session |
|
460 | beaker.session.type = file | |
|
461 | beaker.session.data_dir = %(here)s/data/sessions | |
|
498 | #beaker.session.type = file | |
|
499 | #beaker.session.data_dir = %(here)s/data/sessions | |
|
462 | 500 | |
|
463 | 501 | ; Redis based sessions |
|
464 |
|
|
|
465 |
|
|
|
502 | beaker.session.type = ext:redis | |
|
503 | beaker.session.url = redis://redis:6379/2 | |
|
466 | 504 | |
|
467 | 505 | ; DB based session, fast, and allows easy management over logged in users |
|
468 | 506 | #beaker.session.type = ext:database |
@@ -474,7 +512,7 b' beaker.session.data_dir = %(here)s/data/' | |||
|
474 | 512 | |
|
475 | 513 | beaker.session.key = rhodecode |
|
476 | 514 | beaker.session.secret = develop-rc-uytcxaz |
|
477 |
beaker.session.lock_dir = |
|
|
515 | beaker.session.lock_dir = /data_ramdisk/lock | |
|
478 | 516 | |
|
479 | 517 | ; Secure encrypted cookie. Requires AES and AES python libraries |
|
480 | 518 | ; you must disable beaker.session.secret to use this |
@@ -515,18 +553,18 b' search.location = %(here)s/data/index' | |||
|
515 | 553 | ; channelstream enables persistent connections and live notification |
|
516 | 554 | ; in the system. It's also used by the chat system |
|
517 | 555 | |
|
518 |
channelstream.enabled = |
|
|
556 | channelstream.enabled = true | |
|
519 | 557 | |
|
520 | 558 | ; server address for channelstream server on the backend |
|
521 |
channelstream.server = |
|
|
559 | channelstream.server = channelstream:9800 | |
|
522 | 560 | |
|
523 | 561 | ; location of the channelstream server from outside world |
|
524 | 562 | ; use ws:// for http or wss:// for https. This address needs to be handled |
|
525 | 563 | ; by external HTTP server such as Nginx or Apache |
|
526 | 564 | ; see Nginx/Apache configuration examples in our docs |
|
527 | 565 | channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream |
|
528 |
channelstream.secret = |
|
|
529 |
channelstream.history.location = |
|
|
566 | channelstream.secret = ENV_GENERATED | |
|
567 | channelstream.history.location = /var/opt/rhodecode_data/channelstream_history | |
|
530 | 568 | |
|
531 | 569 | ; Internal application path that Javascript uses to connect into. |
|
532 | 570 | ; If you use proxy-prefix the prefix should be added before /_channelstream |
@@ -572,7 +610,7 b' sqlalchemy.db1.pool_recycle = 3600' | |||
|
572 | 610 | ; VCS CONFIG |
|
573 | 611 | ; ########## |
|
574 | 612 | vcs.server.enable = true |
|
575 |
vcs.server = |
|
|
613 | vcs.server = vcsserver:10010 | |
|
576 | 614 | |
|
577 | 615 | ; Web server connectivity protocol, responsible for web based VCS operations |
|
578 | 616 | ; Available protocols are: |
@@ -585,6 +623,7 b' vcs.scm_app_implementation = http' | |||
|
585 | 623 | |
|
586 | 624 | ; Push/Pull operations hooks protocol, available options are: |
|
587 | 625 | ; `http` - use http-rpc backend (default) |
|
626 | ; `celery` - use celery based hooks | |
|
588 | 627 | vcs.hooks.protocol = http |
|
589 | 628 | |
|
590 | 629 | ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be |
@@ -604,11 +643,6 b' vcs.backends = hg, git, svn' | |||
|
604 | 643 | ; Wait this number of seconds before killing connection to the vcsserver |
|
605 | 644 | vcs.connection_timeout = 3600 |
|
606 | 645 | |
|
607 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
608 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
609 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
610 | #vcs.svn.compatible_version = 1.8 | |
|
611 | ||
|
612 | 646 | ; Cache flag to cache vcsserver remote calls locally |
|
613 | 647 | ; It uses cache_region `cache_repo` |
|
614 | 648 | vcs.methods.cache = true |
@@ -618,14 +652,29 b' vcs.methods.cache = true' | |||
|
618 | 652 | ; Maps RhodeCode repo groups into SVN paths for Apache |
|
619 | 653 | ; #################################################### |
|
620 | 654 | |
|
655 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
656 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
657 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
658 | #vcs.svn.compatible_version = 1.8 | |
|
659 | ||
|
660 | ; Redis connection settings for svn integrations logic | |
|
661 | ; This connection string needs to be the same on ce and vcsserver | |
|
662 | vcs.svn.redis_conn = redis://redis:6379/0 | |
|
663 | ||
|
664 | ; Enable SVN proxy of requests over HTTP | |
|
665 | vcs.svn.proxy.enabled = true | |
|
666 | ||
|
667 | ; host to connect to running SVN subsystem | |
|
668 | vcs.svn.proxy.host = http://svn:8090 | |
|
669 | ||
|
621 | 670 | ; Enable or disable the config file generation. |
|
622 |
svn.proxy.generate_config = |
|
|
671 | svn.proxy.generate_config = true | |
|
623 | 672 | |
|
624 | 673 | ; Generate config file with `SVNListParentPath` set to `On`. |
|
625 | 674 | svn.proxy.list_parent_path = true |
|
626 | 675 | |
|
627 | 676 | ; Set location and file name of generated config file. |
|
628 |
svn.proxy.config_file_path = |
|
|
677 | svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf | |
|
629 | 678 | |
|
630 | 679 | ; alternative mod_dav config template. This needs to be a valid mako template |
|
631 | 680 | ; Example template can be found in the source code: |
@@ -653,7 +702,7 b' svn.proxy.location_root = /' | |||
|
653 | 702 | ; any change user ssh keys. Setting this to false also disables possibility |
|
654 | 703 | ; of adding SSH keys by users from web interface. Super admins can still |
|
655 | 704 | ; manage SSH Keys. |
|
656 |
ssh.generate_authorized_keyfile = |
|
|
705 | ssh.generate_authorized_keyfile = true | |
|
657 | 706 | |
|
658 | 707 | ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` |
|
659 | 708 | # ssh.authorized_keys_ssh_opts = |
@@ -661,12 +710,13 b' ssh.generate_authorized_keyfile = false' | |||
|
661 | 710 | ; Path to the authorized_keys file where the generate entries are placed. |
|
662 | 711 | ; It is possible to have multiple key files specified in `sshd_config` e.g. |
|
663 | 712 | ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode |
|
664 |
ssh.authorized_keys_file_path = |
|
|
713 | ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode | |
|
665 | 714 | |
|
666 | 715 | ; Command to execute the SSH wrapper. The binary is available in the |
|
667 | 716 | ; RhodeCode installation directory. |
|
668 | ; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
669 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
717 | ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
718 | ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2 | |
|
719 | ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
670 | 720 | |
|
671 | 721 | ; Allow shell when executing the ssh-wrapper command |
|
672 | 722 | ssh.wrapper_cmd_allow_shell = false |
@@ -677,73 +727,14 b' ssh.enable_debug_logging = true' | |||
|
677 | 727 | |
|
678 | 728 | ; Paths to binary executables; by default these are the bare names, but we can
|
679 | 729 | ; override them if we want to use a custom one |
|
680 |
ssh.executable.hg = |
|
|
681 |
ssh.executable.git = |
|
|
682 |
ssh.executable.svn = |
|
|
730 | ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg | |
|
731 | ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git | |
|
732 | ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve | |
|
683 | 733 | |
|
684 | 734 | ; Enables SSH key generator web interface. Disabling this still allows users |
|
685 | 735 | ; to add their own keys. |
|
686 | 736 | ssh.enable_ui_key_generator = true |
|
687 | 737 | |
|
688 | ||
|
689 | ; ################# | |
|
690 | ; APPENLIGHT CONFIG | |
|
691 | ; ################# | |
|
692 | ||
|
693 | ; Appenlight is tailored to work with RhodeCode, see | |
|
694 | ; http://appenlight.rhodecode.com for details how to obtain an account | |
|
695 | ||
|
696 | ; Appenlight integration enabled | |
|
697 | #appenlight = false | |
|
698 | ||
|
699 | #appenlight.server_url = https://api.appenlight.com | |
|
700 | #appenlight.api_key = YOUR_API_KEY | |
|
701 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 | |
|
702 | ||
|
703 | ; used for JS client | |
|
704 | #appenlight.api_public_key = YOUR_API_PUBLIC_KEY | |
|
705 | ||
|
706 | ; TWEAK AMOUNT OF INFO SENT HERE | |
|
707 | ||
|
708 | ; enables 404 error logging (default False) | |
|
709 | #appenlight.report_404 = false | |
|
710 | ||
|
711 | ; time in seconds after request is considered being slow (default 1) | |
|
712 | #appenlight.slow_request_time = 1 | |
|
713 | ||
|
714 | ; record slow requests in application | |
|
715 | ; (needs to be enabled for slow datastore recording and time tracking) | |
|
716 | #appenlight.slow_requests = true | |
|
717 | ||
|
718 | ; enable hooking to application loggers | |
|
719 | #appenlight.logging = true | |
|
720 | ||
|
721 | ; minimum log level for log capture | |
|
722 | #ppenlight.logging.level = WARNING | |
|
723 | ||
|
724 | ; send logs only from erroneous/slow requests | |
|
725 | ; (saves API quota for intensive logging) | |
|
726 | #appenlight.logging_on_error = false | |
|
727 | ||
|
728 | ; list of additional keywords that should be grabbed from environ object | |
|
729 | ; can be string with comma separated list of words in lowercase | |
|
730 | ; (by default client will always send following info: | |
|
731 | ; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that | |
|
732 | ; start with HTTP* this list be extended with additional keywords here | |
|
733 | #appenlight.environ_keys_whitelist = | |
|
734 | ||
|
735 | ; list of keywords that should be blanked from request object | |
|
736 | ; can be string with comma separated list of words in lowercase | |
|
737 | ; (by default client will always blank keys that contain following words | |
|
738 | ; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' | |
|
739 | ; this list be extended with additional keywords set here | |
|
740 | #appenlight.request_keys_blacklist = | |
|
741 | ||
|
742 | ; list of namespaces that should be ignores when gathering log entries | |
|
743 | ; can be string with comma separated list of namespaces | |
|
744 | ; (by default the client ignores own entries: appenlight_client.client) | |
|
745 | #appenlight.log_namespace_blacklist = | |
|
746 | ||
|
747 | 738 | ; Statsd client config, this is used to send metrics to statsd |
|
748 | 739 | ; We recommend setting statsd_exported and scrape them using Prometheus |
|
749 | 740 | #statsd.enabled = false |
@@ -36,7 +36,7 b' port = 10020' | |||
|
36 | 36 | ; GUNICORN APPLICATION SERVER |
|
37 | 37 | ; ########################### |
|
38 | 38 | |
|
39 |
; run with gunicorn |
|
|
39 | ; run with gunicorn --config gunicorn_conf.py --paste rhodecode.ini | |
|
40 | 40 | |
|
41 | 41 | ; Module to use, this setting shouldn't be changed |
|
42 | 42 | use = egg:gunicorn#main |
@@ -104,6 +104,12 b' startup.import_repos = false' | |||
|
104 | 104 | ; SSH calls. Set this for events to receive proper url for SSH calls. |
|
105 | 105 | app.base_url = http://rhodecode.local |
|
106 | 106 | |
|
107 | ; Host at which the Service API is running. | |
|
108 | app.service_api.host = http://rhodecode.local:10020 | |
|
109 | ||
|
110 | ; Secret for Service API authentication. | |
|
111 | app.service_api.token = | |
|
112 | ||
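The new `app.service_api.host` / `app.service_api.token` pair combines a base URL with a shared secret. A hedged sketch of how such an authenticated request could be described; the endpoint path and header name below are illustrative assumptions, not the documented RhodeCode API:

```python
def service_api_request(host: str, token: str, path: str = "/_service/healthcheck"):
    """Describe a request to the Service API as (url, headers).

    The path and the X-Service-Token header are illustrative
    assumptions; only host and token come from the ini settings above.
    """
    url = host.rstrip("/") + path
    headers = {"X-Service-Token": token}
    return url, headers
```

Any HTTP client can then issue the request from the returned pieces.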
|
107 | 113 | ; Unique application ID. Should be a random unique string for security. |
|
108 | 114 | app_instance_uuid = rc-production |
|
109 | 115 | |
@@ -206,8 +212,8 b' auth_ret_code_detection = false' | |||
|
206 | 212 | ; codes don't break the transactions while 4XX codes do |
|
207 | 213 | lock_ret_code = 423 |
|
208 | 214 | |
|
209 | ; allows to change the repository location in settings page | |
|
210 | allow_repo_location_change = true | |
|
215 | ; Filesystem location where repositories should be stored | |
|
216 | repo_store.path = /var/opt/rhodecode_repo_store | |
|
211 | 217 | |
|
212 | 218 | ; allows to setup custom hooks in settings page |
|
213 | 219 | allow_custom_hooks_settings = true |
@@ -249,23 +255,72 b' file_store.enabled = true' | |||
|
249 | 255 | ; Storage backend, available options are: local |
|
250 | 256 | file_store.backend = local |
|
251 | 257 | |
|
252 | ; path to store the uploaded binaries | |
|
253 | file_store.storage_path = | |
|
258 | ; path to store the uploaded binaries and artifacts | |
|
259 | file_store.storage_path = /var/opt/rhodecode_data/file_store | |
|
260 | ||
|
261 | ||
|
262 | ; Redis url to acquire/check generation of archives locks | |
|
263 | archive_cache.locking.url = redis://redis:6379/1 | |
|
264 | ||
|
265 | ; Storage backend, only 'filesystem' and 'objectstore' are available now | |
|
266 | archive_cache.backend.type = filesystem | |
|
267 | ||
|
268 | ; url for s3 compatible storage that allows to upload artifacts | |
|
269 | ; e.g http://minio:9000 | |
|
270 | archive_cache.objectstore.url = http://s3-minio:9000 | |
|
271 | ||
|
272 | ; key for s3 auth | |
|
273 | archive_cache.objectstore.key = key | |
|
274 | ||
|
275 | ; secret for s3 auth | |
|
276 | archive_cache.objectstore.secret = secret | |
|
254 | 277 | |
|
255 | ; Uncomment and set this path to control settings for archive download cache. | |
|
278 | ;region for s3 storage | |
|
279 | archive_cache.objectstore.region = eu-central-1 | |
|
280 | ||
|
281 | ; number of sharded buckets to create to distribute archives across | |
|
282 | ; default is 8 shards | |
|
283 | archive_cache.objectstore.bucket_shards = 8 | |
|
284 | ||
|
285 | ; a top-level bucket to put all other shards in | |
|
286 | ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number | |
|
287 | archive_cache.objectstore.bucket = rhodecode-archive-cache | |
|
288 | ||
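The `bucket_shards` comment describes a `rhodecode-archive-cache/shard-N` layout. One plausible way such a shard is selected; the hash function here is an assumption for illustration, not RhodeCode's actual implementation:

```python
import hashlib

def shard_bucket(key: str, bucket: str = "rhodecode-archive-cache",
                 shards: int = 8) -> str:
    """Map an archive cache key deterministically onto one of N shards.

    The sha256-mod scheme is an illustrative assumption; only the
    shard-N naming and default of 8 shards come from the config above.
    """
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    shard = int(digest, 16) % shards
    return f"{bucket}/shard-{shard}"
```

The same key always lands in the same shard, which is what lets multiple workers find a cached archive without coordination.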
|
289 | ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time | |
|
290 | archive_cache.objectstore.retry = false | |
|
291 | ||
|
292 | ; number of seconds to wait for next try using retry | |
|
293 | archive_cache.objectstore.retry_backoff = 1 | |
|
294 | ||
|
295 | ; how many tries to do a retry fetch from this backend | |
|
296 | archive_cache.objectstore.retry_attempts = 10 | |
|
297 | ||
|
298 | ; Default is $cache_dir/archive_cache if not set | |
|
256 | 299 | ; Generated repo archives will be cached at this location |
|
257 | 300 | ; and served from the cache during subsequent requests for the same archive of |
|
258 | 301 | ; the repository. This path is important to be shared across filesystems and with |
|
259 | 302 | ; RhodeCode and vcsserver |
|
260 | ||
|
261 | ; Default is $cache_dir/archive_cache if not set | |
|
262 | archive_cache.store_dir = %(here)s/data/archive_cache | |
|
303 | archive_cache.filesystem.store_dir = /var/opt/rhodecode_data/archive_cache | |
|
263 | 304 | |
|
264 | 305 | ; The limit in GB sets how much data we cache before recycling last used, defaults to 10 gb |
|
265 | archive_cache.cache_size_gb = 40 | |
|
306 | archive_cache.filesystem.cache_size_gb = 40 | |
|
307 | ||
|
308 | ; Eviction policy used to clear out after cache_size_gb limit is reached | |
|
309 | archive_cache.filesystem.eviction_policy = least-recently-stored | |
|
266 | 310 | |
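The `least-recently-stored` eviction policy can be pictured as dropping the oldest insertions first once `cache_size_gb` is exceeded. A simplified sketch; the real cache tracks file sizes on disk, which is not modeled here:

```python
from collections import OrderedDict

def evict_least_recently_stored(entries: "OrderedDict[str, int]",
                                size_limit: int) -> list:
    """entries maps archive name -> size; insertion order == storage order.

    Evict the oldest-stored entries until the total size fits the limit.
    Simplified illustration of the least-recently-stored policy above.
    """
    evicted = []
    while entries and sum(entries.values()) > size_limit:
        name, _size = entries.popitem(last=False)  # oldest insertion first
        evicted.append(name)
    return evicted
```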
|
267 | 311 | ; By default cache uses sharding technique, this specifies how many shards are there |
|
268 | archive_cache.cache_shards = 4 | |
|
312 | ; default is 8 shards | |
|
313 | archive_cache.filesystem.cache_shards = 8 | |
|
314 | ||
|
315 | ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time | |
|
316 | archive_cache.filesystem.retry = false | |
|
317 | ||
|
318 | ; number of seconds to wait for next try using retry | |
|
319 | archive_cache.filesystem.retry_backoff = 1 | |
|
320 | ||
|
321 | ; how many tries to do a retry fetch from this backend | |
|
322 | archive_cache.filesystem.retry_attempts = 10 | |
|
323 | ||
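The `retry`, `retry_backoff`, and `retry_attempts` knobs describe a conventional retry loop. A minimal sketch of such a loop, assuming a fixed backoff as the defaults above suggest:

```python
import time

def fetch_with_retry(fetch, retry=False, retry_backoff=1.0, retry_attempts=10):
    """Call fetch(); on failure optionally retry up to retry_attempts
    times, sleeping retry_backoff seconds between tries.

    Parameter names mirror the ini options; the loop itself is an
    illustrative sketch, not RhodeCode's internal code.
    """
    attempts = retry_attempts if retry else 1
    last_exc = None
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception as exc:
            last_exc = exc
            if attempt + 1 < attempts:
                time.sleep(retry_backoff)
    raise last_exc
```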
|
269 | 324 | |
|
270 | 325 | ; ############# |
|
271 | 326 | ; CELERY CONFIG |
@@ -273,7 +328,7 b' archive_cache.cache_shards = 4' | |||
|
273 | 328 | |
|
274 | 329 | ; manually run celery: /path/to/celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini |
|
275 | 330 | |
|
276 | use_celery = | |
|
331 | use_celery = true | |
|
277 | 332 | |
|
278 | 333 | ; path to store schedule database |
|
279 | 334 | #celerybeat-schedule.path = |
@@ -299,7 +354,7 b' celery.task_always_eager = false' | |||
|
299 | 354 | |
|
300 | 355 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. |
|
301 | 356 | ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space |
|
302 | cache_dir = | |
|
357 | cache_dir = /var/opt/rhodecode_data | |
|
303 | 358 | |
|
304 | 359 | ; ********************************************* |
|
305 | 360 | ; `sql_cache_short` cache for heavy SQL queries |
@@ -408,12 +463,12 b' rc_cache.cache_repo.expiration_time = 25' | |||
|
408 | 463 | ; beaker.session.type is type of storage options for the logged users sessions. Current allowed |
|
409 | 464 | ; types are file, ext:redis, ext:database, ext:memcached |
|
410 | 465 | ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session |
|
411 | beaker.session.type = file | |
|
412 | beaker.session.data_dir = %(here)s/data/sessions | |
|
466 | #beaker.session.type = file | |
|
467 | #beaker.session.data_dir = %(here)s/data/sessions | |
|
413 | 468 | |
|
414 | 469 | ; Redis based sessions |
|
415 | | |
|
416 | | |
|
470 | beaker.session.type = ext:redis | |
|
471 | beaker.session.url = redis://redis:6379/2 | |
|
417 | 472 | |
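The `beaker.session.url` value is a standard Redis URL. A quick stdlib-only way to sanity-check which host, port, and database number such a URL points at, with no live Redis needed:

```python
from urllib.parse import urlparse

def parse_redis_url(url: str) -> dict:
    """Split a redis:// URL like the one in beaker.session.url into parts."""
    parts = urlparse(url)
    return {
        "host": parts.hostname,
        "port": parts.port or 6379,       # 6379 is the Redis default port
        "db": int(parts.path.lstrip("/") or 0),
    }
```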
|
418 | 473 | ; DB based session, fast, and allows easy management over logged in users |
|
419 | 474 | #beaker.session.type = ext:database |
@@ -425,7 +480,7 b' beaker.session.data_dir = %(here)s/data/' | |||
|
425 | 480 | |
|
426 | 481 | beaker.session.key = rhodecode |
|
427 | 482 | beaker.session.secret = production-rc-uytcxaz |
|
428 | beaker.session.lock_dir = | |
|
483 | beaker.session.lock_dir = /data_ramdisk/lock | |
|
429 | 484 | |
|
430 | 485 | ; Secure encrypted cookie. Requires AES and AES python libraries |
|
431 | 486 | ; you must disable beaker.session.secret to use this |
@@ -466,18 +521,18 b' search.location = %(here)s/data/index' | |||
|
466 | 521 | ; channelstream enables persistent connections and live notification |
|
467 | 522 | ; in the system. It's also used by the chat system |
|
468 | 523 | |
|
469 | channelstream.enabled = | |
|
524 | channelstream.enabled = true | |
|
470 | 525 | |
|
471 | 526 | ; server address for channelstream server on the backend |
|
472 | channelstream.server = | |
|
527 | channelstream.server = channelstream:9800 | |
|
473 | 528 | |
|
474 | 529 | ; location of the channelstream server from outside world |
|
475 | 530 | ; use ws:// for http or wss:// for https. This address needs to be handled |
|
476 | 531 | ; by external HTTP server such as Nginx or Apache |
|
477 | 532 | ; see Nginx/Apache configuration examples in our docs |
|
478 | 533 | channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream |
|
479 | channelstream.secret = | |
|
480 | channelstream.history.location = | |
|
534 | channelstream.secret = ENV_GENERATED | |
|
535 | channelstream.history.location = /var/opt/rhodecode_data/channelstream_history | |
|
481 | 536 | |
|
482 | 537 | ; Internal application path that Javascript uses to connect into. |
|
483 | 538 | ; If you use proxy-prefix the prefix should be added before /_channelstream |
@@ -523,7 +578,7 b' sqlalchemy.db1.pool_recycle = 3600' | |||
|
523 | 578 | ; VCS CONFIG |
|
524 | 579 | ; ########## |
|
525 | 580 | vcs.server.enable = true |
|
526 | vcs.server = | |
|
581 | vcs.server = vcsserver:10010 | |
|
527 | 582 | |
|
528 | 583 | ; Web server connectivity protocol, responsible for web based VCS operations |
|
529 | 584 | ; Available protocols are: |
@@ -536,6 +591,7 b' vcs.scm_app_implementation = http' | |||
|
536 | 591 | |
|
537 | 592 | ; Push/Pull operations hooks protocol, available options are: |
|
538 | 593 | ; `http` - use http-rpc backend (default) |
|
594 | ; `celery` - use celery based hooks | |
|
539 | 595 | vcs.hooks.protocol = http |
|
540 | 596 | |
|
541 | 597 | ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be |
@@ -555,11 +611,6 b' vcs.backends = hg, git, svn' | |||
|
555 | 611 | ; Wait this number of seconds before killing connection to the vcsserver |
|
556 | 612 | vcs.connection_timeout = 3600 |
|
557 | 613 | |
|
558 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
559 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
560 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
561 | #vcs.svn.compatible_version = 1.8 | |
|
562 | ||
|
563 | 614 | ; Cache flag to cache vcsserver remote calls locally |
|
564 | 615 | ; It uses cache_region `cache_repo` |
|
565 | 616 | vcs.methods.cache = true |
@@ -569,14 +620,29 b' vcs.methods.cache = true' | |||
|
569 | 620 | ; Maps RhodeCode repo groups into SVN paths for Apache |
|
570 | 621 | ; #################################################### |
|
571 | 622 | |
|
623 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
624 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
625 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
626 | #vcs.svn.compatible_version = 1.8 | |
|
627 | ||
|
628 | ; Redis connection settings for svn integrations logic | |
|
629 | ; This connection string needs to be the same on ce and vcsserver | |
|
630 | vcs.svn.redis_conn = redis://redis:6379/0 | |
|
631 | ||
|
632 | ; Enable SVN proxy of requests over HTTP | |
|
633 | vcs.svn.proxy.enabled = true | |
|
634 | ||
|
635 | ; host to connect to running SVN subsystem | |
|
636 | vcs.svn.proxy.host = http://svn:8090 | |
|
637 | ||
|
572 | 638 | ; Enable or disable the config file generation. |
|
573 | svn.proxy.generate_config = | |
|
639 | svn.proxy.generate_config = true | |
|
574 | 640 | |
|
575 | 641 | ; Generate config file with `SVNListParentPath` set to `On`. |
|
576 | 642 | svn.proxy.list_parent_path = true |
|
577 | 643 | |
|
578 | 644 | ; Set location and file name of generated config file. |
|
579 | svn.proxy.config_file_path = | |
|
645 | svn.proxy.config_file_path = /etc/rhodecode/conf/svn/mod_dav_svn.conf | |
|
580 | 646 | |
|
581 | 647 | ; alternative mod_dav config template. This needs to be a valid mako template |
|
582 | 648 | ; Example template can be found in the source code: |
@@ -604,7 +670,7 b' svn.proxy.location_root = /' | |||
|
604 | 670 | ; any change user ssh keys. Setting this to false also disables possibility |
|
605 | 671 | ; of adding SSH keys by users from web interface. Super admins can still |
|
606 | 672 | ; manage SSH Keys. |
|
607 | ssh.generate_authorized_keyfile = | |
|
673 | ssh.generate_authorized_keyfile = true | |
|
608 | 674 | |
|
609 | 675 | ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` |
|
610 | 676 | # ssh.authorized_keys_ssh_opts = |
@@ -612,12 +678,13 b' ssh.generate_authorized_keyfile = false' | |||
|
612 | 678 | ; Path to the authorized_keys file where the generate entries are placed. |
|
613 | 679 | ; It is possible to have multiple key files specified in `sshd_config` e.g. |
|
614 | 680 | ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode |
|
615 | ssh.authorized_keys_file_path = | |
|
681 | ssh.authorized_keys_file_path = /etc/rhodecode/conf/ssh/authorized_keys_rhodecode | |
|
616 | 682 | |
|
617 | 683 | ; Command to execute the SSH wrapper. The binary is available in the |
|
618 | 684 | ; RhodeCode installation directory. |
|
619 | ; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
620 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
685 | ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
686 | ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2 | |
|
687 | ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
621 | 688 | |
|
622 | 689 | ; Allow shell when executing the ssh-wrapper command |
|
623 | 690 | ssh.wrapper_cmd_allow_shell = false |
@@ -628,73 +695,14 b' ssh.enable_debug_logging = false' | |||
|
628 | 695 | |
|
629 | 696 | ; Paths to binary executable, by default they are the names, but we can |
|
630 | 697 | ; override them if we want to use a custom one |
|
631 | ssh.executable.hg = | |
|
632 | ssh.executable.git = | |
|
633 | ssh.executable.svn = | |
|
698 | ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg | |
|
699 | ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git | |
|
700 | ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve | |
|
634 | 701 | |
|
635 | 702 | ; Enables SSH key generator web interface. Disabling this still allows users |
|
636 | 703 | ; to add their own keys. |
|
637 | 704 | ssh.enable_ui_key_generator = true |
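The generated `authorized_keys` entries combine the restrictive default options named above with a forced command pointing at the SSH wrapper. The exact line RhodeCode writes may differ, so the composition below is only an illustration:

```python
def authorized_keys_entry(public_key: str, user: str,
                          wrapper="/usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper"):
    """Build one forced-command authorized_keys line (illustrative format).

    The option list matches the defaults documented above; the trailing
    comment field carrying the username is an assumption.
    """
    options = "no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding"
    return f'command="{wrapper}",{options} {public_key} {user}'
```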
|
638 | 705 | |
|
639 | ||
|
640 | ; ################# | |
|
641 | ; APPENLIGHT CONFIG | |
|
642 | ; ################# | |
|
643 | ||
|
644 | ; Appenlight is tailored to work with RhodeCode, see | |
|
645 | ; http://appenlight.rhodecode.com for details how to obtain an account | |
|
646 | ||
|
647 | ; Appenlight integration enabled | |
|
648 | #appenlight = false | |
|
649 | ||
|
650 | #appenlight.server_url = https://api.appenlight.com | |
|
651 | #appenlight.api_key = YOUR_API_KEY | |
|
652 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 | |
|
653 | ||
|
654 | ; used for JS client | |
|
655 | #appenlight.api_public_key = YOUR_API_PUBLIC_KEY | |
|
656 | ||
|
657 | ; TWEAK AMOUNT OF INFO SENT HERE | |
|
658 | ||
|
659 | ; enables 404 error logging (default False) | |
|
660 | #appenlight.report_404 = false | |
|
661 | ||
|
662 | ; time in seconds after request is considered being slow (default 1) | |
|
663 | #appenlight.slow_request_time = 1 | |
|
664 | ||
|
665 | ; record slow requests in application | |
|
666 | ; (needs to be enabled for slow datastore recording and time tracking) | |
|
667 | #appenlight.slow_requests = true | |
|
668 | ||
|
669 | ; enable hooking to application loggers | |
|
670 | #appenlight.logging = true | |
|
671 | ||
|
672 | ; minimum log level for log capture | |
|
673 | #ppenlight.logging.level = WARNING | |
|
674 | ||
|
675 | ; send logs only from erroneous/slow requests | |
|
676 | ; (saves API quota for intensive logging) | |
|
677 | #appenlight.logging_on_error = false | |
|
678 | ||
|
679 | ; list of additional keywords that should be grabbed from environ object | |
|
680 | ; can be string with comma separated list of words in lowercase | |
|
681 | ; (by default client will always send following info: | |
|
682 | ; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that | |
|
683 | ; start with HTTP* this list be extended with additional keywords here | |
|
684 | #appenlight.environ_keys_whitelist = | |
|
685 | ||
|
686 | ; list of keywords that should be blanked from request object | |
|
687 | ; can be string with comma separated list of words in lowercase | |
|
688 | ; (by default client will always blank keys that contain following words | |
|
689 | ; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' | |
|
690 | ; this list be extended with additional keywords set here | |
|
691 | #appenlight.request_keys_blacklist = | |
|
692 | ||
|
693 | ; list of namespaces that should be ignores when gathering log entries | |
|
694 | ; can be string with comma separated list of namespaces | |
|
695 | ; (by default the client ignores own entries: appenlight_client.client) | |
|
696 | #appenlight.log_namespace_blacklist = | |
|
697 | ||
|
698 | 706 | ; Statsd client config, this is used to send metrics to statsd |
|
699 | 707 | ; We recommend setting statsd_exported and scrape them using Prometheus |
|
700 | 708 | #statsd.enabled = false |
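statsd is a plain UDP line protocol, so once `statsd.enabled` is turned on, metrics travel as small text datagrams. A sketch of the wire format; the metric name used here is invented for illustration:

```python
import socket

def statsd_counter(name: str, value: int = 1, rate: float = 1.0) -> bytes:
    """Encode a statsd counter datagram, e.g. b'rhodecode.req:1|c'."""
    payload = f"{name}:{value}|c"
    if rate < 1.0:
        payload += f"|@{rate}"      # sampled counters carry the sample rate
    return payload.encode("ascii")

def send_metric(datagram: bytes, host="127.0.0.1", port=8125):
    """Fire-and-forget UDP send (8125 is the conventional statsd port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(datagram, (host, port))
    finally:
        sock.close()
```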
@@ -8,7 +8,7 b' level of support to optimize for product' | |||
|
8 | 8 | use the following instructions: |
|
9 | 9 | |
|
10 | 10 | 1. Open the |RCE| configuration file, |
|
11 | :file:` | |
|
11 | :file:`config/_shared/rhodecode.ini` | |
|
12 | 12 | |
|
13 | 13 | 2. Add the following configuration option in the ``[app:main]`` section. |
|
14 | 14 |
@@ -42,7 +42,7 b' information see the :ref:`apache-ws-ref`' | |||
|
42 | 42 | |
|
43 | 43 | |RCE| can also be configured to force strict *https* connections and Strict |
|
44 | 44 | Transport Security. To set this, configure the following options to ``true`` |
|
45 | in the :file:` | |
|
45 | in the :file:`config/_shared/rhodecode.ini` file. | |
|
46 | 46 | |
|
47 | 47 | .. code-block:: ini |
|
48 | 48 |
@@ -83,7 +83,7 b' see the `OpenSSL PKI tutorial`_ site, or' | |||
|
83 | 83 | |
|
84 | 84 | If the network you are running is SSL/TLS encrypted, you can configure |RCE| |
|
85 | 85 | to always use secure connections using the ``force_https`` and ``use_htsts`` |
|
86 | options in the :file:` | |
|
86 | options in the :file:`config/_shared/rhodecode.ini` file. | |
|
87 | 87 | For more details, see the :ref:`x-frame` section. |
|
88 | 88 | |
|
89 | 89 | FireWalls and Ports |
@@ -78,7 +78,7 b' For example:' | |||
|
78 | 78 | Configuration Files |
|
79 | 79 | ------------------- |
|
80 | 80 | |
|
81 | * :file:` | |
|
81 | * :file:`config/_shared/rhodecode.ini` | |
|
82 | 82 | * :file:`/home/{user}/.rccontrol/{instance-id}/search_mapping.ini` |
|
83 | 83 | * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` |
|
84 | 84 | * :file:`/home/{user}/.rccontrol/supervisor/supervisord.ini` |
@@ -188,7 +188,7 b' Changing Default Language' | |||
|
188 | 188 | ^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
189 | 189 | |
|
190 | 190 | To change the default language of a |RCE| instance, change the language code |
|
191 | in the :file:` | |
|
191 | in the :file:`config/_shared/rhodecode.ini` file. To | |
|
192 | 192 | do this, use the following steps. |
|
193 | 193 | |
|
194 | 194 | 1. Open the :file:`rhodecode.ini` file and set the required language code. |
@@ -11,7 +11,7 b' sections.' | |||
|
11 | 11 | |
|
12 | 12 | \- **rhodecode.ini** |
|
13 | 13 | Default location: |
|
14 | :file:` | |
|
14 | :file:`config/_shared/rhodecode.ini` | |
|
15 | 15 | |
|
16 | 16 | This is the main |RCE| configuration file and controls much of its |
|
17 | 17 | default behaviour. It is also used to configure certain customer |
@@ -14,7 +14,7 b' track particular user logs only, and exc' | |||
|
14 | 14 | simply grep by `req_id` uuid which you'll have to find for the individual request. |
|
15 | 15 | |
|
16 | 16 | To enable debug mode on a |RCE| instance you need to set the debug property |
|
17 | in the :file:` | |
|
17 | in the :file:`config/_shared/rhodecode.ini` file. To | |
|
18 | 18 | do this, use the following steps |
|
19 | 19 | |
|
20 | 20 | 1. Open the file and set the ``debug`` line to ``true`` |
@@ -38,7 +38,7 b' Debug and Logging Configuration' | |||
|
38 | 38 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
39 | 39 | |
|
40 | 40 | Further debugging and logging settings can also be set in the |
|
41 | :file:` | |
|
41 | :file:`config/_shared/rhodecode.ini` file. | |
|
42 | 42 | |
|
43 | 43 | In the logging section, the various packages that run with |RCE| can have |
|
44 | 44 | different debug levels set. If you want to increase the logging level change |
@@ -134,7 +134,7 b' 2. Go to the :menuselection:`Admin --> S' | |||
|
134 | 134 | :guilabel:`Subversion HTTP Server URL`. |
|
135 | 135 | |
|
136 | 136 | 3. Open the |RCE| configuration file, |
|
137 | :file:` | |
|
137 | :file:`config/_shared/rhodecode.ini` | |
|
138 | 138 | |
|
139 | 139 | 4. Add the following configuration option in the ``[app:main]`` |
|
140 | 140 | section if you don't have it yet. |
@@ -4,7 +4,7 b' Change Default Encoding' | |||
|
4 | 4 | ----------------------- |
|
5 | 5 | |
|
6 | 6 | |RCE| uses ``utf8`` encoding by default. You can change the default encoding |
|
7 | in the :file:` | |
|
7 | in the :file:`config/_shared/rhodecode.ini` file. To | |
|
8 | 8 | change the default encoding used by |RCE|, set a new value for the |
|
9 | 9 | ``default_encoding``. |
|
10 | 10 |
@@ -7,7 +7,7 b' When using external authentication tools' | |||
|
7 | 7 | password retry loop in |hg| can result in users being locked out due to too |
|
8 | 8 | many failed password attempts. To prevent this from happening, add the |
|
9 | 9 | following setting to your |
|
10 | :file:` | |
|
10 | :file:`config/_shared/rhodecode.ini` file, in the | |
|
11 | 11 | ``[app:main]`` section. |
|
12 | 12 | |
|
13 | 13 |
@@ -100,7 +100,7 b' Each one should already connect to share' | |||
|
100 | 100 | |
|
101 | 101 | 1) Assuming our final url will be http://rc-node-1, Configure `instances_id`, `app.base_url` |
|
102 | 102 | |
|
103 | a) On **rc-node-2** find the following settings and edit :file:` | |
|
103 | a) On **rc-node-2** find the following settings and edit :file:`config/_shared/rhodecode.ini` | |
|
104 | 104 | |
|
105 | 105 | .. code-block:: ini |
|
106 | 106 | |
@@ -109,7 +109,7 b' a) On **rc-node-2** find the following s' | |||
|
109 | 109 | app.base_url = http://rc-node-1 |
|
110 | 110 | |
|
111 | 111 | |
|
112 | b) On **rc-node-3** find the following settings and edit :file:` | |
|
112 | b) On **rc-node-3** find the following settings and edit :file:`config/_shared/rhodecode.ini` | |
|
113 | 113 | |
|
114 | 114 | .. code-block:: ini |
|
115 | 115 | |
@@ -121,7 +121,7 b' b) On **rc-node-3** find the following s' | |||
|
121 | 121 | |
|
122 | 122 | 2) Configure `User Session` to use a shared database. Example config that should be |
|
123 | 123 | changed on both **rc-node-2** and **rc-node-3** . |
|
124 | Edit :file:` | |
|
124 | Edit :file:`config/_shared/rhodecode.ini` | |
|
125 | 125 | |
|
126 | 126 | .. code-block:: ini |
|
127 | 127 | |
@@ -163,7 +163,7 b' 3) Configure stored cached/archive cache' | |||
|
163 | 163 | |
|
164 | 164 | 4) Use shared exception store. Example config that should be |
|
165 | 165 | changed on both **rc-node-2** and **rc-node-3**, and also for VCSServer. |
|
166 | Edit :file:` | |
|
166 | Edit :file:`config/_shared/rhodecode.ini` and | |
|
167 | 167 | :file:`/home/{user}/.rccontrol/{vcsserver-instance-id}/vcsserver.ini` |
|
168 | 168 | and add/change following setting. |
|
169 | 169 |
@@ -15,7 +15,7 b' scalability, and maintainability we reco' | |||
|
15 | 15 | sessions to database-based user sessions or Redis based sessions. |
|
16 | 16 | |
|
17 | 17 | To switch to database-based user sessions uncomment the following section in |
|
18 | your :file:` | |
|
18 | your :file:`config/_shared/rhodecode.ini` file. | |
|
19 | 19 | |
|
20 | 20 | |
|
21 | 21 | .. code-block:: ini |
@@ -49,7 +49,7 b' uses, or if required it can be a differe' | |||
|
49 | 49 | |
|
50 | 50 | |
|
51 | 51 | To switch to redis-based user sessions uncomment the following section in |
|
52 | your :file:` | |
|
52 | your :file:`config/_shared/rhodecode.ini` file. | |
|
53 | 53 | |
|
54 | 54 | .. code-block:: ini |
|
55 | 55 |
@@ -52,7 +52,7 b' To configure a |RCE| instance to use a V' | |||
|
52 | 52 | The following list shows the available options on the |RCE| side of the |
|
53 | 53 | connection to the VCS Server. The settings are configured per |
|
54 | 54 | instance in the |
|
55 | :file:` | |
|
55 | :file:`config/_shared/rhodecode.ini` file. | |
|
56 | 56 | |
|
57 | 57 | .. rst-class:: dl-horizontal |
|
58 | 58 |
@@ -27,7 +27,7 b' of views that have API access enabled by' | |||
|
27 | 27 | edit the |RCE| configuration ``.ini`` file. The default location is: |
|
28 | 28 | |
|
29 | 29 | * |RCE| Pre-2.2.7 :file:`root/rhodecode/data/production.ini` |
|
30 | * |RCE| 3.0 :file:` | |
|
30 | * |RCE| 3.0 :file:`config/_shared/rhodecode.ini` | |
|
31 | 31 | |
|
32 | 32 | To configure the white list, edit this section of the file. In this |
|
33 | 33 | configuration example, API access is granted to the patch/diff raw file and |
@@ -87,7 +87,7 b' 3. Set base_url for instance to enable p' | |||
|
87 | 87 | Hostname is required for the integration to properly set the instance URL. |
|
88 | 88 | |
|
89 | 89 | When your hostname is known (e.g https://code.rhodecode.com) please set it |
|
90 | inside :file:` | |
|
90 | inside :file:`config/_shared/rhodecode.ini` | |
|
91 | 91 | |
|
92 | 92 | add into `[app:main]` section the following configuration: |
|
93 | 93 | |
@@ -111,7 +111,7 b' 4. Add the public key to your user accou' | |||
|
111 | 111 | |
|
112 | 112 | In case of connection problems please set |
|
113 | 113 | `ssh.enable_debug_logging = true` inside the SSH configuration of |
|
114 | :file:` | |
|
114 | :file:`config/_shared/rhodecode.ini` | |
|
115 | 115 | Then add, remove your SSH key and try connecting again. |
|
116 | 116 | Debug logging will be printed to help find the problems on the server side. |
|
117 | 117 |
@@ -4,7 +4,7 b' Set up Email' | |||
|
4 | 4 | ------------ |
|
5 | 5 | |
|
6 | 6 | To setup email with your |RCE| instance, open the default |
|
7 | :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` | |
|
7 | :file:`config/_shared/rhodecode.ini` | |
|
8 | 8 | file and uncomment and configure the email section. If it is not there, |
|
9 | 9 | use the below example to insert it. |
|
10 | 10 |
@@ -4,7 +4,8 b'' | |||
|
4 | 4 | Release Date |
|
5 | 5 | ^^^^^^^^^^^^ |
|
6 | 6 | |
|
7 | - TBA | |
|
7 | ||
|
8 | - 2024-05-14 | |
|
8 | 9 | |
|
9 | 10 | |
|
10 | 11 | New Features |
@@ -12,21 +13,20 b' New Features' | |||
|
12 | 13 | |
|
13 | 14 | - Full support of Python3 and Python3.11 |
|
14 | 15 | - Git repositories with LFS object are now pushing and pulling the LFS objects when remote sync is enabled. |
|
15 | - Archive generation: implemented a new system for caching generated archive files that allows setting cache size limit | |
|
16 | see: `archive_cache.cache_size_gb=` option. | |
|
16 | - Archive generation: implemented a new system for caching generated archive files that allows setting a cache size limit, see the `archive_cache.cache_size_gb=` option. | |
|
17 | 17 | - Introduced statsd metrics in various places for new monitoring stack to provide useful details on traffic and usage. |
|
18 | 18 | |
|
19 | 19 | |
|
20 | 20 | General |
|
21 | 21 | ^^^^^^^ |
|
22 | 22 | |
|
23 | - Upgraded all dependency libraries to their latest available versions | |
|
24 | - Dropped support for deprecated hgsubversion no longer available in python3 | |
|
23 | - Upgraded all dependency libraries to their latest available versions for python3 compatibility | |
|
25 | 24 | |
|
26 | 25 | |
|
27 | 26 | Security |
|
28 | 27 | ^^^^^^^^ |
|
29 | 28 | |
|
29 | - Fixed a few edge cases of permission invalidation on change of permissions | |
|
30 | 30 | |
|
31 | 31 | |
|
32 | 32 | Performance |
@@ -38,6 +38,7 b' Performance' | |||
|
38 | 38 | Fixes |
|
39 | 39 | ^^^^^ |
|
40 | 40 | |
|
41 | - Various small fixes and improvements found during python3 migration | |
|
41 | 42 | |
|
42 | 43 | |
|
43 | 44 | Upgrade notes |
@@ -9,6 +9,11 b' Release Notes' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | ||
|
13 | release-notes-5.1.0.rst | |
|
14 | release-notes-5.0.3.rst | |
|
15 | release-notes-5.0.2.rst | |
|
16 | release-notes-5.0.1.rst | |
|
12 | 17 | release-notes-5.0.0.rst |
|
13 | 18 | |
|
14 | 19 |
@@ -27,7 +27,7 b' 1. Install a new instance of |RCE|, choo' | |||
|
27 | 27 | |
|
28 | 28 | Once the new instance is installed you need to update the licence token and |
|
29 | 29 | database connection string in the |
|
30 | :file:` | |
|
30 | :file:`config/_shared/rhodecode.ini` file. | |
|
31 | 31 | |
|
32 | 32 | .. code-block:: bash |
|
33 | 33 |
@@ -47,7 +47,6 b'' | |||
|
47 | 47 | "moment": "^2.18.1", |
|
48 | 48 | "mousetrap": "^1.6.1", |
|
49 | 49 | "polymer-webpack-loader": "^2.0.1", |
|
50 | "qrious": "^4.0.2", | |
|
51 | 50 | "raw-loader": "1.0.0-beta.0", |
|
52 | 51 | "sticky-sidebar": "3.3.1", |
|
53 | 52 | "style-loader": "^0.21.0", |
@@ -21,3 +21,7 b' markers =' | |||
|
21 | 21 | skip_backends: Mark tests as skipped for given backends. |
|
22 | 22 | backends: Mark backends |
|
23 | 23 | dbs: database markers for running tests for given DB |
|
24 | ||
|
25 | env = | |
|
26 | RC_TEST=1 | |
|
27 | RUN_ENV=test |
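The `env =` block added to the pytest configuration is the syntax used by the pytest-env plugin, which exports those variables before tests run. A minimal sketch of how code under test might consume such flags; the helper name is invented for illustration:

```python
import os

def is_test_run(environ=None) -> bool:
    """True when running under the test environment set in pytest.ini.

    Checks the RC_TEST=1 and RUN_ENV=test variables from the config above.
    """
    environ = os.environ if environ is None else environ
    return environ.get("RC_TEST") == "1" and environ.get("RUN_ENV") == "test"
```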
@@ -1,9 +1,9 b'' | |||
|
1 | 1 | # deps, generated via pipdeptree --exclude setuptools,wheel,pipdeptree,pip -f | tr '[:upper:]' '[:lower:]' |
|
2 | 2 | |
|
3 | alembic==1.1 | |
|
3 | alembic==1.13.1 | |
|
4 | 4 | mako==1.2.4 |
|
5 | 5 | markupsafe==2.1.2 |
|
6 | sqlalchemy==1.4.5 | |
|
6 | sqlalchemy==1.4.52 | |
|
7 | 7 | greenlet==3.0.3 |
|
8 | 8 | typing_extensions==4.9.0 |
|
9 | 9 | async-timeout==4.0.3 |
@@ -27,13 +27,13 b' celery==5.3.6' | |||
|
27 | 27 | vine==5.1.0 |
|
28 | 28 | python-dateutil==2.8.2 |
|
29 | 29 | six==1.16.0 |
|
30 | tzdata==202 | |
|
30 | tzdata==2024.1 | |
|
31 | 31 | vine==5.1.0 |
|
32 | 32 | channelstream==0.7.1 |
|
33 | 33 | gevent==24.2.1 |
|
34 | 34 | greenlet==3.0.3 |
|
35 | 35 | zope.event==5.0.0 |
|
36 | zope.interface==6. | |
|
36 | zope.interface==6.3.0 | |
|
37 | 37 | itsdangerous==1.1.0 |
|
38 | 38 | marshmallow==2.18.0 |
|
39 | 39 | pyramid==2.0.2 |
@@ -46,9 +46,7 b' channelstream==0.7.1' | |||
|
46 | 46 | venusian==3.0.0 |
|
47 | 47 | webob==1.8.7 |
|
48 | 48 | zope.deprecation==5.0.0 |
|
49 | zope.interface==6. | |
|
50 | pyramid-apispec==0.3.3 | |
|
51 | apispec==1.3.3 | |
|
49 | zope.interface==6.3.0 | |
|
52 | 50 | pyramid-jinja2==2.10 |
|
53 | 51 | jinja2==3.1.2 |
|
54 | 52 | markupsafe==2.1.2 |
@@ -63,7 +61,7 b' channelstream==0.7.1' | |||
|
63 | 61 | venusian==3.0.0 |
|
64 | 62 | webob==1.8.7 |
|
65 | 63 | zope.deprecation==5.0.0 |
|
66 | zope.interface==6. | |
|
64 | zope.interface==6.3.0 | |
|
67 | 65 | zope.deprecation==5.0.0 |
|
68 | 66 | python-dateutil==2.8.2 |
|
69 | 67 | six==1.16.0 |
@@ -82,20 +80,20 b' deform==2.0.15' | |||
|
82 | 80 | peppercorn==0.6 |
|
83 | 81 | translationstring==1.4 |
|
84 | 82 | zope.deprecation==5.0.0 |
|
85 | diskcache==5.6.3 | |
|
86 | 83 | docutils==0.19 |
|
87 | dogpile.cache==1.3. | |
|
84 | dogpile.cache==1.3.3 | |
|
88 | 85 | decorator==5.1.1 |
|
89 | 86 | stevedore==5.1.0 |
|
90 | 87 | pbr==5.11.1 |
|
91 | 88 | formencode==2.1.0 |
|
92 | 89 | six==1.16.0 |
|
90 | fsspec==2024.6.0 | |
|
93 | 91 | gunicorn==21.2.0 |
|
94 | packaging==2 | |
|
92 | packaging==24.0 | |
|
95 | 93 | gevent==24.2.1 |
|
96 | 94 | greenlet==3.0.3 |
|
97 | 95 | zope.event==5.0.0 |
|
98 | zope.interface==6. | |
|
96 | zope.interface==6.3.0 | |
|
99 | 97 | ipython==8.14.0 |
|
100 | 98 | backcall==0.2.0 |
|
101 | 99 | decorator==5.1.1 |
@@ -116,11 +114,11 b' ipython==8.14.0' | |||
|
116 | 114 | pure-eval==0.2.2 |
|
117 | 115 | traitlets==5.9.0 |
|
118 | 116 | markdown==3.4.3 |
|
119 | msgpack==1.0. | |
|
117 | msgpack==1.0.8 | |
|
120 | 118 | mysqlclient==2.1.1 |
|
121 | 119 | nbconvert==7.7.3 |
|
122 | beautifulsoup4==4.1 | |
|
123 | soupsieve==2. | |
|
120 | beautifulsoup4==4.12.3 | |
|
121 | soupsieve==2.5 | |
|
124 | 122 | bleach==6.1.0 |
|
125 | 123 | six==1.16.0 |
|
126 | 124 | webencodings==0.5.1 |
@@ -165,20 +163,15 b' nbconvert==7.7.3' | |||
|
165 | 163 | platformdirs==3.10.0 |
|
166 | 164 | traitlets==5.9.0 |
|
167 | 165 | traitlets==5.9.0 |
|
168 | packaging==23.1 | |
|
169 | 166 | pandocfilters==1.5.0 |
|
170 | 167 | pygments==2.15.1 |
|
171 | 168 | tinycss2==1.2.1 |
|
172 | 169 | webencodings==0.5.1 |
|
173 | 170 | traitlets==5.9.0 |
|
174 | orjson==3. | |
|
175 | paste | |
|
176 | paste==3.7.1 | |
|
177 | six==1.16.0 | |
|
178 | pastedeploy==3.1.0 | |
|
179 | six==1.16.0 | |
|
171 | orjson==3.10.3 | |
|
172 | paste==3.10.1 | |
|
180 | 173 | premailer==3.10.0 |
|
181 | cachetools==5.3. | |
|
174 | cachetools==5.3.3 | |
|
182 | 175 | cssselect==1.2.0 |
|
183 | 176 | cssutils==2.6.0 |
|
184 | 177 | lxml==4.9.3 |
@@ -194,11 +187,11 b' pycmarkgfm==1.2.0' | |||
|
194 | 187 | cffi==1.16.0 |
|
195 | 188 | pycparser==2.21 |
|
196 | 189 | pycryptodome==3.17 |
|
197 | pycurl==7.45. | |
|
190 | pycurl==7.45.3 | |
|
198 | 191 | pymysql==1.0.3 |
|
199 | 192 | pyotp==2.8.0 |
|
200 | 193 | pyparsing==3.1.1 |
|
201 | pyramid-debugtoolbar==4.11 | |
|
194 | pyramid-debugtoolbar==4.12.1 | |
|
202 | 195 | pygments==2.15.1 |
|
203 | 196 | pyramid==2.0.2 |
|
204 | 197 | hupper==1.12 |
@@ -210,7 +203,7 b' pyramid-debugtoolbar==4.11' | |||
|
210 | 203 | venusian==3.0.0 |
|
211 | 204 | webob==1.8.7 |
|
212 | 205 | zope.deprecation==5.0.0 |
|
213 | zope.interface==6. | |
|
206 | zope.interface==6.3.0 | |
|
214 | 207 | pyramid-mako==1.1.0 |
|
215 | 208 | mako==1.2.4 |
|
216 | 209 | markupsafe==2.1.2 |
@@ -224,7 +217,7 b' pyramid-debugtoolbar==4.11' | |||
|
224 | 217 | venusian==3.0.0 |
|
225 | 218 | webob==1.8.7 |
|
226 | 219 | zope.deprecation==5.0.0 |
|
227 | zope.interface==6. | |
|
220 | zope.interface==6.3.0 | |
|
228 | 221 | pyramid-mailer==0.15.1 |
|
229 | 222 | pyramid==2.0.2 |
|
230 | 223 | hupper==1.12 |
@@ -236,13 +229,13 b' pyramid-mailer==0.15.1' | |||
|
236 | 229 | venusian==3.0.0 |
|
237 | 230 | webob==1.8.7 |
|
238 | 231 | zope.deprecation==5.0.0 |
|
239 | zope.interface==6. | |
|
232 | zope.interface==6.3.0 | |
|
240 | 233 | repoze.sendmail==4.4.1 |
|
241 | 234 | transaction==3.1.0 |
|
242 | zope.interface==6. | |
|
243 | zope.interface==6. | |
|
235 | zope.interface==6.3.0 | |
|
236 | zope.interface==6.3.0 | |
|
244 | 237 | transaction==3.1.0 |
|
245 | zope.interface==6. | |
|
238 | zope.interface==6.3.0 | |
|
246 | 239 | python-ldap==3.4.3 |
|
247 | 240 | pyasn1==0.4.8 |
|
248 | 241 | pyasn1-modules==0.2.8 |
@@ -257,39 +250,64 b' python3-saml==1.15.0' | |||
|
257 | 250 | xmlsec==1.3.13 |
|
258 | 251 | lxml==4.9.3 |
|
259 | 252 | pyyaml==6.0.1 |
|
260 | redis==5.0. | |
|
253 | redis==5.0.4 | |
|
254 | async-timeout==4.0.3 | |
|
261 | 255 | regex==2022.10.31 |
|
262 | 256 | routes==2.5.1 |
|
263 | 257 | repoze.lru==0.7 |
|
264 | 258 | six==1.16.0 |
|
265 | simplejson==3.19.1 | |
|
259 | s3fs==2024.6.0 | |
|
260 | aiobotocore==2.13.0 | |
|
261 | aiohttp==3.9.5 | |
|
262 | aiosignal==1.3.1 | |
|
263 | frozenlist==1.4.1 | |
|
264 | attrs==22.2.0 | |
|
265 | frozenlist==1.4.1 | |
|
266 | multidict==6.0.5 | |
|
267 | yarl==1.9.4 | |
|
268 | idna==3.4 | |
|
269 | multidict==6.0.5 | |
|
270 | aioitertools==0.11.0 | |
|
271 | botocore==1.34.106 | |
|
272 | jmespath==1.0.1 | |
|
273 | python-dateutil==2.8.2 | |
|
274 | six==1.16.0 | |
|
275 | urllib3==1.26.14 | |
|
276 | wrapt==1.16.0 | |
|
277 | aiohttp==3.9.5 | |
|
278 | aiosignal==1.3.1 | |
|
279 | frozenlist==1.4.1 | |
|
280 | attrs==22.2.0 | |
|
281 | frozenlist==1.4.1 | |
|
282 | multidict==6.0.5 | |
|
283 | yarl==1.9.4 | |
|
284 | idna==3.4 | |
|
285 | multidict==6.0.5 | |
|
286 | fsspec==2024.6.0 | |
|
287 | simplejson==3.19.2 | |
|
266 | 288 | sshpubkeys==3.3.1 |
|
267 | 289 | cryptography==40.0.2 |
|
268 | 290 | cffi==1.16.0 |
|
269 | 291 | pycparser==2.21 |
|
270 | 292 | ecdsa==0.18.0 |
|
271 | 293 | six==1.16.0 |
|
272 | sqlalchemy==1.4.5 | |
|
294 | sqlalchemy==1.4.52 | |
|
273 | 295 | greenlet==3.0.3 |
|
274 | 296 | typing_extensions==4.9.0 |
|
275 | 297 | supervisor==4.2.5 |
|
276 | 298 | tzlocal==4.3 |
|
277 | 299 | pytz-deprecation-shim==0.1.0.post0 |
|
278 | tzdata==202 | |
|
300 | tzdata==2024.1 | |
|
301 | tempita==0.5.2 | |
|
279 | 302 | unidecode==1.3.6 |
|
280 | 303 | urlobject==2.4.3 |
|
281 | 304 | waitress==3.0.0 |
|
282 | weberror==0.13.1 | |
|
283 | paste==3.7.1 | |
|
284 | six==1.16.0 | |
|
285 | pygments==2.15.1 | |
|
286 | tempita==0.5.2 | |
|
287 | webob==1.8.7 | |
|
288 | webhelpers2==2.0 | |
|
305 | webhelpers2==2.1 | |
|
289 | 306 | markupsafe==2.1.2 |
|
290 | 307 | six==1.16.0 |
|
291 | 308 | whoosh==2.7.4 |
|
292 | 309 | zope.cachedescriptors==5.0.0 |
|
310 | qrcode==7.4.2 | |
|
293 | 311 | |
|
294 | 312 | ## uncomment to add the debug libraries |
|
295 | 313 | #-r requirements_debug.txt |
@@ -1,43 +1,45 b'' | |||
|
1 | 1 | # test related requirements |
|
2 | ||
|
3 | cov-core==1.15.0 | |
|
4 | coverage==7. | |
|
5 | mock==5.0.2 | |
|
6 | py==1.11.0 | |
|
7 | pytest-cov==4.0.0 | |
|
8 | coverage==7.2.3 | |
|
9 | pytest==7.3.1 | |
|
10 | attrs==22.2.0 | |
|
2 | mock==5.1.0 | |
|
3 | pytest-cov==4.1.0 | |
|
4 | coverage==7.4.3 | |
|
5 | pytest==8.1.1 | |
|
11 | 6 | iniconfig==2.0.0 |
|
12 | packaging==2 | |
|
13 | pluggy==1. | |
|
14 | pytest- | |
|
7 | packaging==24.0 | |
|
8 | pluggy==1.4.0 | |
|
9 | pytest-env==1.1.3 | |
|
10 | pytest==8.1.1 | |
|
11 | iniconfig==2.0.0 | |
|
12 | packaging==24.0 | |
|
13 | pluggy==1.4.0 | |
|
15 | 14 | pytest-profiling==1.7.0 |
|
16 | 15 | gprof2dot==2022.7.29 |
|
17 | pytest== | |
|
18 | attrs==22.2.0 | |
|
16 | pytest==8.1.1 | |
|
19 | 17 | iniconfig==2.0.0 |
|
20 | packaging==2 | |
|
21 | pluggy==1. | |
|
18 | packaging==24.0 | |
|
19 | pluggy==1.4.0 | |
|
22 | 20 | six==1.16.0 |
|
23 | pytest-r | |
|
24 | pytest-sugar==0.9.7 | |
|
25 | packaging==23.1 | |
|
26 | pytest==7.3.1 | |
|
27 | attrs==22.2.0 | |
|
21 | pytest-rerunfailures==13.0 | |
|
22 | packaging==24.0 | |
|
23 | pytest==8.1.1 | |
|
28 | 24 | iniconfig==2.0.0 |
|
29 | packaging==2 | |
|
30 | pluggy==1. | |
|
31 | termcolor==2.3.0 | |
|
32 | pytest- | |
|
33 | pytest==7.3.1 | |
|
34 | attrs==22.2.0 | |
|
25 | packaging==24.0 | |
|
26 | pluggy==1.4.0 | |
|
27 | pytest-runner==6.0.1 | |
|
28 | pytest-sugar==1.0.0 | |
|
29 | packaging==24.0 | |
|
30 | pytest==8.1.1 | |
|
35 | 31 | iniconfig==2.0.0 |
|
36 | packaging==2 | |
|
37 | pluggy==1. | |
|
32 | packaging==24.0 | |
|
33 | pluggy==1.4.0 | |
|
34 | termcolor==2.4.0 | |
|
35 | pytest-timeout==2.3.1 | |
|
36 | pytest==8.1.1 | |
|
37 | iniconfig==2.0.0 | |
|
38 | packaging==24.0 | |
|
39 | pluggy==1.4.0 | |
|
38 | 40 | webtest==3.0.0 |
|
39 | beautifulsoup4==4.1 | |
|
40 | soupsieve==2. | |
|
41 | beautifulsoup4==4.12.3 | |
|
42 | soupsieve==2.5 | |
|
41 | 43 | waitress==3.0.0 |
|
42 | 44 | webob==1.8.7 |
|
43 | 45 |
@@ -82,10 +82,10 b' PYRAMID_SETTINGS = {}' | |||
|
82 | 82 | EXTENSIONS = {} |
|
83 | 83 | |
|
84 | 84 | __version__ = ('.'.join((str(each) for each in VERSION[:3]))) |
|
85 | __dbversion__ = 11 | |
|
85 | __dbversion__ = 115 # defines current db version for migrations | |
|
86 | 86 | __license__ = 'AGPLv3, and Commercial License' |
|
87 | 87 | __author__ = 'RhodeCode GmbH' |
|
88 | 88 | __url__ = 'https://code.rhodecode.com' |
|
89 | 89 | |
|
90 | is_test = False | |
|
90 | is_test = os.getenv('RC_TEST', '0') == '1' | |
|
91 | 91 | disable_error_handler = False |
@@ -22,7 +22,6 b' import sys' | |||
|
22 | 22 | import fnmatch |
|
23 | 23 | |
|
24 | 24 | import decorator |
|
25 | import typing | |
|
26 | 25 | import venusian |
|
27 | 26 | from collections import OrderedDict |
|
28 | 27 | |
@@ -45,7 +44,8 b' from rhodecode.model.db import User, Use' | |||
|
45 | 44 | log = logging.getLogger(__name__) |
|
46 | 45 | |
|
47 | 46 | DEFAULT_RENDERER = 'jsonrpc_renderer' |
|
48 | DEFAULT_URL = '/_admin/api | |
|
47 | DEFAULT_URL = '/_admin/api' | |
|
48 | SERVICE_API_IDENTIFIER = 'service_' | |
|
49 | 49 | |
|
50 | 50 | |
|
51 | 51 | def find_methods(jsonrpc_methods, pattern): |
@@ -54,7 +54,9 b' def find_methods(jsonrpc_methods, patter' | |||
|
54 | 54 | pattern = [pattern] |
|
55 | 55 | |
|
56 | 56 | for single_pattern in pattern: |
|
57 | for method_name, method in | |
|
57 | for method_name, method in filter( | |
|
58 | lambda x: not x[0].startswith(SERVICE_API_IDENTIFIER), jsonrpc_methods.items() | |
|
59 | ): | |
|
58 | 60 | if fnmatch.fnmatch(method_name, single_pattern): |
|
59 | 61 | matches[method_name] = method |
|
60 | 62 | return matches |
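The `find_methods` hunk above wraps the dict iteration in a `filter` so that service-API methods (names prefixed `service_`) never match a lookup pattern. A self-contained sketch of that behavior, using the same names as the diff:

```python
import fnmatch

SERVICE_API_IDENTIFIER = 'service_'


def find_methods(jsonrpc_methods, pattern):
    """Return methods matching any glob pattern, excluding service_* entries."""
    matches = {}
    if not isinstance(pattern, list):
        pattern = [pattern]
    for single_pattern in pattern:
        # service-API methods are filtered out before glob matching happens
        for method_name, method in filter(
            lambda x: not x[0].startswith(SERVICE_API_IDENTIFIER),
            jsonrpc_methods.items(),
        ):
            if fnmatch.fnmatch(method_name, single_pattern):
                matches[method_name] = method
    return matches


registry = {'get_repo': object(), 'get_user': object(), 'service_get_info': object()}
found = find_methods(registry, '*')  # matches everything except service_* names
```

Here `registry` is an illustrative stand-in for the real JSON-RPC method registry; the filtering logic is taken verbatim from the hunk.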
@@ -190,6 +192,7 b' def request_view(request):' | |||
|
190 | 192 | # check if we can find this session using api_key, get_by_auth_token |
|
191 | 193 | # search not expired tokens only |
|
192 | 194 | try: |
|
195 | if not request.rpc_method.startswith(SERVICE_API_IDENTIFIER): | |
|
193 | 196 | api_user = User.get_by_auth_token(request.rpc_api_key) |
|
194 | 197 | |
|
195 | 198 | if api_user is None: |
@@ -227,6 +230,10 b' def request_view(request):' | |||
|
227 | 230 | return jsonrpc_error( |
|
228 | 231 | request, retid=request.rpc_id, |
|
229 | 232 | message='API KEY invalid or, has bad role for an API call') |
|
233 | else: | |
|
234 | auth_u = 'service' | |
|
235 | if request.rpc_api_key != request.registry.settings['app.service_api.token']: | |
|
236 | raise Exception("Provided service secret is not recognized!") | |
|
230 | 237 | |
|
231 | 238 | except Exception: |
|
232 | 239 | log.exception('Error on API AUTH') |
@@ -290,7 +297,8 b' def request_view(request):' | |||
|
290 | 297 | }) |
|
291 | 298 | |
|
292 | 299 | # register some common functions for usage |
|
293 | attach_context_attributes(TemplateArgs(), request, request.rpc_user.user_id) | |
|
300 | rpc_user = request.rpc_user.user_id if hasattr(request, 'rpc_user') else None | |
|
301 | attach_context_attributes(TemplateArgs(), request, rpc_user) | |
|
294 | 302 | |
|
295 | 303 | statsd = request.registry.statsd |
|
296 | 304 |
@@ -41,7 +41,7 b' class TestCreateRepo(object):' | |||
|
41 | 41 | @pytest.mark.parametrize('given, expected_name, expected_exc', [ |
|
42 | 42 | ('api repo-1', 'api-repo-1', False), |
|
43 | 43 | ('api-repo 1-ąć', 'api-repo-1-ąć', False), |
|
44 | ( | |
|
44 | ('unicode-ąć', u'unicode-ąć', False), | |
|
45 | 45 | ('some repo v1.2', 'some-repo-v1.2', False), |
|
46 | 46 | ('v2.0', 'v2.0', False), |
|
47 | 47 | ]) |
@@ -211,8 +211,8 b' class TestCreateRepoGroup(object):' | |||
|
211 | 211 | |
|
212 | 212 | expected = { |
|
213 | 213 | 'repo_group': |
|
214 | | |
|
215 | | |
|
214 | 'You do not have the permission to store ' | |
|
215 | 'repository groups in the root location.'} | |
|
216 | 216 | assert_error(id_, expected, given=response.body) |
|
217 | 217 | |
|
218 | 218 | def test_api_create_repo_group_regular_user_no_parent_group_perms(self): |
@@ -232,8 +232,8 b' class TestCreateRepoGroup(object):' | |||
|
232 | 232 | |
|
233 | 233 | expected = { |
|
234 | 234 | 'repo_group': |
|
235 | | |
|
236 | | |
|
235 | "You do not have the permissions to store " | |
|
236 | "repository groups inside repository group `{}`".format(repo_group_name)} | |
|
237 | 237 | try: |
|
238 | 238 | assert_error(id_, expected, given=response.body) |
|
239 | 239 | finally: |
@@ -76,8 +76,8 b' class TestApiGetGist(object):' | |||
|
76 | 76 | 'url': 'http://%s/_admin/gists/%s' % (http_host_only_stub, gist_id,), |
|
77 | 77 | 'acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
78 | 78 | 'content': { |
|
79 | | |
|
80 | | |
|
79 | 'filename1.txt': 'hello world', | |
|
80 | 'filename1Ä….txt': 'hello worldÄ™' | |
|
81 | 81 | }, |
|
82 | 82 | } |
|
83 | 83 |
@@ -25,7 +25,7 b' from rhodecode.api import (' | |||
|
25 | 25 | |
|
26 | 26 | from rhodecode.api.utils import ( |
|
27 | 27 | Optional, OAttr, has_superadmin_permission, get_user_or_error) |
|
28 | from rhodecode.lib.utils import repo2db_mapper | |
|
28 | from rhodecode.lib.utils import repo2db_mapper, get_rhodecode_repo_store_path | |
|
29 | 29 | from rhodecode.lib import system_info |
|
30 | 30 | from rhodecode.lib import user_sessions |
|
31 | 31 | from rhodecode.lib import exc_tracking |
@@ -33,7 +33,6 b' from rhodecode.lib.ext_json import json' | |||
|
33 | 33 | from rhodecode.lib.utils2 import safe_int |
|
34 | 34 | from rhodecode.model.db import UserIpMap |
|
35 | 35 | from rhodecode.model.scm import ScmModel |
|
36 | from rhodecode.model.settings import VcsSettingsModel | |
|
37 | 36 | from rhodecode.apps.file_store import utils |
|
38 | 37 | from rhodecode.apps.file_store.exceptions import FileNotAllowedException, \ |
|
39 | 38 | FileOverSizeException |
@@ -103,7 +102,7 b' def get_repo_store(request, apiuser):' | |||
|
103 | 102 | if not has_superadmin_permission(apiuser): |
|
104 | 103 | raise JSONRPCForbidden() |
|
105 | 104 | |
|
106 | path = VcsSettingsModel().get_repos_location() | |
|
105 | path = get_rhodecode_repo_store_path() | |
|
107 | 106 | return {"path": path} |
|
108 | 107 | |
|
109 | 108 |
@@ -104,6 +104,11 b' class TemplateArgs(StrictAttributeDict):' | |||
|
104 | 104 | |
|
105 | 105 | |
|
106 | 106 | class BaseAppView(object): |
|
107 | DONT_CHECKOUT_VIEWS = ["channelstream_connect", "ops_ping"] | |
|
108 | EXTRA_VIEWS_TO_IGNORE = ['login', 'register', 'logout'] | |
|
109 | SETUP_2FA_VIEW = 'setup_2fa' | |
|
110 | VERIFY_2FA_VIEW = 'check_2fa' | |
|
111 | ||
|
107 | 112 | def __init__(self, context, request): |
|
108 | 113 | self.request = request |
|
109 | 114 | self.context = context |
@@ -117,13 +122,19 b' class BaseAppView(object):' | |||
|
117 | 122 | |
|
118 | 123 | self._rhodecode_user = request.user # auth user |
|
119 | 124 | self._rhodecode_db_user = self._rhodecode_user.get_instance() |
|
125 | self.user_data = self._rhodecode_db_user.user_data if self._rhodecode_db_user else {} | |
|
120 | 126 | self._maybe_needs_password_change( |
|
121 | 127 | request.matched_route.name, self._rhodecode_db_user |
|
122 | 128 | ) |
|
129 | self._maybe_needs_2fa_configuration( | |
|
130 | request.matched_route.name, self._rhodecode_db_user | |
|
131 | ) | |
|
132 | self._maybe_needs_2fa_check( | |
|
133 | request.matched_route.name, self._rhodecode_db_user | |
|
134 | ) | |
|
123 | 135 | |
|
124 | 136 | def _maybe_needs_password_change(self, view_name, user_obj): |
|
125 | dont_check_views = ["channelstream_connect", "ops_ping"] | |
|
126 | if view_name in dont_check_views: | |
|
137 | if view_name in self.DONT_CHECKOUT_VIEWS: | |
|
127 | 138 | return |
|
128 | 139 | |
|
129 | 140 | log.debug( |
@@ -133,6 +144,7 b' class BaseAppView(object):' | |||
|
133 | 144 | skip_user_views = [ |
|
134 | 145 | "logout", |
|
135 | 146 | "login", |
|
147 | "check_2fa", | |
|
136 | 148 | "my_account_password", |
|
137 | 149 | "my_account_password_update", |
|
138 | 150 | ] |
@@ -144,7 +156,7 b' class BaseAppView(object):' | |||
|
144 | 156 | return |
|
145 | 157 | |
|
146 | 158 | now = time.time() |
|
147 | should_change = | |
|
159 | should_change = self.user_data.get("force_password_change") | |
|
148 | 160 | change_after = safe_int(should_change) or 0 |
|
149 | 161 | if should_change and now > change_after: |
|
150 | 162 | log.debug("User %s requires password change", user_obj) |
@@ -157,6 +169,33 b' class BaseAppView(object):' | |||
|
157 | 169 | if view_name not in skip_user_views: |
|
158 | 170 | raise HTTPFound(self.request.route_path("my_account_password")) |
|
159 | 171 | |
|
172 | def _maybe_needs_2fa_configuration(self, view_name, user_obj): | |
|
173 | if view_name in self.DONT_CHECKOUT_VIEWS + self.EXTRA_VIEWS_TO_IGNORE: | |
|
174 | return | |
|
175 | ||
|
176 | if not user_obj: | |
|
177 | return | |
|
178 | ||
|
179 | if user_obj.needs_2fa_configure and view_name != self.SETUP_2FA_VIEW: | |
|
180 | h.flash( | |
|
181 | "You are required to configure 2FA", | |
|
182 | "warning", | |
|
183 | ignore_duplicate=False, | |
|
184 | ) | |
|
185 | # Special case for users created "on the fly" (ldap case for new user) | |
|
186 | user_obj.check_2fa_required = False | |
|
187 | raise HTTPFound(self.request.route_path(self.SETUP_2FA_VIEW)) | |
|
188 | ||
|
189 | def _maybe_needs_2fa_check(self, view_name, user_obj): | |
|
190 | if view_name in self.DONT_CHECKOUT_VIEWS + self.EXTRA_VIEWS_TO_IGNORE: | |
|
191 | return | |
|
192 | ||
|
193 | if not user_obj: | |
|
194 | return | |
|
195 | ||
|
196 | if user_obj.check_2fa_required and view_name != self.VERIFY_2FA_VIEW: | |
|
197 | raise HTTPFound(self.request.route_path(self.VERIFY_2FA_VIEW)) | |
|
198 | ||
|
160 | 199 | def _log_creation_exception(self, e, repo_name): |
|
161 | 200 | _ = self.request.translate |
|
162 | 201 | reason = None |
@@ -676,6 +715,7 b' class BaseReferencesView(RepoAppView):' | |||
|
676 | 715 | { |
|
677 | 716 | "name": _render("name", ref_name, files_url, closed), |
|
678 | 717 | "name_raw": ref_name, |
|
718 | "closed": closed, | |
|
679 | 719 | "date": _render("date", commit.date), |
|
680 | 720 | "date_raw": datetime_to_time(commit.date), |
|
681 | 721 | "author": _render("author", commit.author), |
@@ -446,8 +446,8 b' class TestAdminRepos(object):' | |||
|
446 | 446 | csrf_token=csrf_token)) |
|
447 | 447 | |
|
448 | 448 | response.mustcontain( |
|
449 |
|
|
|
450 |
|
|
|
449 | "You do not have the permission to store repositories in " | |
|
450 | "the root location.") | |
|
451 | 451 | |
|
452 | 452 | @mock.patch.object(RepoModel, '_create_filesystem_repo', error_function) |
|
453 | 453 | def test_create_repo_when_filesystem_op_fails( |
@@ -485,7 +485,7 b' class TestAdminSystemInfo(object):' | |||
|
485 | 485 | update_data = { |
|
486 | 486 | 'versions': [ |
|
487 | 487 | { |
|
488 | 'version': '100. | |
|
488 | 'version': '100.0.0', | |
|
489 | 489 | 'general': 'The latest version we are ever going to ship' |
|
490 | 490 | }, |
|
491 | 491 | { |
@@ -502,15 +502,15 b' class TestAdminSystemInfo(object):' | |||
|
502 | 502 | update_data = { |
|
503 | 503 | 'versions': [ |
|
504 | 504 | { |
|
505 | 'version': ' | |
|
505 | 'version': '4.0.0', | |
|
506 | 506 | 'general': 'The first version we ever shipped' |
|
507 | 507 | } |
|
508 | 508 | ] |
|
509 | 509 | } |
|
510 | text = f"Your current version, {rhodecode.__version__}, is up-to-date as it is equal to or newer than the latest available version, 4.0.0." | |
|
510 | 511 | with mock.patch(UPDATE_DATA_QUALNAME, return_value=update_data): |
|
511 | 512 | response = self.app.get(route_path('admin_settings_system_update')) |
|
512 | response.mustcontain( | |
|
513 | 'This instance is already running the <b>latest</b> stable version') | |
|
513 | response.mustcontain(text) | |
|
514 | 514 | |
|
515 | 515 | def test_system_update_bad_response(self, autologin_user): |
|
516 | 516 | with mock.patch(UPDATE_DATA_QUALNAME, side_effect=ValueError('foo')): |
@@ -28,7 +28,7 b' from pyramid.renderers import render' | |||
|
28 | 28 | from pyramid.response import Response |
|
29 | 29 | |
|
30 | 30 | from rhodecode.apps._base import BaseAppView, DataGridAppView |
|
31 | from rhodecode.apps.ssh_support import SshKeyFileChangeEvent | |
|
31 | from rhodecode.apps.ssh_support.events import SshKeyFileChangeEvent | |
|
32 | 32 | from rhodecode import events |
|
33 | 33 | |
|
34 | 34 | from rhodecode.lib import helpers as h |
@@ -32,13 +32,13 b' from pyramid.response import Response' | |||
|
32 | 32 | |
|
33 | 33 | from rhodecode.apps._base import BaseAppView |
|
34 | 34 | from rhodecode.apps._base.navigation import navigation_list |
|
35 | from rhodecode.apps.svn_support | |
|
35 | from rhodecode.apps.svn_support import config_keys | |
|
36 | 36 | from rhodecode.lib import helpers as h |
|
37 | 37 | from rhodecode.lib.auth import ( |
|
38 | 38 | LoginRequired, HasPermissionAllDecorator, CSRFRequired) |
|
39 | 39 | from rhodecode.lib.celerylib import tasks, run_task |
|
40 | 40 | from rhodecode.lib.str_utils import safe_str |
|
41 | from rhodecode.lib.utils import repo2db_mapper | |
|
41 | from rhodecode.lib.utils import repo2db_mapper, get_rhodecode_repo_store_path | |
|
42 | 42 | from rhodecode.lib.utils2 import str2bool, AttributeDict |
|
43 | 43 | from rhodecode.lib.index import searcher_from_config |
|
44 | 44 | |
@@ -113,10 +113,8 b' class AdminSettingsView(BaseAppView):' | |||
|
113 | 113 | model = VcsSettingsModel() |
|
114 | 114 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
115 | 115 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
116 | ||
|
117 | settings = self.request.registry.settings | |
|
118 | c.svn_proxy_generate_config = settings[generate_config] | |
|
119 | ||
|
116 | c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config) | |
|
117 | c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path) | |
|
120 | 118 | defaults = self._form_defaults() |
|
121 | 119 | |
|
122 | 120 | model.create_largeobjects_dirs_if_needed(defaults['paths_root_path']) |
@@ -143,9 +141,8 b' class AdminSettingsView(BaseAppView):' | |||
|
143 | 141 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
144 | 142 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
145 | 143 | |
|
146 | settings = self.request.registry.settings | |
|
147 | c.svn_proxy_generate_config = settings[generate_config] | |
|
148 | ||
|
144 | c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config) | |
|
145 | c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path) | |
|
149 | 146 | application_form = ApplicationUiSettingsForm(self.request.translate)() |
|
150 | 147 | |
|
151 | 148 | try: |
@@ -167,9 +164,6 b' class AdminSettingsView(BaseAppView):' | |||
|
167 | 164 | return Response(html) |
|
168 | 165 | |
|
169 | 166 | try: |
|
170 | if c.visual.allow_repo_location_change: | |
|
171 | model.update_global_path_setting(form_result['paths_root_path']) | |
|
172 | ||
|
173 | 167 | model.update_global_ssl_setting(form_result['web_push_ssl']) |
|
174 | 168 | model.update_global_hook_settings(form_result) |
|
175 | 169 | |
@@ -217,7 +211,7 b' class AdminSettingsView(BaseAppView):' | |||
|
217 | 211 | def settings_mapping(self): |
|
218 | 212 | c = self.load_default_context() |
|
219 | 213 | c.active = 'mapping' |
|
220 | c.storage_path = | |
|
214 | c.storage_path = get_rhodecode_repo_store_path() | |
|
221 | 215 | data = render('rhodecode:templates/admin/settings/settings.mako', |
|
222 | 216 | self._get_template_context(c), self.request) |
|
223 | 217 | html = formencode.htmlfill.render( |
@@ -165,15 +165,20 b' class AdminSystemInfoSettingsView(BaseAp' | |||
|
165 | 165 | (_('Storage location'), val('storage')['path'], state('storage')), |
|
166 | 166 | (_('Storage info'), val('storage')['text'], state('storage')), |
|
167 | 167 | (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')), |
|
168 | ('', '', ''), # spacer | |
|
168 | 169 | |
|
169 | 170 | (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')), |
|
170 | 171 | (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')), |
|
172 | ('', '', ''), # spacer | |
|
171 | 173 | |
|
174 | (_('Archive cache storage type'), val('storage_archive')['type'], state('storage_archive')), | |
|
172 | 175 | (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')), |
|
173 | 176 | (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')), |
|
177 | ('', '', ''), # spacer | |
|
174 | 178 | |
|
175 | 179 | (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')), |
|
176 | 180 | (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')), |
|
181 | ('', '', ''), # spacer | |
|
177 | 182 | |
|
178 | 183 | (_('Search info'), val('search')['text'], state('search')), |
|
179 | 184 | (_('Search location'), val('search')['location'], state('search')), |
@@ -209,7 +214,8 b' class AdminSystemInfoSettingsView(BaseAp' | |||
|
209 | 214 | update_url = UpdateModel().get_update_url() |
|
210 | 215 | |
|
211 | 216 | def _err(s): |
|
212 | return '<div style="color:#ff8888; padding:4px 0px">{}</div>' | |
|
217 | return f'<div style="color:#ff8888; padding:4px 0px">{s}</div>' | |
|
218 | ||
|
213 | 219 | try: |
|
214 | 220 | data = UpdateModel().get_update_data(update_url) |
|
215 | 221 | except urllib.error.URLError as e: |
@@ -225,12 +231,12 b' class AdminSystemInfoSettingsView(BaseAp' | |||
|
225 | 231 | |
|
226 | 232 | c.update_url = update_url |
|
227 | 233 | c.latest_data = latest |
|
228 | c.latest_ver = latest['version'] | |
|
229 | c.cur_ver = rhodecode.__version__ | |
|
234 | c.latest_ver = (latest['version'] or '').strip() | |
|
235 | c.cur_ver = self.request.GET.get('ver') or rhodecode.__version__ | |
|
230 | 236 | c.should_upgrade = False |
|
231 | 237 | |
|
232 | is_oudated = UpdateModel().is_outdated(c.cur_ver, c.latest_ver) | |
|
233 | if is_oudated: | |
|
238 | is_outdated = UpdateModel().is_outdated(c.cur_ver, c.latest_ver) | |
|
239 | if is_outdated: | |
|
234 | 240 | c.should_upgrade = True |
|
235 | 241 | c.important_notices = latest['general'] |
|
236 | 242 | UpdateModel().store_version(latest['version']) |
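The hunk above fixes the `is_oudated` typo and strips whitespace from the latest-version string before comparing. The real `UpdateModel.is_outdated` implementation is not shown in this diff; a minimal tuple-based version comparison (an assumption for illustration, not RhodeCode's actual code) behaves like this:

```python
def is_outdated(current_version: str, latest_version: str) -> bool:
    """Naive dotted-version compare; assumes purely numeric x.y.z components."""
    def parse(ver: str) -> tuple:
        # '5.0.3 ' -> (5, 0, 3); stripping mirrors the fix in the hunk above
        return tuple(int(part) for part in ver.strip().split('.')[:3])
    return parse(current_version) < parse(latest_version)


needs_upgrade = is_outdated('4.27.0', ' 5.0.3 ')   # older major version
up_to_date = is_outdated('5.0.3', '5.0.3')         # equal versions
```

Tuple comparison orders versions component by component, which is why a plain string compare (where '4.9' > '4.10') would be wrong here.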
@@ -27,7 +27,7 b' from pyramid.response import Response' | |||
|
27 | 27 | |
|
28 | 28 | from rhodecode import events |
|
29 | 29 | from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView |
|
30 | from rhodecode.apps.ssh_support import SshKeyFileChangeEvent | |
|
30 | from rhodecode.apps.ssh_support.events import SshKeyFileChangeEvent | |
|
31 | 31 | from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin |
|
32 | 32 | from rhodecode.authentication.plugins import auth_rhodecode |
|
33 | 33 | from rhodecode.events import trigger |
@@ -214,7 +214,7 b' class AdminUsersView(BaseAppView, DataGr' | |||
|
214 | 214 | html = formencode.htmlfill.render( |
|
215 | 215 | data, |
|
216 | 216 | defaults=errors.value, |
|
217 | errors=errors. | |
|
217 | errors=errors.error_dict or {}, | |
|
218 | 218 | prefix_error=False, |
|
219 | 219 | encoding="UTF-8", |
|
220 | 220 | force_defaults=False |
@@ -75,3 +75,27 b' def includeme(config):' | |||
|
75 | 75 | LoginView, |
|
76 | 76 | attr='password_reset_confirmation', |
|
77 | 77 | route_name='reset_password_confirmation', request_method='GET') |
|
78 | ||
|
79 | config.add_route( | |
|
80 | name='setup_2fa', | |
|
81 | pattern=ADMIN_PREFIX + '/setup_2fa') | |
|
82 | config.add_view( | |
|
83 | LoginView, | |
|
84 | attr='setup_2fa', | |
|
85 | route_name='setup_2fa', request_method=['GET', 'POST'], | |
|
86 | renderer='rhodecode:templates/configure_2fa.mako') | |
|
87 | ||
|
88 | config.add_route( | |
|
89 | name='check_2fa', | |
|
90 | pattern=ADMIN_PREFIX + '/check_2fa') | |
|
91 | config.add_view( | |
|
92 | LoginView, | |
|
93 | attr='verify_2fa', | |
|
94 | route_name='check_2fa', request_method='GET', | |
|
95 | renderer='rhodecode:templates/verify_2fa.mako') | |
|
96 | config.add_view( | |
|
97 | LoginView, | |
|
98 | attr='verify_2fa', | |
|
99 | route_name='check_2fa', request_method='POST', | |
|
100 | renderer='rhodecode:templates/verify_2fa.mako') | |
|
101 |
@@ -80,6 +80,18 b' class TestLoginController(object):' | |||
|
80 | 80 | assert username == 'test_regular' |
|
81 | 81 | response.mustcontain('logout') |
|
82 | 82 | |
|
83 | def test_login_with_primary_email(self): | |
|
84 | user_email = 'test_regular@mail.com' | |
|
85 | response = self.app.post(route_path('login'), | |
|
86 | {'username': user_email, | |
|
87 | 'password': 'test12'}, status=302) | |
|
88 | response = response.follow() | |
|
89 | session = response.get_session_from_response() | |
|
90 | user = session['rhodecode_user'] | |
|
91 | assert user['username'] == user_email.split('@')[0] | |
|
92 | assert user['is_authenticated'] | |
|
93 | response.mustcontain('logout') | |
|
94 | ||
|
83 | 95 | def test_login_regular_forbidden_when_super_admin_restriction(self): |
|
84 | 96 | from rhodecode.authentication.plugins.auth_rhodecode import RhodeCodeAuthPlugin |
|
85 | 97 | with fixture.auth_restriction(self.app._pyramid_registry, |
@@ -254,7 +266,7 b' class TestLoginController(object):' | |||
|
254 | 266 | ) |
|
255 | 267 | |
|
256 | 268 | assertr = response.assert_response() |
|
257 | msg = | |
|
269 | msg = 'This e-mail address is already taken' | |
|
258 | 270 | assertr.element_contains('#email+.error-message', msg) |
|
259 | 271 | |
|
260 | 272 | def test_register_err_same_email_case_sensitive(self): |
@@ -270,7 +282,7 b' class TestLoginController(object):' | |||
|
270 | 282 | } |
|
271 | 283 | ) |
|
272 | 284 | assertr = response.assert_response() |
|
273 | msg = | |
|
285 | msg = 'This e-mail address is already taken' | |
|
274 | 286 | assertr.element_contains('#email+.error-message', msg) |
|
275 | 287 | |
|
276 | 288 | def test_register_err_wrong_data(self): |
@@ -423,7 +435,7 b' class TestLoginController(object):' | |||
|
423 | 435 | 'If such email exists, a password reset link was sent to it.') |
|
424 | 436 | |
|
425 | 437 | # BAD KEY |
|
426 | confirm_url = | |
|
438 | confirm_url = route_path('reset_password_confirmation', params={'key': 'badkey'}) | |
|
427 | 439 | response = self.app.get(confirm_url, status=302) |
|
428 | 440 | assert response.location.endswith(route_path('reset_password')) |
|
429 | 441 | assert_session_flash(response, 'Given reset token is invalid') |
@@ -17,6 +17,9 b'' | |||
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | 18 | |
|
19 | 19 | import time |
|
20 | import json | |
|
21 | import pyotp | |
|
22 | import qrcode | |
|
20 | 23 | import collections |
|
21 | 24 | import datetime |
|
22 | 25 | import formencode |
@@ -24,10 +27,14 b' import formencode.htmlfill' | |||
|
24 | 27 | import logging |
|
25 | 28 | import urllib.parse |
|
26 | 29 | import requests |
|
30 | from io import BytesIO | |
|
31 | from base64 import b64encode | |
|
27 | 32 | |
|
33 | from pyramid.renderers import render | |
|
34 | from pyramid.response import Response | |
|
28 | 35 | from pyramid.httpexceptions import HTTPFound |
|
29 | 36 | |
|
30 | ||
|
37 | import rhodecode | |
|
31 | 38 | from rhodecode.apps._base import BaseAppView |
|
32 | 39 | from rhodecode.authentication.base import authenticate, HTTP_TYPE |
|
33 | 40 | from rhodecode.authentication.plugins import auth_rhodecode |
@@ -35,12 +42,12 b' from rhodecode.events import UserRegiste' | |||
|
35 | 42 | from rhodecode.lib import helpers as h |
|
36 | 43 | from rhodecode.lib import audit_logger |
|
37 | 44 | from rhodecode.lib.auth import ( |
|
38 | AuthUser, HasPermissionAnyDecorator, CSRFRequired) | |
|
45 | AuthUser, HasPermissionAnyDecorator, CSRFRequired, LoginRequired, NotAnonymous) | |
|
39 | 46 | from rhodecode.lib.base import get_ip_addr |
|
40 | 47 | from rhodecode.lib.exceptions import UserCreationError |
|
41 | 48 | from rhodecode.lib.utils2 import safe_str |
|
42 | 49 | from rhodecode.model.db import User, UserApiKeys |
|
43 | from rhodecode.model.forms import LoginForm, RegisterForm, PasswordResetForm | |
|
50 | from rhodecode.model.forms import LoginForm, RegisterForm, PasswordResetForm, TOTPForm | |
|
44 | 51 | from rhodecode.model.meta import Session |
|
45 | 52 | from rhodecode.model.auth_token import AuthTokenModel |
|
46 | 53 | from rhodecode.model.settings import SettingsModel |
@@ -54,8 +61,8 b' CaptchaData = collections.namedtuple(' | |||
|
54 | 61 | 'CaptchaData', 'active, private_key, public_key') |
|
55 | 62 | |
|
56 | 63 | |
|
57 | def store_user_in_session(session, user | |
|
58 | user = User.get_by_username(user | |
|
64 | def store_user_in_session(session, user_identifier, remember=False): | |
|
65 | user = User.get_by_username_or_primary_email(user_identifier) | |
|
59 | 66 | auth_user = AuthUser(user.user_id) |
|
60 | 67 | auth_user.set_authenticated() |
|
61 | 68 | cs = auth_user.get_cookie_store() |
@@ -74,7 +81,7 b' def store_user_in_session(session, usern' | |||
|
74 | 81 | safe_cs = cs.copy() |
|
75 | 82 | safe_cs['password'] = '****' |
|
76 | 83 | log.info('user %s is now authenticated and stored in ' |
|
77 | 'session, session attrs %s', username, safe_cs) | |
|
|
84 | 'session, session attrs %s', user_identifier, safe_cs) | |
|
78 | 85 | |
|
79 | 86 | # dumps session attrs back to cookie |
|
80 | 87 | session._update_cookie_out() |
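The hunk above copies the cookie store and masks the password before logging session attributes. A generic, stdlib-only sketch of that log-masking pattern (the helper name and key list are hypothetical, not RhodeCode code):

```python
def mask_sensitive(attrs, sensitive=('password', 'secret', 'token')):
    """Return a copy of a session/cookie dict that is safe to log."""
    safe = dict(attrs)
    for key in safe:
        # mask any key that looks credential-like, keep the rest verbatim
        if any(word in key.lower() for word in sensitive):
            safe[key] = '****'
    return safe

cookie_store = {'username': 'admin', 'password': 'hunter2', 'auth_token': 'abc'}
print(mask_sensitive(cookie_store))
# {'username': 'admin', 'password': '****', 'auth_token': '****'}
```

Copying before masking matters: the original dict keeps the real values the session code still needs.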
@@ -179,9 +186,13 b' class LoginView(BaseAppView):' | |||
|
179 | 186 | self.session.invalidate() |
|
180 | 187 | form_result = login_form.to_python(self.request.POST) |
|
181 | 188 | # form checks for username/password, now we're authenticated |
|
189 | username = form_result['username'] | |
|
190 | if (user := User.get_by_username_or_primary_email(username)).has_enabled_2fa: | |
|
191 | user.check_2fa_required = True | |
|
192 | ||
|
182 | 193 | headers = store_user_in_session( |
|
183 | 194 | self.session, |
|
184 | username=form_result['username'], | |
|
|
195 | user_identifier=username, | |
|
185 | 196 | remember=form_result['remember']) |
|
186 | 197 | log.debug('Redirecting to "%s" after login.', c.came_from) |
|
187 | 198 | |
@@ -438,12 +449,12 b' class LoginView(BaseAppView):' | |||
|
438 | 449 | |
|
439 | 450 | def password_reset_confirmation(self): |
|
440 | 451 | self.load_default_context() |
|
441 | if self.request.GET and self.request.GET.get('key'): | |
|
452 | ||
|
453 | if key := self.request.GET.get('key'): | |
|
442 | 454 | # make this take 2s, to prevent brute forcing. |
|
443 | 455 | time.sleep(2) |
|
444 | 456 | |
|
445 | token = AuthTokenModel().get_auth_token( | |
|
446 | self.request.GET.get('key')) | |
|
457 | token = AuthTokenModel().get_auth_token(key) | |
|
447 | 458 | |
|
448 | 459 | # verify token is the correct role |
|
449 | 460 | if token is None or token.role != UserApiKeys.ROLE_PASSWORD_RESET: |
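The `password_reset_confirmation` refactor above replaces a double `request.GET.get('key')` lookup with an assignment expression. A minimal standalone illustration of the same walrus-operator pattern (the token store here is a hypothetical stand-in for `AuthTokenModel`):

```python
def lookup_token(key):
    # hypothetical token store standing in for AuthTokenModel
    return {'abc123': 'reset-token'}.get(key)

params = {'key': 'abc123'}

# Before: params.get('key') was evaluated twice (once in the test, once in the call).
# After: the walrus operator (Python 3.8+) binds the value and truth-tests it at once.
if key := params.get('key'):
    token = lookup_token(key)
else:
    token = None

print(token)  # reset-token
```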
@@ -467,3 +478,76 b' class LoginView(BaseAppView):' | |||
|
467 | 478 | return HTTPFound(self.request.route_path('reset_password')) |
|
468 | 479 | |
|
469 | 480 | return HTTPFound(self.request.route_path('login')) |
|
481 | ||
|
482 | @LoginRequired() | |
|
483 | @NotAnonymous() | |
|
484 | def setup_2fa(self): | |
|
485 | _ = self.request.translate | |
|
486 | c = self.load_default_context() | |
|
487 | user_instance = self._rhodecode_db_user | |
|
488 | form = TOTPForm(_, user_instance)() | |
|
489 | render_ctx = {} | |
|
490 | if self.request.method == 'POST': | |
|
491 | post_items = dict(self.request.POST) | |
|
492 | ||
|
493 | try: | |
|
494 | form_details = form.to_python(post_items) | |
|
495 | secret = form_details['secret_totp'] | |
|
496 | ||
|
497 | user_instance.init_2fa_recovery_codes(persist=True, force=True) | |
|
498 | user_instance.secret_2fa = secret | |
|
499 | ||
|
500 | Session().commit() | |
|
501 | raise HTTPFound(self.request.route_path('my_account_configure_2fa', _query={'show-recovery-codes': 1})) | |
|
502 | except formencode.Invalid as errors: | |
|
503 | defaults = errors.value | |
|
504 | render_ctx = { | |
|
505 | 'errors': errors.error_dict, | |
|
506 | 'defaults': defaults, | |
|
507 | } | |
|
508 | ||
|
509 | # NOTE: here we DO NOT persist the secret 2FA, since this is only for setup, once a setup is completed | |
|
510 | # only then we should persist it | |
|
511 | secret = user_instance.init_secret_2fa(persist=False) | |
|
512 | ||
|
513 | instance_name = rhodecode.ConfigGet().get_str('app.base_url', 'rhodecode') | |
|
514 | totp_name = f'{instance_name}:{self.request.user.username}' | |
|
515 | ||
|
516 | qr = qrcode.QRCode(version=1, box_size=5, border=4) | |
|
517 | qr.add_data(pyotp.totp.TOTP(secret).provisioning_uri(name=totp_name)) | |
|
518 | qr.make(fit=True) | |
|
519 | img = qr.make_image(fill_color='black', back_color='white') | |
|
520 | buffered = BytesIO() | |
|
521 | img.save(buffered) | |
|
522 | return self._get_template_context( | |
|
523 | c, | |
|
524 | qr=b64encode(buffered.getvalue()).decode("utf-8"), | |
|
525 | key=secret, | |
|
526 | totp_name=totp_name, | |
|
527 | ** render_ctx | |
|
528 | ) | |
|
529 | ||
|
530 | @LoginRequired() | |
|
531 | @NotAnonymous() | |
|
532 | def verify_2fa(self): | |
|
533 | _ = self.request.translate | |
|
534 | c = self.load_default_context() | |
|
535 | render_ctx = {} | |
|
536 | user_instance = self._rhodecode_db_user | |
|
537 | totp_form = TOTPForm(_, user_instance, allow_recovery_code_use=True)() | |
|
538 | if self.request.method == 'POST': | |
|
539 | post_items = dict(self.request.POST) | |
|
540 | # NOTE: inject secret, as it's a post configured saved item. | |
|
541 | post_items['secret_totp'] = user_instance.secret_2fa | |
|
542 | try: | |
|
543 | totp_form.to_python(post_items) | |
|
544 | user_instance.check_2fa_required = False | |
|
545 | Session().commit() | |
|
546 | raise HTTPFound(c.came_from) | |
|
547 | except formencode.Invalid as errors: | |
|
548 | defaults = errors.value | |
|
549 | render_ctx = { | |
|
550 | 'errors': errors.error_dict, | |
|
551 | 'defaults': defaults, | |
|
552 | } | |
|
553 | return self._get_template_context(c, **render_ctx) |
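`setup_2fa` above delegates code generation to `pyotp` and renders the provisioning URI as a QR code. For intuition about what `pyotp.totp.TOTP` computes, here is a stdlib-only sketch of RFC 6238 TOTP with the common defaults (SHA-1, 30-second step, 6 digits); it is illustrative, not the RhodeCode implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Minimal RFC 6238 TOTP (HMAC-SHA1) using only the stdlib."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack('>Q', counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack('>I', digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret '12345678901234567890' at T=59
secret = base64.b32encode(b'12345678901234567890').decode()
print(totp(secret, for_time=59))  # 287082
```

The provisioning URI embedded in the QR code simply transports this secret plus the `totp_name` label to the authenticator app, which then runs the same computation.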
@@ -74,6 +74,45 b' def includeme(config):' | |||
|
74 | 74 | route_name='my_account_password_update', request_method='POST', |
|
75 | 75 | renderer='rhodecode:templates/admin/my_account/my_account.mako') |
|
76 | 76 | |
|
77 | # my account 2fa | |
|
78 | config.add_route( | |
|
79 | name='my_account_configure_2fa', | |
|
80 | pattern=ADMIN_PREFIX + '/my_account/configure_2fa') | |
|
81 | config.add_view( | |
|
82 | MyAccountView, | |
|
83 | attr='my_account_2fa', | |
|
84 | route_name='my_account_configure_2fa', request_method='GET', | |
|
85 | renderer='rhodecode:templates/admin/my_account/my_account.mako') | |
|
86 | # my account 2fa save | |
|
87 | config.add_route( | |
|
88 | name='my_account_configure_2fa_update', | |
|
89 | pattern=ADMIN_PREFIX + '/my_account/configure_2fa_update') | |
|
90 | config.add_view( | |
|
91 | MyAccountView, | |
|
92 | attr='my_account_2fa_update', | |
|
93 | route_name='my_account_configure_2fa_update', request_method='POST', | |
|
94 | renderer='rhodecode:templates/admin/my_account/my_account.mako') | |
|
95 | ||
|
96 | # my account 2fa recovery code-reset | |
|
97 | config.add_route( | |
|
98 | name='my_account_show_2fa_recovery_codes', | |
|
99 | pattern=ADMIN_PREFIX + '/my_account/recovery_codes') | |
|
100 | config.add_view( | |
|
101 | MyAccountView, | |
|
102 | attr='my_account_2fa_show_recovery_codes', | |
|
103 | route_name='my_account_show_2fa_recovery_codes', request_method='POST', xhr=True, | |
|
104 | renderer='json_ext') | |
|
105 | ||
|
106 | # my account 2fa recovery code-reset | |
|
107 | config.add_route( | |
|
108 | name='my_account_regenerate_2fa_recovery_codes', | |
|
109 | pattern=ADMIN_PREFIX + '/my_account/regenerate_recovery_codes') | |
|
110 | config.add_view( | |
|
111 | MyAccountView, | |
|
112 | attr='my_account_2fa_regenerate_recovery_codes', | |
|
113 | route_name='my_account_regenerate_2fa_recovery_codes', request_method='POST', | |
|
114 | renderer='rhodecode:templates/admin/my_account/my_account.mako') | |
|
115 | ||
|
77 | 116 | # my account tokens |
|
78 | 117 | config.add_route( |
|
79 | 118 | name='my_account_auth_tokens', |
@@ -16,6 +16,7 b'' | |||
|
16 | 16 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | 18 | |
|
19 | import time | |
|
19 | 20 | import logging |
|
20 | 21 | import datetime |
|
21 | 22 | import string |
@@ -40,9 +41,10 b' from rhodecode.lib.utils2 import safe_in' | |||
|
40 | 41 | from rhodecode.model.auth_token import AuthTokenModel |
|
41 | 42 | from rhodecode.model.comment import CommentsModel |
|
42 | 43 | from rhodecode.model.db import ( |
|
43 | IntegrityError, or_, in_filter_generator, | |
|
44 | IntegrityError, or_, in_filter_generator, select, | |
|
44 | 45 | Repository, UserEmailMap, UserApiKeys, UserFollowing, |
|
45 | 46 | PullRequest, UserBookmark, RepoGroup, ChangesetStatus) |
|
47 | from rhodecode.model.forms import TOTPForm | |
|
46 | 48 | from rhodecode.model.meta import Session |
|
47 | 49 | from rhodecode.model.pull_request import PullRequestModel |
|
48 | 50 | from rhodecode.model.user import UserModel |
@@ -136,6 +138,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
136 | 138 | except forms.ValidationFailure as e: |
|
137 | 139 | c.form = e |
|
138 | 140 | return self._get_template_context(c) |
|
141 | ||
|
139 | 142 | except Exception: |
|
140 | 143 | log.exception("Exception updating user") |
|
141 | 144 | h.flash(_('Error occurred during update of user'), |
@@ -203,6 +206,74 b' class MyAccountView(BaseAppView, DataGri' | |||
|
203 | 206 | |
|
204 | 207 | @LoginRequired() |
|
205 | 208 | @NotAnonymous() |
|
209 | def my_account_2fa(self): | |
|
210 | _ = self.request.translate | |
|
211 | c = self.load_default_context() | |
|
212 | c.active = '2fa' | |
|
213 | user_instance = c.auth_user.get_instance() | |
|
214 | locked_by_admin = user_instance.has_forced_2fa | |
|
215 | c.state_of_2fa = user_instance.has_enabled_2fa | |
|
216 | c.user_seen_2fa_recovery_codes = user_instance.has_seen_2fa_codes | |
|
217 | c.locked_2fa = str2bool(locked_by_admin) | |
|
218 | return self._get_template_context(c) | |
|
219 | ||
|
220 | @LoginRequired() | |
|
221 | @NotAnonymous() | |
|
222 | @CSRFRequired() | |
|
223 | def my_account_2fa_update(self): | |
|
224 | _ = self.request.translate | |
|
225 | c = self.load_default_context() | |
|
226 | c.active = '2fa' | |
|
227 | user_instance = c.auth_user.get_instance() | |
|
228 | ||
|
229 | state = str2bool(self.request.POST.get('2fa_status')) | |
|
230 | user_instance.has_enabled_2fa = state | |
|
231 | user_instance.update_userdata(update_2fa=time.time()) | |
|
232 | Session().commit() | |
|
233 | if state: | |
|
234 | h.flash(_("2FA has been successfully enabled"), category='success') | |
|
235 | else: | |
|
236 | h.flash(_("2FA has been successfully disabled"), category='success') | |
|
237 | raise HTTPFound(self.request.route_path('my_account_configure_2fa')) | |
|
238 | ||
|
239 | @LoginRequired() | |
|
240 | @NotAnonymous() | |
|
241 | @CSRFRequired() | |
|
242 | def my_account_2fa_show_recovery_codes(self): | |
|
243 | c = self.load_default_context() | |
|
244 | user_instance = c.auth_user.get_instance() | |
|
245 | user_instance.has_seen_2fa_codes = True | |
|
246 | Session().commit() | |
|
247 | return {'recovery_codes': user_instance.get_2fa_recovery_codes()} | |
|
248 | ||
|
249 | @LoginRequired() | |
|
250 | @NotAnonymous() | |
|
251 | @CSRFRequired() | |
|
252 | def my_account_2fa_regenerate_recovery_codes(self): | |
|
253 | _ = self.request.translate | |
|
254 | c = self.load_default_context() | |
|
255 | user_instance = c.auth_user.get_instance() | |
|
256 | ||
|
257 | totp_form = TOTPForm(_, user_instance, allow_recovery_code_use=True)() | |
|
258 | ||
|
259 | post_items = dict(self.request.POST) | |
|
260 | # NOTE: inject secret, as it's a post configured saved item. | |
|
261 | post_items['secret_totp'] = user_instance.secret_2fa | |
|
262 | try: | |
|
263 | totp_form.to_python(post_items) | |
|
264 | user_instance.regenerate_2fa_recovery_codes() | |
|
265 | Session().commit() | |
|
266 | except formencode.Invalid as errors: | |
|
267 | h.flash(_("Failed to generate new recovery codes: {}").format(errors), category='error') | |
|
268 | raise HTTPFound(self.request.route_path('my_account_configure_2fa')) | |
|
269 | except Exception as e: | |
|
270 | h.flash(_("Failed to generate new recovery codes: {}").format(e), category='error') | |
|
271 | raise HTTPFound(self.request.route_path('my_account_configure_2fa')) | |
|
272 | ||
|
273 | raise HTTPFound(self.request.route_path('my_account_configure_2fa', _query={'show-recovery-codes': 1})) | |
|
274 | ||
|
275 | @LoginRequired() | |
|
276 | @NotAnonymous() | |
|
206 | 277 | def my_account_auth_tokens(self): |
|
207 | 278 | _ = self.request.translate |
|
208 | 279 | |
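`regenerate_2fa_recovery_codes` above calls `user_instance.regenerate_2fa_recovery_codes()`, whose body is not part of this diff. A plausible stdlib-only sketch of one-time recovery-code generation (the count, length, and alphabet are assumptions, not RhodeCode's actual scheme):

```python
import secrets

def generate_recovery_codes(count=10, length=8):
    """Generate one-time 2FA recovery codes with a CSPRNG."""
    # alphabet skips visually ambiguous characters (0/O, 1/l/i)
    alphabet = '23456789abcdefghjkmnpqrstuvwxyz'
    return [''.join(secrets.choice(alphabet) for _ in range(length))
            for _ in range(count)]

codes = generate_recovery_codes()
print(len(codes))  # 10
```

In a real system each code would be stored hashed and invalidated after one use.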
@@ -483,8 +554,15 b' class MyAccountView(BaseAppView, DataGri' | |||
|
483 | 554 | def my_account_bookmarks(self): |
|
484 | 555 | c = self.load_default_context() |
|
485 | 556 | c.active = 'bookmarks' |
|
486 | c.bookmark_items = UserBookmark.get_bookmarks_for_user( | |
|
487 | self._rhodecode_db_user.user_id, cache=False) | |
|
557 | ||
|
558 | user_bookmarks = \ | |
|
559 | select(UserBookmark, Repository, RepoGroup) \ | |
|
560 | .where(UserBookmark.user_id == self._rhodecode_user.user_id) \ | |
|
561 | .outerjoin(Repository, Repository.repo_id == UserBookmark.bookmark_repo_id) \ | |
|
562 | .outerjoin(RepoGroup, RepoGroup.group_id == UserBookmark.bookmark_repo_group_id) \ | |
|
563 | .order_by(UserBookmark.position.asc()) | |
|
564 | ||
|
565 | c.user_bookmark_items = Session().execute(user_bookmarks).all() | |
|
488 | 566 | return self._get_template_context(c) |
|
489 | 567 | |
|
490 | 568 | def _process_bookmark_entry(self, entry, user_id): |
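The bookmarks query above moves to SQLAlchemy 2.0-style `select()` with two outer joins, because a bookmark targets either a repository or a repo group, never both. The join semantics can be shown with plain SQLite (illustrative schema, not the real RhodeCode tables):

```python
import sqlite3

# LEFT OUTER JOINs keep bookmark rows whose counterpart column is NULL,
# so repo bookmarks and repo-group bookmarks come back in one result set.
conn = sqlite3.connect(':memory:')
conn.executescript("""
    CREATE TABLE repos (repo_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE repo_groups (group_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE bookmarks (
        user_id INTEGER, position INTEGER,
        repo_id INTEGER, group_id INTEGER);
    INSERT INTO repos VALUES (1, 'proj-a');
    INSERT INTO repo_groups VALUES (7, 'group-x');
    INSERT INTO bookmarks VALUES (42, 1, 1, NULL), (42, 0, NULL, 7);
""")
rows = conn.execute("""
    SELECT b.position, r.name, g.name
    FROM bookmarks b
    LEFT OUTER JOIN repos r ON r.repo_id = b.repo_id
    LEFT OUTER JOIN repo_groups g ON g.group_id = b.group_id
    WHERE b.user_id = 42
    ORDER BY b.position ASC
""").fetchall()
print(rows)  # [(0, None, 'group-x'), (1, 'proj-a', None)]
```

An inner join would silently drop every bookmark of the "other" kind, which is why the diff uses `outerjoin` twice.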
@@ -21,7 +21,7 b' import logging' | |||
|
21 | 21 | from pyramid.httpexceptions import HTTPFound |
|
22 | 22 | |
|
23 | 23 | from rhodecode.apps._base import BaseAppView, DataGridAppView |
|
24 | from rhodecode.apps.ssh_support import SshKeyFileChangeEvent | |
|
24 | from rhodecode.apps.ssh_support.events import SshKeyFileChangeEvent | |
|
25 | 25 | from rhodecode.events import trigger |
|
26 | 26 | from rhodecode.lib import helpers as h |
|
27 | 27 | from rhodecode.lib import audit_logger |
@@ -41,6 +41,15 b' def admin_routes(config):' | |||
|
41 | 41 | renderer='json_ext') |
|
42 | 42 | |
|
43 | 43 | config.add_route( |
|
44 | name='ops_celery_error_test', | |
|
45 | pattern='/error-celery') | |
|
46 | config.add_view( | |
|
47 | OpsView, | |
|
48 | attr='ops_celery_error_test', | |
|
49 | route_name='ops_celery_error_test', request_method='GET', | |
|
50 | renderer='json_ext') | |
|
51 | ||
|
52 | config.add_route( | |
|
44 | 53 | name='ops_redirect_test', |
|
45 | 54 | pattern='/redirect') |
|
46 | 55 | config.add_view( |
@@ -66,6 +66,20 b' class OpsView(BaseAppView):' | |||
|
66 | 66 | 'Client:{}. Generation time: {}.'.format(self.request.user, time.time())) |
|
67 | 67 | raise TestException(msg) |
|
68 | 68 | |
|
69 | def ops_celery_error_test(self): | |
|
70 | """ | |
|
71 | Test exception handling and emails on errors | |
|
72 | """ | |
|
73 | from rhodecode.lib.celerylib import tasks, run_task | |
|
74 | ||
|
75 | # add timeout so we add some sort of rate limiter | |
|
76 | time.sleep(2) | |
|
77 | ||
|
78 | msg = ('RhodeCode Enterprise test exception. ' | |
|
79 | 'Client:{}. Generation time: {}.'.format(self.request.user, time.time())) | |
|
80 | celery_task = run_task(tasks.test_celery_exception, msg) | |
|
81 | return {'task': str(celery_task)} | |
|
82 | ||
|
69 | 83 | def ops_redirect_test(self): |
|
70 | 84 | """ |
|
71 | 85 | Test redirect handling |
@@ -591,6 +591,15 b' def includeme(config):' | |||
|
591 | 591 | route_name='branches_home', request_method='GET', |
|
592 | 592 | renderer='rhodecode:templates/branches/branches.mako') |
|
593 | 593 | |
|
594 | config.add_route( | |
|
595 | name='branch_remove', | |
|
596 | pattern='/{repo_name:.*?[^/]}/branches/{branch_name:.*?[^/]}/remove', repo_route=True, repo_accepted_types=['hg', 'git']) | |
|
597 | config.add_view( | |
|
598 | RepoBranchesView, | |
|
599 | attr='remove_branch', | |
|
600 | route_name='branch_remove', request_method='POST' | |
|
601 | ) | |
|
602 | ||
|
594 | 603 | # Bookmarks |
|
595 | 604 | config.add_route( |
|
596 | 605 | name='bookmarks_home', |
@@ -19,6 +19,7 b'' | |||
|
19 | 19 | import pytest |
|
20 | 20 | from rhodecode.model.db import Repository |
|
21 | 21 | from rhodecode.tests.routes import route_path |
|
22 | from rhodecode.tests import assert_session_flash | |
|
22 | 23 | |
|
23 | 24 | |
|
24 | 25 | @pytest.mark.usefixtures('autologin_user', 'app') |
@@ -33,3 +34,50 b' class TestBranchesController(object):' | |||
|
33 | 34 | for commit_id, obj_name in repo.scm_instance().branches.items(): |
|
34 | 35 | assert commit_id in response |
|
35 | 36 | assert obj_name in response |
|
37 | ||
|
38 | def test_landing_branch_delete(self, backend, csrf_token): | |
|
39 | if backend.alias == 'svn': | |
|
40 | pytest.skip("Not supported yet") | |
|
41 | branch_related_data_per_backend = { | |
|
42 | 'git': {'name': 'master'}, | |
|
43 | 'hg': {'name': 'default'}, | |
|
44 | } | |
|
45 | response = self.app.post( | |
|
46 | route_path('branch_remove', repo_name=backend.repo_name, | |
|
47 | branch_name=branch_related_data_per_backend[backend.alias]['name']), | |
|
48 | params={'csrf_token': csrf_token}, status=302) | |
|
49 | assert_session_flash( | |
|
50 | response, | |
|
51 | f"This branch {branch_related_data_per_backend[backend.alias]['name']} cannot be removed as it's currently set as landing branch" | |
|
52 | ) | |
|
53 | ||
|
54 | def test_delete_branch_by_repo_owner(self, backend, csrf_token): | |
|
55 | if backend.alias in ('svn', 'hg'): | |
|
56 | pytest.skip("Skipping for hg and svn") | |
|
57 | branch_to_be_removed = 'remove_me' | |
|
58 | repo = Repository.get_by_repo_name(backend.repo_name) | |
|
59 | repo.scm_instance()._create_branch(branch_to_be_removed, repo.scm_instance().commit_ids[1]) | |
|
60 | response = self.app.post( | |
|
61 | route_path('branch_remove', repo_name=backend.repo_name, | |
|
62 | branch_name=branch_to_be_removed), | |
|
63 | params={'csrf_token': csrf_token}, status=302) | |
|
64 | assert_session_flash(response, f"Branch {branch_to_be_removed} has been successfully deleted") | |
|
65 | ||
|
66 | def test_delete_branch_by_not_repo_owner(self, backend, csrf_token): | |
|
67 | username = 'test_regular' | |
|
68 | pwd = 'test12' | |
|
69 | branch_related_data_per_backend = { | |
|
70 | 'git': {'name': 'master', 'action': 'deleted'}, | |
|
71 | 'hg': {'name': 'stable', 'action': 'closed'}, | |
|
72 | } | |
|
73 | if backend.alias == 'svn': | |
|
74 | pytest.skip("Not supported yet") | |
|
75 | self.app.post(route_path('login'), | |
|
76 | {'username': username, | |
|
77 | 'password': pwd}) | |
|
78 | selected_branch = branch_related_data_per_backend[backend.alias]['name'] | |
|
79 | response = self.app.post( | |
|
80 | route_path('branch_remove', repo_name=backend.repo_name, | |
|
81 | branch_name=selected_branch), | |
|
82 | params={'csrf_token': csrf_token, 'username': username, 'password': pwd}, status=404) | |
|
83 | assert response.status_code == 404 |
@@ -1,4 +1,3 b'' | |||
|
1 | ||
|
2 | 1 |
|
|
3 | 2 | # |
|
4 | 3 | # This program is free software: you can redistribute it and/or modify |
@@ -52,11 +51,11 b' def assert_clone_url(response, server, r' | |||
|
52 | 51 | |
|
53 | 52 | @pytest.mark.usefixtures('app') |
|
54 | 53 | class TestSummaryView(object): |
|
54 | ||
|
55 | 55 | def test_index(self, autologin_user, backend, http_host_only_stub): |
|
56 | 56 | repo_id = backend.repo.repo_id |
|
57 | 57 | repo_name = backend.repo_name |
|
58 | with mock.patch('rhodecode.lib.helpers.is_svn_without_proxy', | |
|
59 | return_value=False): | |
|
58 | ||
|
60 | 59 |
|
|
61 | 60 |
|
|
62 | 61 | |
@@ -71,37 +70,43 b' class TestSummaryView(object):' | |||
|
71 | 70 | |
|
72 | 71 | # clone url... |
|
73 | 72 | assert_clone_url(response, http_host_only_stub, repo_name) |
|
74 | assert_clone_url(response, http_host_only_stub, '_{}'.format(repo_id)) | |
|
|
73 | assert_clone_url(response, http_host_only_stub, f'_{repo_id}') | |
|
75 | 74 | |
|
76 | 75 | def test_index_svn_without_proxy( |
|
77 | 76 | self, autologin_user, backend_svn, http_host_only_stub): |
|
77 | ||
|
78 | 78 | repo_id = backend_svn.repo.repo_id |
|
79 | 79 | repo_name = backend_svn.repo_name |
|
80 | response = self.app.get(route_path('repo_summary', repo_name=repo_name)) | |
|
81 | # clone url... | |
|
80 | ||
|
81 | # by default the SVN is enabled now, this is how inputs look when it's disabled | |
|
82 | with mock.patch('rhodecode.lib.helpers.is_svn_without_proxy', return_value=True): | |
|
82 | 83 | |
|
84 | response = self.app.get( | |
|
85 | route_path('repo_summary', repo_name=repo_name), | |
|
86 | status=200) | |
|
87 | ||
|
88 | # clone url test... | |
|
83 | 89 | assert_clone_url(response, http_host_only_stub, repo_name, disabled=True) |
|
84 | assert_clone_url(response, http_host_only_stub, '_{}'.format(repo_id), disabled=True) | |
|
|
90 | assert_clone_url(response, http_host_only_stub, f'_{repo_id}', disabled=True) | |
|
85 | 91 | |
|
86 | 92 | def test_index_with_trailing_slash( |
|
87 | 93 | self, autologin_user, backend, http_host_only_stub): |
|
88 | 94 | |
|
89 | 95 | repo_id = backend.repo.repo_id |
|
90 | 96 | repo_name = backend.repo_name |
|
91 | with mock.patch('rhodecode.lib.helpers.is_svn_without_proxy', | |
|
92 | return_value=False): | |
|
97 | trailing_slash = '/' | |
|
93 | 98 |
|
|
94 |
|
|
|
99 | route_path('repo_summary', repo_name=repo_name) + trailing_slash, | |
|
95 | 100 |
|
|
96 | 101 | |
|
97 | 102 | # clone url... |
|
98 | 103 | assert_clone_url(response, http_host_only_stub, repo_name) |
|
99 | assert_clone_url(response, http_host_only_stub, '_{}'.format(repo_id)) | |
|
|
104 | assert_clone_url(response, http_host_only_stub, f'_{repo_id}') | |
|
100 | 105 | |
|
101 | 106 | def test_index_by_id(self, autologin_user, backend): |
|
102 | 107 | repo_id = backend.repo.repo_id |
|
103 | 108 | response = self.app.get( |
|
104 | route_path('repo_summary', repo_name='_{}'.format(repo_id))) | |
|
|
109 | route_path('repo_summary', repo_name=f'_{repo_id}')) | |
|
105 | 110 | |
|
106 | 111 | # repo type |
|
107 | 112 | response.mustcontain( |
@@ -18,11 +18,15 b'' | |||
|
18 | 18 | |
|
19 | 19 | import logging |
|
20 | 20 | |
|
21 | from pyramid.httpexceptions import HTTPFound | |
|
21 | 22 | |
|
22 | 23 | from rhodecode.apps._base import BaseReferencesView |
|
23 | 24 | from rhodecode.lib import ext_json |
|
24 | from rhodecode.lib.auth import (LoginRequired, HasRepoPermissionAnyDecorator) | |
|
25 | from rhodecode.lib import helpers as h | |
|
26 | from rhodecode.lib.auth import (LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired) | |
|
25 | 27 | from rhodecode.model.scm import ScmModel |
|
28 | from rhodecode.model.meta import Session | |
|
29 | from rhodecode.model.db import PullRequest | |
|
26 | 30 | |
|
27 | 31 | log = logging.getLogger(__name__) |
|
28 | 32 | |
@@ -33,15 +37,71 b' class RepoBranchesView(BaseReferencesVie' | |||
|
33 | 37 | @HasRepoPermissionAnyDecorator( |
|
34 | 38 | 'repository.read', 'repository.write', 'repository.admin') |
|
35 | 39 | def branches(self): |
|
40 | partial_render = self.request.get_partial_renderer( | |
|
41 | 'rhodecode:templates/data_table/_dt_elements.mako') | |
|
42 | repo_name = self.db_repo_name | |
|
36 | 43 | c = self.load_default_context() |
|
37 | 44 | self._prepare_and_set_clone_url(c) |
|
38 | 45 | c.rhodecode_repo = self.rhodecode_vcs_repo |
|
39 | 46 | c.repository_forks = ScmModel().get_forks(self.db_repo) |
|
40 | ||
|
41 | 47 | ref_items = self.rhodecode_vcs_repo.branches_all.items() |
|
42 | 48 | data = self.load_refs_context( |
|
43 | 49 | ref_items=ref_items, partials_template='branches/branches_data.mako') |
|
44 | ||
|
50 | data_with_actions = [] | |
|
51 | if self.db_repo.repo_type != 'svn': | |
|
52 | for branch in data: | |
|
53 | branch['action'] = partial_render( | |
|
54 | f"branch_actions_{self.db_repo.repo_type}", branch['name_raw'], repo_name, closed=branch['closed'] | |
|
55 | ) | |
|
56 | data_with_actions.append(branch) | |
|
57 | data = data_with_actions | |
|
45 | 58 | c.has_references = bool(data) |
|
46 | 59 | c.data = ext_json.str_json(data) |
|
47 | 60 | return self._get_template_context(c) |
|
61 | ||
|
62 | @LoginRequired() | |
|
63 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') | |
|
64 | @CSRFRequired() | |
|
65 | def remove_branch(self): | |
|
66 | _ = self.request.translate | |
|
67 | self.load_default_context() | |
|
68 | repo = self.db_repo | |
|
69 | repo_name = self.db_repo_name | |
|
70 | repo_type = repo.repo_type | |
|
71 | action = _('deleted') if repo_type == 'git' else _('closed') | |
|
72 | redirect = HTTPFound(location=self.request.route_path('branches_home', repo_name=repo_name)) | |
|
73 | branch_name = self.request.matchdict.get('branch_name') | |
|
74 | if repo.landing_ref_name == branch_name: | |
|
75 | h.flash( | |
|
76 | _("This branch {} cannot be removed as it's currently set as landing branch").format(branch_name), | |
|
77 | category='error' | |
|
78 | ) | |
|
79 | return redirect | |
|
80 | if prs_related_to := Session().query(PullRequest).filter(PullRequest.target_repo_id == repo.repo_id, | |
|
81 | PullRequest.status != PullRequest.STATUS_CLOSED).filter( | |
|
82 | (PullRequest.source_ref.like(f'branch:{branch_name}:%')) | ( | |
|
83 | PullRequest.target_ref.like(f'branch:{branch_name}:%')) | |
|
84 | ).all(): | |
|
85 | h.flash(_("Branch cannot be {} - it's used in following open Pull Request ids: {}").format(action, ','.join( | |
|
86 | map(str, prs_related_to))), category='error') | |
|
87 | return redirect | |
|
88 | ||
|
89 | match repo_type: | |
|
90 | case 'git': | |
|
91 | self.rhodecode_vcs_repo.delete_branch(branch_name) | |
|
92 | case 'hg': | |
|
93 | from rhodecode.lib.vcs.backends.base import Reference | |
|
94 | self.rhodecode_vcs_repo._local_close( | |
|
95 | source_ref=Reference(type='branch', name=branch_name, | |
|
96 | commit_id=self.rhodecode_vcs_repo.branches[branch_name]), | |
|
97 | target_ref=Reference(type='branch', name='', commit_id=None), | |
|
98 | user_name=self.request.user.name, | |
|
99 | user_email=self.request.user.email) | |
|
100 | case _: | |
|
101 | raise NotImplementedError('Branch deleting functionality not yet implemented') | |
|
102 | ScmModel().mark_for_invalidation(repo_name) | |
|
103 | self.rhodecode_vcs_repo._invalidate_prop_cache('commit_ids') | |
|
104 | self.rhodecode_vcs_repo._invalidate_prop_cache('_refs') | |
|
105 | self.rhodecode_vcs_repo._invalidate_prop_cache('branches') | |
|
106 | h.flash(_("Branch {} has been successfully {}").format(branch_name, action), category='success') | |
|
107 | return redirect |
@@ -24,6 +24,8 b' import urllib.request' | |||
|
24 | 24 | import urllib.parse |
|
25 | 25 | import urllib.error |
|
26 | 26 | import pathlib |
|
27 | import time | |
|
28 | import random | |
|
27 | 29 | |
|
28 | 30 | from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound |
|
29 | 31 | |
@@ -37,7 +39,8 b' from rhodecode.apps._base import RepoApp' | |||
|
37 | 39 | from rhodecode.lib import diffs, helpers as h, rc_cache |
|
38 | 40 | from rhodecode.lib import audit_logger |
|
39 | 41 | from rhodecode.lib.hash_utils import sha1_safe |
|
40 | from rhodecode.lib.rc_cache.archive_cache import get_archival_cache_store, get_archival_config, ReentrantLock | |
|
42 | from rhodecode.lib.archive_cache import ( | |
|
43 | get_archival_cache_store, get_archival_config, ArchiveCacheGenerationLock, archive_iterator) | |
|
41 | 44 | from rhodecode.lib.str_utils import safe_bytes, convert_special_chars |
|
42 | 45 | from rhodecode.lib.view_utils import parse_path_ref |
|
43 | 46 | from rhodecode.lib.exceptions import NonRelativePathError |
@@ -417,22 +420,24 b' class RepoFilesView(RepoAppView):' | |||
|
417 | 420 | # NOTE: we get the config to pass to a call to lazy-init the SAME type of cache on vcsserver |
|
418 | 421 | d_cache_conf = get_archival_config(config=CONFIG) |
|
419 | 422 | |
|
423 | # This is also a cache key, and lock key | |
|
420 | 424 | reentrant_lock_key = archive_name_key + '.lock' |
|
421 | with ReentrantLock(d_cache, reentrant_lock_key): | |
|
422 | # This is also a cache key | |
|
425 | ||
|
423 | 426 |
|
|
424 | if archive_name_key in d_cache and not archive_cache_disable: | |
|
425 |
|
|
|
427 | if not archive_cache_disable and archive_name_key in d_cache: | |
|
428 | reader, metadata = d_cache.fetch(archive_name_key) | |
|
429 | ||
|
426 | 430 |
|
|
427 | 431 |
|
|
428 |
|
|
|
432 | archive_name_key, metadata, reader.name) | |
|
429 | 433 |
|
|
430 | 434 |
|
|
431 | 435 |
|
|
432 | 436 | |
|
437 | if not reader: | |
|
433 | 438 | # generate new archive, as previous was not found in the cache |
|
434 |
|
|
|
435 | ||
|
439 | try: | |
|
440 | with d_cache.get_lock(reentrant_lock_key): | |
|
436 | 441 | try: |
|
437 | 442 | commit.archive_repo(archive_name_key, archive_dir_name=archive_dir_name, |
|
438 | 443 | kind=fileformat, subrepos=subrepos, |
@@ -440,18 +445,21 b' class RepoFilesView(RepoAppView):' | |||
|
440 | 445 | except ImproperArchiveTypeError: |
|
441 | 446 | return _('Unknown archive type') |
|
442 | 447 | |
|
443 | reader, tag = d_cache.get(archive_name_key, read=True, tag=True, retry=True) | |
|
444 | ||
|
445 | if not reader: | |
|
446 | raise ValueError('archive cache reader is empty, failed to fetch file from distributed archive cache') | |
|
448 | except ArchiveCacheGenerationLock: | |
|
449 | retry_after = round(random.uniform(0.3, 3.0), 1) | |
|
450 | time.sleep(retry_after) | |
|
447 | 451 | |
|
448 | def archive_iterator(_reader, block_size: int = 4096*512): | |
|
449 | # 4096 * 64 = 64KB | |
|
450 | while 1: | |
|
451 | data = _reader.read(block_size) | |
|
452 | if not data: | |
|
453 | break | |
|
454 | yield data | |
|
452 | location = self.request.url | |
|
453 | response = Response( | |
|
454 | f"archive {archive_name_key} generation in progress, Retry-After={retry_after}, Location={location}" | |
|
455 | ) | |
|
456 | response.headers["Retry-After"] = str(retry_after) | |
|
457 | response.status_code = 307 # temporary redirect | |
|
458 | ||
|
459 | response.location = location | |
|
460 | return response | |
|
461 | ||
|
462 | reader, metadata = d_cache.fetch(archive_name_key, retry=True, retry_attempts=30) | |
|
455 | 463 | |
|
456 | 464 | response = Response(app_iter=archive_iterator(reader)) |
|
457 | 465 | response.content_disposition = f'attachment; filename={response_archive_name}' |
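The inline `archive_iterator` generator removed in this hunk is now imported from `rhodecode.lib.archive_cache`; its chunked-streaming behavior looks roughly like the removed code, sketched here against an in-memory buffer:

```python
import io

def archive_iterator(reader, block_size: int = 4096 * 512):
    """Stream a file-like object in fixed-size chunks (4096 * 512 = 2 MiB)."""
    while True:
        data = reader.read(block_size)
        if not data:
            break
        yield data

payload = io.BytesIO(b'x' * 10_000)
chunks = list(archive_iterator(payload, block_size=4096))
print([len(c) for c in chunks])  # [4096, 4096, 1808]
```

Wrapping the cache reader in a generator lets `Response(app_iter=...)` stream large archives without loading them fully into memory.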
@@ -24,7 +24,9 b' from pyramid.httpexceptions import HTTPF' | |||
|
24 | 24 | from pyramid.response import Response |
|
25 | 25 | from pyramid.renderers import render |
|
26 | 26 | |
|
27 | import rhodecode | |
|
27 | 28 | from rhodecode.apps._base import RepoAppView |
|
29 | from rhodecode.apps.svn_support import config_keys | |
|
28 | 30 | from rhodecode.lib import helpers as h |
|
29 | 31 | from rhodecode.lib.auth import ( |
|
30 | 32 | LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired) |
@@ -38,8 +40,6 b' log = logging.getLogger(__name__)' | |||
|
38 | 40 | class RepoSettingsVcsView(RepoAppView): |
|
39 | 41 | def load_default_context(self): |
|
40 | 42 | c = self._get_local_tmpl_context() |
|
41 | ||
|
42 | ||
|
43 | 43 | return c |
|
44 | 44 | |
|
45 | 45 | def _vcs_form_defaults(self, repo_name): |
@@ -77,6 +77,9 b' class RepoSettingsVcsView(RepoAppView):' | |||
|
77 | 77 | c.svn_branch_patterns = model.get_repo_svn_branch_patterns() |
|
78 | 78 | c.svn_tag_patterns = model.get_repo_svn_tag_patterns() |
|
79 | 79 | |
|
80 | c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config) | |
|
81 | c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path) | |
|
82 | ||
|
80 | 83 | defaults = self._vcs_form_defaults(self.db_repo_name) |
|
81 | 84 | c.inherit_global_settings = defaults['inherit_global_settings'] |
|
82 | 85 | |
@@ -103,6 +106,8 b' class RepoSettingsVcsView(RepoAppView):' | |||
|
103 | 106 | c.global_svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
104 | 107 | c.svn_branch_patterns = model.get_repo_svn_branch_patterns() |
|
105 | 108 | c.svn_tag_patterns = model.get_repo_svn_tag_patterns() |
|
109 | c.svn_generate_config = rhodecode.ConfigGet().get_bool(config_keys.generate_config) | |
|
110 | c.svn_config_path = rhodecode.ConfigGet().get_str(config_keys.config_file_path) | |
|
106 | 111 | |
|
107 | 112 | defaults = self._vcs_form_defaults(self.db_repo_name) |
|
108 | 113 | c.inherit_global_settings = defaults['inherit_global_settings'] |
@@ -19,8 +19,6 b'' | |||
|
19 | 19 | import logging |
|
20 | 20 | |
|
21 | 21 | from . import config_keys |
|
22 | from .events import SshKeyFileChangeEvent | |
|
23 | from .subscribers import generate_ssh_authorized_keys_file_subscriber | |
|
24 | 22 | |
|
25 | 23 | from rhodecode.config.settings_maker import SettingsMaker |
|
26 | 24 | |
@@ -42,9 +40,9 b' def _sanitize_settings_and_apply_default' | |||
|
42 | 40 | settings_maker.make_setting(config_keys.wrapper_cmd, '') |
|
43 | 41 | settings_maker.make_setting(config_keys.authorized_keys_line_ssh_opts, '') |
|
44 | 42 | |
|
45 | settings_maker.make_setting(config_keys.ssh_hg_bin, ' |
|
46 | settings_maker.make_setting(config_keys.ssh_git_bin, ' |
|
47 | settings_maker.make_setting(config_keys.ssh_svn_bin, ' |
|
43 | settings_maker.make_setting(config_keys.ssh_hg_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/hg') | |
|
44 | settings_maker.make_setting(config_keys.ssh_git_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/git') | |
|
45 | settings_maker.make_setting(config_keys.ssh_svn_bin, '/usr/local/bin/rhodecode_bin/vcs_bin/svnserve') | |
|
48 | 46 | |
|
49 | 47 | settings_maker.env_expand() |
|
50 | 48 | |
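The hunk above swaps the default SSH binary paths handed to `SettingsMaker` and then calls `env_expand()`. As a rough illustration of that pattern — apply a default only when the key is absent, then expand environment references in values — here is a simplified stand-in (the class and method names mirror the diff but this is not RhodeCode's actual `SettingsMaker`):

```python
import os


class SettingsMakerSketch:
    """Sketch: typed defaults plus on-demand env expansion."""

    def __init__(self, settings: dict):
        self.settings = settings

    def make_setting(self, key: str, default):
        # only fill in the default when the ini did not set the key
        self.settings.setdefault(key, default)

    def env_expand(self):
        # expand ${VAR} references inside string values
        for key, value in self.settings.items():
            if isinstance(value, str):
                self.settings[key] = os.path.expandvars(value)


settings = {'ssh.executable.hg': '${RC_BIN}/hg'}
maker = SettingsMakerSketch(settings)
maker.make_setting('ssh.executable.git', '/usr/local/bin/rhodecode_bin/vcs_bin/git')
os.environ['RC_BIN'] = '/opt/bin'
maker.env_expand()
```

With `RC_BIN=/opt/bin`, the hg entry expands to `/opt/bin/hg` while the explicitly defaulted git path is left untouched.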
@@ -55,5 +53,8 b' def includeme(config):' | |||
|
55 | 53 | |
|
56 | 54 | # if we have enable generation of file, subscribe to event |
|
57 | 55 | if settings[config_keys.generate_authorized_keyfile]: |
|
56 | # lazy import here for faster code reading... via sshwrapper-v2 mode | |
|
57 | from .subscribers import generate_ssh_authorized_keys_file_subscriber | |
|
58 | from .events import SshKeyFileChangeEvent | |
|
58 | 59 | config.add_subscriber( |
|
59 | 60 | generate_ssh_authorized_keys_file_subscriber, SshKeyFileChangeEvent) |
@@ -20,11 +20,10 b' import os' | |||
|
20 | 20 | import re |
|
21 | 21 | import logging |
|
22 | 22 | import datetime |
|
23 | import configparser | |
|
24 | 23 | from sqlalchemy import Table |
|
25 | 24 | |
|
25 | from rhodecode.lib.api_utils import call_service_api | |
|
26 | 26 | from rhodecode.lib.utils2 import AttributeDict |
|
27 | from rhodecode.model.scm import ScmModel | |
|
28 | 27 | |
|
29 | 28 | from .hg import MercurialServer |
|
30 | 29 | from .git import GitServer |
@@ -38,7 +37,7 b' class SshWrapper(object):' | |||
|
38 | 37 | svn_cmd_pat = re.compile(r'^svnserve -t') |
|
39 | 38 | |
|
40 | 39 | def __init__(self, command, connection_info, mode, |
|
41 | user, user_id, key_id: int, shell, ini_path: str, env): | |
|
40 | user, user_id, key_id: int, shell, ini_path: str, settings, env): | |
|
42 | 41 | self.command = command |
|
43 | 42 | self.connection_info = connection_info |
|
44 | 43 | self.mode = mode |
@@ -48,15 +47,9 b' class SshWrapper(object):' | |||
|
48 | 47 | self.shell = shell |
|
49 | 48 | self.ini_path = ini_path |
|
50 | 49 | self.env = env |
|
51 | ||
|
52 | self.config = self.parse_config(ini_path) | |
|
50 | self.settings = settings | |
|
53 | 51 | self.server_impl = None |
|
54 | 52 | |
|
55 | def parse_config(self, config_path): | |
|
56 | parser = configparser.ConfigParser() | |
|
57 | parser.read(config_path) | |
|
58 | return parser | |
|
59 | ||
|
60 | 53 | def update_key_access_time(self, key_id): |
|
61 | 54 | from rhodecode.model.meta import raw_query_executor, Base |
|
62 | 55 | |
@@ -161,6 +154,9 b' class SshWrapper(object):' | |||
|
161 | 154 | return vcs_type, repo_name, mode |
|
162 | 155 | |
|
163 | 156 | def serve(self, vcs, repo, mode, user, permissions, branch_permissions): |
|
157 | # TODO: remove this once we have .ini defined access path... | |
|
158 | from rhodecode.model.scm import ScmModel | |
|
159 | ||
|
164 | 160 | store = ScmModel().repos_path |
|
165 | 161 | |
|
166 | 162 | check_branch_perms = False |
@@ -185,7 +181,7 b' class SshWrapper(object):' | |||
|
185 | 181 | server = MercurialServer( |
|
186 | 182 | store=store, ini_path=self.ini_path, |
|
187 | 183 | repo_name=repo, user=user, |
|
188 | user_permissions=permissions, |
|
184 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
189 | 185 | self.server_impl = server |
|
190 | 186 | return server.run(tunnel_extras=extras) |
|
191 | 187 | |
@@ -193,7 +189,7 b' class SshWrapper(object):' | |||
|
193 | 189 | server = GitServer( |
|
194 | 190 | store=store, ini_path=self.ini_path, |
|
195 | 191 | repo_name=repo, repo_mode=mode, user=user, |
|
196 | user_permissions=permissions, |
|
192 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
197 | 193 | self.server_impl = server |
|
198 | 194 | return server.run(tunnel_extras=extras) |
|
199 | 195 | |
@@ -201,7 +197,7 b' class SshWrapper(object):' | |||
|
201 | 197 | server = SubversionServer( |
|
202 | 198 | store=store, ini_path=self.ini_path, |
|
203 | 199 | repo_name=None, user=user, |
|
204 | user_permissions=permissions, |
|
200 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
205 | 201 | self.server_impl = server |
|
206 | 202 | return server.run(tunnel_extras=extras) |
|
207 | 203 | |
@@ -261,3 +257,131 b' class SshWrapper(object):' | |||
|
261 | 257 | exit_code = -1 |
|
262 | 258 | |
|
263 | 259 | return exit_code |
|
260 | ||
|
261 | ||
|
262 | class SshWrapperStandalone(SshWrapper): | |
|
263 | """ | |
|
264 | New version of SshWrapper designed to be depended only on service API | |
|
265 | """ | |
|
266 | repos_path = None | |
|
267 | ||
|
268 | @staticmethod | |
|
269 | def parse_user_related_data(user_data): | |
|
270 | user = AttributeDict() | |
|
271 | user.user_id = user_data['user_id'] | |
|
272 | user.username = user_data['username'] | |
|
273 | user.repo_permissions = user_data['repo_permissions'] | |
|
274 | user.branch_permissions = user_data['branch_permissions'] | |
|
275 | return user | |
|
276 | ||
|
277 | def wrap(self): | |
|
278 | mode = self.mode | |
|
279 | username = self.username | |
|
280 | user_id = self.user_id | |
|
281 | shell = self.shell | |
|
282 | ||
|
283 | scm_detected, scm_repo, scm_mode = self.get_repo_details(mode) | |
|
284 | ||
|
285 | log.debug( | |
|
286 | 'Mode: `%s` User: `name:%s : id:%s` Shell: `%s` SSH Command: `\"%s\"` ' | |
|
287 | 'SCM_DETECTED: `%s` SCM Mode: `%s` SCM Repo: `%s`', | |
|
288 | mode, username, user_id, shell, self.command, | |
|
289 | scm_detected, scm_mode, scm_repo) | |
|
290 | ||
|
291 | log.debug('SSH Connection info %s', self.get_connection_info()) | |
|
292 | ||
|
293 | if shell and self.command is None: | |
|
294 | log.info('Dropping to shell, no command given and shell is allowed') | |
|
295 | os.execl('/bin/bash', '-l') | |
|
296 | exit_code = 1 | |
|
297 | ||
|
298 | elif scm_detected: | |
|
299 | data = call_service_api(self.settings, { | |
|
300 | "method": "service_get_data_for_ssh_wrapper", | |
|
301 | "args": {"user_id": user_id, "repo_name": scm_repo, "key_id": self.key_id} | |
|
302 | }) | |
|
303 | user = self.parse_user_related_data(data) | |
|
304 | if not user: | |
|
305 | log.warning('User with id %s not found', user_id) | |
|
306 | exit_code = -1 | |
|
307 | return exit_code | |
|
308 | self.repos_path = data['repos_path'] | |
|
309 | permissions = user.repo_permissions | |
|
310 | repo_branch_permissions = user.branch_permissions | |
|
311 | try: | |
|
312 | exit_code, is_updated = self.serve( | |
|
313 | scm_detected, scm_repo, scm_mode, user, permissions, | |
|
314 | repo_branch_permissions) | |
|
315 | except Exception: | |
|
316 | log.exception('Error occurred during execution of SshWrapper') | |
|
317 | exit_code = -1 | |
|
318 | ||
|
319 | elif self.command is None and shell is False: | |
|
320 | log.error('No Command given.') | |
|
321 | exit_code = -1 | |
|
322 | ||
|
323 | else: | |
|
324 | log.error('Unhandled Command: "%s" Aborting.', self.command) | |
|
325 | exit_code = -1 | |
|
326 | ||
|
327 | return exit_code | |
|
328 | ||
|
329 | def maybe_translate_repo_uid(self, repo_name): | |
|
330 | _org_name = repo_name | |
|
331 | if _org_name.startswith('_'): | |
|
332 | _org_name = _org_name.split('/', 1)[0] | |
|
333 | ||
|
334 | if repo_name.startswith('_'): | |
|
335 | org_repo_name = repo_name | |
|
336 | log.debug('translating UID repo %s', org_repo_name) | |
|
337 | by_id_match = call_service_api(self.settings, { | |
|
338 | 'method': 'service_get_repo_name_by_id', | |
|
339 | "args": {"repo_id": repo_name} | |
|
340 | }) | |
|
341 | if by_id_match: | |
|
342 | repo_name = by_id_match['repo_name'] | |
|
343 | log.debug('translation of UID repo %s got `%s`', org_repo_name, repo_name) | |
|
344 | ||
|
345 | return repo_name, _org_name | |
|
346 | ||
|
347 | def serve(self, vcs, repo, mode, user, permissions, branch_permissions): | |
|
348 | store = self.repos_path | |
|
349 | ||
|
350 | check_branch_perms = False | |
|
351 | detect_force_push = False | |
|
352 | ||
|
353 | if branch_permissions: | |
|
354 | check_branch_perms = True | |
|
355 | detect_force_push = True | |
|
356 | ||
|
357 | log.debug( | |
|
358 | 'VCS detected:`%s` mode: `%s` repo_name: %s, branch_permission_checks:%s', | |
|
359 | vcs, mode, repo, check_branch_perms) | |
|
360 | ||
|
361 | # detect if we have to check branch permissions | |
|
362 | extras = { | |
|
363 | 'detect_force_push': detect_force_push, | |
|
364 | 'check_branch_perms': check_branch_perms, | |
|
365 | 'config': self.ini_path | |
|
366 | } | |
|
367 | ||
|
368 | match vcs: | |
|
369 | case 'hg': | |
|
370 | server = MercurialServer( | |
|
371 | store=store, ini_path=self.ini_path, | |
|
372 | repo_name=repo, user=user, | |
|
373 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
374 | case 'git': | |
|
375 | server = GitServer( | |
|
376 | store=store, ini_path=self.ini_path, | |
|
377 | repo_name=repo, repo_mode=mode, user=user, | |
|
378 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
379 | case 'svn': | |
|
380 | server = SubversionServer( | |
|
381 | store=store, ini_path=self.ini_path, | |
|
382 | repo_name=None, user=user, | |
|
383 | user_permissions=permissions, settings=self.settings, env=self.env) | |
|
384 | case _: | |
|
385 | raise Exception(f'Unrecognised VCS: {vcs}') | |
|
386 | self.server_impl = server | |
|
387 | return server.run(tunnel_extras=extras) |
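`SshWrapperStandalone` above never touches the database directly: every lookup goes through `call_service_api` with a `{"method": ..., "args": {...}}` payload. A minimal sketch of building and decoding such a payload — only the payload shape is taken from the hunk; the transport, endpoint, and authentication are configured elsewhere and are not shown here:

```python
import json


def build_service_payload(method: str, **args) -> bytes:
    # the standalone wrapper ships every query as {"method": ..., "args": {...}}
    return json.dumps({"method": method, "args": args}).encode('utf-8')


payload = build_service_payload(
    "service_get_data_for_ssh_wrapper",
    user_id=1, repo_name="my-repo", key_id=7)

# a service endpoint would decode it back into method + args
decoded = json.loads(payload)
```

Keeping the wrapper on this narrow request shape is what lets it run without importing the model layer at all.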
@@ -20,26 +20,27 b' import os' | |||
|
20 | 20 | import sys |
|
21 | 21 | import logging |
|
22 | 22 | |
|
23 | from rhodecode.lib.hook |
|
23 | from rhodecode.lib.hook_daemon.base import prepare_callback_daemon | |
|
24 | 24 | from rhodecode.lib.ext_json import sjson as json |
|
25 | 25 | from rhodecode.lib.vcs.conf import settings as vcs_settings |
|
26 | from rhodecode. |
|
26 | from rhodecode.lib.api_utils import call_service_api | |
|
27 | 27 | |
|
28 | 28 | log = logging.getLogger(__name__) |
|
29 | 29 | |
|
30 | 30 | |
|
31 | class VcsServer(object): | |
|
31 | class SshVcsServer(object): | |
|
32 | 32 | repo_user_agent = None # set in child classes |
|
33 | 33 | _path = None # set executable path for hg/git/svn binary |
|
34 | 34 | backend = None # set in child classes |
|
35 | 35 | tunnel = None # subprocess handling tunnel |
|
36 | settings = None # parsed settings module | |
|
36 | 37 | write_perms = ['repository.admin', 'repository.write'] |
|
37 | 38 | read_perms = ['repository.read', 'repository.admin', 'repository.write'] |
|
38 | 39 | |
|
39 | def __init__(self, user, user_permissions, |
|
40 | def __init__(self, user, user_permissions, settings, env): | |
|
40 | 41 | self.user = user |
|
41 | 42 | self.user_permissions = user_permissions |
|
42 | self. |
|
43 | self.settings = settings | |
|
43 | 44 | self.env = env |
|
44 | 45 | self.stdin = sys.stdin |
|
45 | 46 | |
@@ -47,6 +48,7 b' class VcsServer(object):' | |||
|
47 | 48 | self.repo_mode = None |
|
48 | 49 | self.store = '' |
|
49 | 50 | self.ini_path = '' |
|
51 | self.hooks_protocol = None | |
|
50 | 52 | |
|
51 | 53 | def _invalidate_cache(self, repo_name): |
|
52 | 54 | """ |
@@ -54,7 +56,16 b' class VcsServer(object):' | |||
|
54 | 56 | |
|
55 | 57 | :param repo_name: full repo name, also a cache key |
|
56 | 58 | """ |
|
59 | # Todo: Leave only "celery" case after transition. | |
|
60 | match self.hooks_protocol: | |
|
61 | case 'http': | |
|
62 | from rhodecode.model.scm import ScmModel | |
|
57 | 63 | ScmModel().mark_for_invalidation(repo_name) |
|
64 | case 'celery': | |
|
65 | call_service_api(self.settings, { | |
|
66 | "method": "service_mark_for_invalidation", | |
|
67 | "args": {"repo_name": repo_name} | |
|
68 | }) | |
|
58 | 69 | |
|
59 | 70 | def has_write_perm(self): |
|
60 | 71 | permission = self.user_permissions.get(self.repo_name) |
@@ -65,30 +76,31 b' class VcsServer(object):' | |||
|
65 | 76 | |
|
66 | 77 | def _check_permissions(self, action): |
|
67 | 78 | permission = self.user_permissions.get(self.repo_name) |
|
79 | user_info = f'{self.user["user_id"]}:{self.user["username"]}' | |
|
68 | 80 | log.debug('permission for %s on %s are: %s', |
|
69 | |
|
81 | user_info, self.repo_name, permission) | |
|
70 | 82 | |
|
71 | 83 | if not permission: |
|
72 | 84 | log.error('user `%s` permissions to repo:%s are empty. Forbidding access.', |
|
73 | |
|
85 | user_info, self.repo_name) | |
|
74 | 86 | return -2 |
|
75 | 87 | |
|
76 | 88 | if action == 'pull': |
|
77 | 89 | if permission in self.read_perms: |
|
78 | 90 | log.info( |
|
79 | 91 | 'READ Permissions for User "%s" detected to repo "%s"!', |
|
80 | |
|
92 | user_info, self.repo_name) | |
|
81 | 93 | return 0 |
|
82 | 94 | else: |
|
83 | 95 | if permission in self.write_perms: |
|
84 | 96 | log.info( |
|
85 | 97 | 'WRITE, or Higher Permissions for User "%s" detected to repo "%s"!', |
|
86 | |
|
98 | user_info, self.repo_name) | |
|
87 | 99 | return 0 |
|
88 | 100 | |
|
89 | 101 | log.error('Cannot properly fetch or verify user `%s` permissions. ' |
|
90 | 102 | 'Permissions: %s, vcs action: %s', |
|
91 | |
|
103 | user_info, permission, action) | |
|
92 | 104 | return -2 |
|
93 | 105 | |
|
94 | 106 | def update_environment(self, action, extras=None): |
@@ -107,7 +119,7 b' class VcsServer(object):' | |||
|
107 | 119 | 'server_url': None, |
|
108 | 120 | 'user_agent': f'{self.repo_user_agent}/ssh-user-agent', |
|
109 | 121 | 'hooks': ['push', 'pull'], |
|
110 | 'hooks_module': 'rhodecode.lib.hook |
|
122 | 'hooks_module': 'rhodecode.lib.hook_daemon.hook_module', | |
|
111 | 123 | 'is_shadow_repo': False, |
|
112 | 124 | 'detect_force_push': False, |
|
113 | 125 | 'check_branch_perms': False, |
@@ -134,7 +146,8 b' class VcsServer(object):' | |||
|
134 | 146 | if exit_code: |
|
135 | 147 | return exit_code, False |
|
136 | 148 | |
|
137 | req = self.env |
|
149 | req = self.env.get('request') | |
|
150 | if req: | |
|
138 | 151 | server_url = req.host_url + req.script_name |
|
139 | 152 | extras['server_url'] = server_url |
|
140 | 153 | |
@@ -144,12 +157,13 b' class VcsServer(object):' | |||
|
144 | 157 | return exit_code, action == "push" |
|
145 | 158 | |
|
146 | 159 | def run(self, tunnel_extras=None): |
|
160 | self.hooks_protocol = self.settings['vcs.hooks.protocol'] | |
|
147 | 161 | tunnel_extras = tunnel_extras or {} |
|
148 | 162 | extras = {} |
|
149 | 163 | extras.update(tunnel_extras) |
|
150 | 164 | |
|
151 | 165 | callback_daemon, extras = prepare_callback_daemon( |
|
152 | extras, protocol= |
|
166 | extras, protocol=self.hooks_protocol, | |
|
153 | 167 | host=vcs_settings.HOOKS_HOST) |
|
154 | 168 | |
|
155 | 169 | with callback_daemon: |
@@ -21,7 +21,7 b' import logging' | |||
|
21 | 21 | import subprocess |
|
22 | 22 | |
|
23 | 23 | from vcsserver import hooks |
|
24 | from .base import VcsServer | |
|
24 | from .base import SshVcsServer | |
|
25 | 25 | |
|
26 | 26 | log = logging.getLogger(__name__) |
|
27 | 27 | |
@@ -70,19 +70,17 b' class GitTunnelWrapper(object):' | |||
|
70 | 70 | return result |
|
71 | 71 | |
|
72 | 72 | |
|
73 | class GitServer(VcsServer): | |
|
73 | class GitServer(SshVcsServer): | |
|
74 | 74 | backend = 'git' |
|
75 | 75 | repo_user_agent = 'git' |
|
76 | 76 | |
|
77 | def __init__(self, store, ini_path, repo_name, repo_mode, | |
|
78 | |
|
79 | super().\ | |
|
80 | __init__(user, user_permissions, config, env) | |
|
77 | def __init__(self, store, ini_path, repo_name, repo_mode, user, user_permissions, settings, env): | |
|
78 | super().__init__(user, user_permissions, settings, env) | |
|
81 | 79 | |
|
82 | 80 | self.store = store |
|
83 | 81 | self.ini_path = ini_path |
|
84 | 82 | self.repo_name = repo_name |
|
85 | self._path = self.git_path = |
|
83 | self._path = self.git_path = settings['ssh.executable.git'] | |
|
86 | 84 | |
|
87 | 85 | self.repo_mode = repo_mode |
|
88 | 86 | self.tunnel = GitTunnelWrapper(server=self) |
@@ -22,9 +22,10 b' import logging' | |||
|
22 | 22 | import tempfile |
|
23 | 23 | import textwrap |
|
24 | 24 | import collections |
|
25 | from .base import VcsServer | |
|
26 | from rhodecode.model.db import RhodeCodeUi | |
|
27 | from rhodecode.model.settings import VcsSettingsModel | |
|
25 | ||
|
26 | from .base import SshVcsServer | |
|
27 | ||
|
28 | from rhodecode.lib.api_utils import call_service_api | |
|
28 | 29 | |
|
29 | 30 | log = logging.getLogger(__name__) |
|
30 | 31 | |
@@ -56,7 +57,7 b' class MercurialTunnelWrapper(object):' | |||
|
56 | 57 | # cleanup custom hgrc file |
|
57 | 58 | if os.path.isfile(hgrc_custom): |
|
58 | 59 | with open(hgrc_custom, 'wb') as f: |
|
59 | f.write('') | |
|
60 | f.write(b'') | |
|
60 | 61 | log.debug('Cleanup custom hgrc file under %s', hgrc_custom) |
|
61 | 62 | |
|
62 | 63 | # write temp |
@@ -93,21 +94,31 b' class MercurialTunnelWrapper(object):' | |||
|
93 | 94 | self.remove_configs() |
|
94 | 95 | |
|
95 | 96 | |
|
96 | class MercurialServer(VcsServer): | |
|
97 | class MercurialServer(SshVcsServer): | |
|
97 | 98 | backend = 'hg' |
|
98 | 99 | repo_user_agent = 'mercurial' |
|
99 | 100 | cli_flags = ['phases', 'largefiles', 'extensions', 'experimental', 'hooks'] |
|
100 | 101 | |
|
101 | def __init__(self, store, ini_path, repo_name, user, user_permissions, |
|
102 | super().__init__(user, user_permissions, |
|
102 | def __init__(self, store, ini_path, repo_name, user, user_permissions, settings, env): | |
|
103 | super().__init__(user, user_permissions, settings, env) | |
|
103 | 104 | |
|
104 | 105 | self.store = store |
|
105 | 106 | self.ini_path = ini_path |
|
106 | 107 | self.repo_name = repo_name |
|
107 | self._path = self.hg_path = |
|
108 | self._path = self.hg_path = settings['ssh.executable.hg'] | |
|
108 | 109 | self.tunnel = MercurialTunnelWrapper(server=self) |
|
109 | 110 | |
|
110 | 111 | def config_to_hgrc(self, repo_name): |
|
112 | # Todo: once transition is done only call to service api should exist | |
|
113 | if self.hooks_protocol == 'celery': | |
|
114 | data = call_service_api(self.settings, { | |
|
115 | "method": "service_config_to_hgrc", | |
|
116 | "args": {"cli_flags": self.cli_flags, "repo_name": repo_name} | |
|
117 | }) | |
|
118 | return data['flags'] | |
|
119 | else: | |
|
120 | from rhodecode.model.db import RhodeCodeUi | |
|
121 | from rhodecode.model.settings import VcsSettingsModel | |
|
111 | 122 | ui_sections = collections.defaultdict(list) |
|
112 | 123 | ui = VcsSettingsModel(repo=repo_name).get_ui_settings(section=None, key=None) |
|
113 | 124 |
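`config_to_hgrc` above groups flat ui rows into hgrc-style sections with `collections.defaultdict(list)`. A self-contained sketch of that grouping step (the section/key/value tuples here are made-up sample data, not real RhodeCode ui settings):

```python
import collections


def group_ui_settings(ui_settings):
    # group flat (section, key, value) rows into hgrc-style sections
    ui_sections = collections.defaultdict(list)
    for section, key, value in ui_settings:
        ui_sections[section].append((key, value))
    return ui_sections


sections = group_ui_settings([
    ('hooks', 'changegroup.repo_size', 'python:hook_a'),
    ('phases', 'publish', 'True'),
    ('hooks', 'outgoing.pull_logger', 'python:hook_b'),
])
```

In the `'celery'` branch the same flags arrive pre-grouped from `service_config_to_hgrc`, so only the legacy branch still needs this local grouping.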
@@ -25,7 +25,8 b' import tempfile' | |||
|
25 | 25 | from subprocess import Popen, PIPE |
|
26 | 26 | import urllib.parse |
|
27 | 27 | |
|
28 | from .base import VcsServer | |
|
28 | from rhodecode_tools.lib.utils import safe_str | |
|
29 | from .base import SshVcsServer | |
|
29 | 30 | |
|
30 | 31 | log = logging.getLogger(__name__) |
|
31 | 32 | |
@@ -81,7 +82,7 b' class SubversionTunnelWrapper(object):' | |||
|
81 | 82 | |
|
82 | 83 | def sync(self): |
|
83 | 84 | while self.process.poll() is None: |
|
84 | next_byte = self.stdin.read(1) | |
|
85 | next_byte = self.stdin.buffer.read(1) | |
|
85 | 86 | if not next_byte: |
|
86 | 87 | break |
|
87 | 88 | self.process.stdin.write(next_byte) |
@@ -101,19 +102,27 b' class SubversionTunnelWrapper(object):' | |||
|
101 | 102 | |
|
102 | 103 | def patch_first_client_response(self, response, **kwargs): |
|
103 | 104 | self.create_hooks_env() |
|
104 | data = response.copy() | |
|
105 | data.update(kwargs) | |
|
106 | data['url'] = self._svn_string(data['url']) | |
|
107 | data['ra_client'] = self._svn_string(data['ra_client']) | |
|
108 | data['client'] = data['client'] or '' | |
|
109 | buffer_ = ( | |
|
110 | "( {version} ( {capabilities} ) {url}{ra_client}" | |
|
111 | "( {client}) ) ".format(**data)) | |
|
105 | ||
|
106 | version = response['version'] | |
|
107 | capabilities = response['capabilities'] | |
|
108 | client = response['client'] or b'' | |
|
109 | ||
|
110 | url = self._svn_bytes(response['url']) | |
|
111 | ra_client = self._svn_bytes(response['ra_client']) | |
|
112 | ||
|
113 | buffer_ = b"( %b ( %b ) %b%b( %b) ) " % ( | |
|
114 | version, | |
|
115 | capabilities, | |
|
116 | url, | |
|
117 | ra_client, | |
|
118 | client | |
|
119 | ) | |
|
112 | 120 | self.process.stdin.write(buffer_) |
|
113 | 121 | |
|
114 | 122 | def fail(self, message): |
|
115 | |
|
116 | message=self._svn_string(message))) | |
|
123 | fail_msg = b"( failure ( ( 210005 %b 0: 0 ) ) )" % self._svn_bytes(message) | |
|
124 | sys.stdout.buffer.write(fail_msg) | |
|
125 | sys.stdout.flush() | |
|
117 | 126 | self.remove_configs() |
|
118 | 127 | self.process.kill() |
|
119 | 128 | return 1 |
@@ -121,27 +130,28 b' class SubversionTunnelWrapper(object):' | |||
|
121 | 130 | def interrupt(self, signum, frame): |
|
122 | 131 | self.fail("Exited by timeout") |
|
123 | 132 | |
|
124 | def _svn_s |
|
125 | if not |
|
126 | return '' | |
|
127 | return f'{len(str_)}:{str_} ' | |
|
133 | def _svn_bytes(self, bytes_: bytes) -> bytes: | |
|
134 | if not bytes_: | |
|
135 | return b'' | |
|
136 | ||
|
137 | return f'{len(bytes_)}:'.encode() + bytes_ + b' ' | |
|
128 | 138 | |
|
129 | 139 | def _read_first_client_response(self): |
|
130 | buffer_ = "" | |
|
140 | buffer_ = b"" | |
|
131 | 141 | brackets_stack = [] |
|
132 | 142 | while True: |
|
133 | next_byte = self.stdin.read(1) | |
|
143 | next_byte = self.stdin.buffer.read(1) | |
|
134 | 144 | buffer_ += next_byte |
|
135 | if next_byte == "(": | |
|
145 | if next_byte == b"(": | |
|
136 | 146 | brackets_stack.append(next_byte) |
|
137 | elif next_byte == ")": | |
|
147 | elif next_byte == b")": | |
|
138 | 148 | brackets_stack.pop() |
|
139 | elif next_byte == " " and not brackets_stack: | |
|
149 | elif next_byte == b" " and not brackets_stack: | |
|
140 | 150 | break |
|
141 | 151 | |
|
142 | 152 | return buffer_ |
|
143 | 153 | |
|
144 | def _parse_first_client_response(self, buffer_): | |
|
154 | def _parse_first_client_response(self, buffer_: bytes): | |
|
145 | 155 | """ |
|
146 | 156 | According to the Subversion RA protocol, the first request |
|
147 | 157 | should look like: |
@@ -151,16 +161,20 b' class SubversionTunnelWrapper(object):' | |||
|
151 | 161 | |
|
152 | 162 | Please check https://svn.apache.org/repos/asf/subversion/trunk/subversion/libsvn_ra_svn/protocol |
|
153 | 163 | """ |
|
154 | version_re = r'(?P<version>\d+)' | |
|
155 | capabilities_re = r'\(\s(?P<capabilities>[\w\d\-\ ]+)\s\)' | |
|
156 | url_re = r'\d+\:(?P<url>[\W\w]+)' | |
|
157 | ra_client_re = r'(\d+\:(?P<ra_client>[\W\w]+)\s)' | |
|
158 | client_re = r'(\d+\:(?P<client>[\W\w]+)\s)*' | |
|
164 | version_re = br'(?P<version>\d+)' | |
|
165 | capabilities_re = br'\(\s(?P<capabilities>[\w\d\-\ ]+)\s\)' | |
|
166 | url_re = br'\d+\:(?P<url>[\W\w]+)' | |
|
167 | ra_client_re = br'(\d+\:(?P<ra_client>[\W\w]+)\s)' | |
|
168 | client_re = br'(\d+\:(?P<client>[\W\w]+)\s)*' | |
|
159 | 169 | regex = re.compile( |
|
160 | r'^\(\s{version}\s{capabilities}\s{url}\s{ra_client}' | |
|
161 | r'\(\s |
|
162 | version=version_re, capabilities=capabilities_re, | |
|
163 | url=url_re, ra_client=ra_client_re, client=client_re)) | |
|
170 | br'^\(\s%b\s%b\s%b\s%b' | |
|
171 | br'\(\s%b\)\s\)\s*$' % ( | |
|
172 | version_re, | |
|
173 | capabilities_re, | |
|
174 | url_re, | |
|
175 | ra_client_re, | |
|
176 | client_re) | |
|
177 | ) | |
|
164 | 178 | matcher = regex.match(buffer_) |
|
165 | 179 | |
|
166 | 180 | return matcher.groupdict() if matcher else None |
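The switch to bytes I/O in this hunk also rewrites the protocol string framing: `_svn_bytes` emits svnserve's length-prefixed form `<length>:<bytes> `, with an empty value encoded as nothing at all. A standalone sketch of that framing helper, mirroring the diff:

```python
def svn_bytes(data: bytes) -> bytes:
    # svn ra_svn framing: "<length>:<bytes> "; empty values are omitted
    if not data:
        return b''
    return f'{len(data)}:'.encode() + data + b' '


frame = svn_bytes(b'svn+ssh://host/repo')
```

The length prefix counts raw bytes, which is why the helper must operate on `bytes` rather than decoded strings.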
@@ -198,11 +212,11 b' class SubversionTunnelWrapper(object):' | |||
|
198 | 212 | |
|
199 | 213 | first_response = self.get_first_client_response() |
|
200 | 214 | if not first_response: |
|
201 | return self.fail("Repository name cannot be extracted") | |
|
215 | return self.fail(b"Repository name cannot be extracted") | |
|
202 | 216 | |
|
203 | 217 | url_parts = urllib.parse.urlparse(first_response['url']) |
|
204 | 218 | |
|
205 | self.server.repo_name = self._match_repo_name(url_parts.path.strip('/')) | |
|
219 | self.server.repo_name = self._match_repo_name(safe_str(url_parts.path).strip('/')) | |
|
206 | 220 | |
|
207 | 221 | exit_code = self.server._check_permissions(action) |
|
208 | 222 | if exit_code: |
@@ -218,20 +232,18 b' class SubversionTunnelWrapper(object):' | |||
|
218 | 232 | return self.return_code |
|
219 | 233 | |
|
220 | 234 | |
|
221 | class SubversionServer(VcsServer): | |
|
235 | class SubversionServer(SshVcsServer): | |
|
222 | 236 | backend = 'svn' |
|
223 | 237 | repo_user_agent = 'svn' |
|
224 | 238 | |
|
225 | def __init__(self, store, ini_path, repo_name, | |
|
226 | |
|
227 | super()\ | |
|
228 | .__init__(user, user_permissions, config, env) | |
|
239 | def __init__(self, store, ini_path, repo_name, user, user_permissions, settings, env): | |
|
240 | super().__init__(user, user_permissions, settings, env) | |
|
229 | 241 | self.store = store |
|
230 | 242 | self.ini_path = ini_path |
|
231 | 243 | # NOTE(dan): repo_name at this point is empty, |
|
232 | 244 | # this is set later in .run() based from parsed input stream |
|
233 | 245 | self.repo_name = repo_name |
|
234 | self._path = self.svn_path = |
|
246 | self._path = self.svn_path = settings['ssh.executable.svn'] | |
|
235 | 247 | |
|
236 | 248 | self.tunnel = SubversionTunnelWrapper(server=self) |
|
237 | 249 | |
@@ -244,7 +256,8 b' class SubversionServer(VcsServer):' | |||
|
244 | 256 | # if exit_code: |
|
245 | 257 | # return exit_code, False |
|
246 | 258 | |
|
247 | req = self.env |
|
259 | req = self.env.get('request') | |
|
260 | if req: | |
|
248 | 261 | server_url = req.host_url + req.script_name |
|
249 | 262 | extras['server_url'] = server_url |
|
250 | 263 |
@@ -23,29 +23,14 b' import logging' | |||
|
23 | 23 | |
|
24 | 24 | import click |
|
25 | 25 | |
|
26 | from pyramid.paster import setup_logging | |
|
27 | ||
|
28 | 26 | from rhodecode.lib.pyramid_utils import bootstrap |
|
29 | 27 | from rhodecode.lib.statsd_client import StatsdClient |
|
30 | 28 | from .backends import SshWrapper |
|
29 | from .utils import setup_custom_logging | |
|
31 | 30 | |
|
32 | 31 | log = logging.getLogger(__name__) |
|
33 | 32 | |
|
34 | 33 | |
|
35 | def setup_custom_logging(ini_path, debug): | |
|
36 | if debug: | |
|
37 | # enabled rhodecode.ini controlled logging setup | |
|
38 | setup_logging(ini_path) | |
|
39 | else: | |
|
40 | # configure logging in a mode that doesn't print anything. | |
|
41 | # in case of regularly configured logging it gets printed out back | |
|
42 | # to the client doing an SSH command. | |
|
43 | logger = logging.getLogger('') | |
|
44 | null = logging.NullHandler() | |
|
45 | # add the handler to the root logger | |
|
46 | logger.handlers = [null] | |
|
47 | ||
|
48 | ||
|
49 | 34 | @click.command() |
|
50 | 35 | @click.argument('ini_path', type=click.Path(exists=True)) |
|
51 | 36 | @click.option( |
@@ -69,11 +54,12 b' def main(ini_path, mode, user, user_id, ' | |||
|
69 | 54 | connection_info = os.environ.get('SSH_CONNECTION', '') |
|
70 | 55 | time_start = time.time() |
|
71 | 56 | with bootstrap(ini_path, env={'RC_CMD_SSH_WRAPPER': '1'}) as env: |
|
57 | settings = env['registry'].settings | |
|
72 | 58 | statsd = StatsdClient.statsd |
|
73 | 59 | try: |
|
74 | 60 | ssh_wrapper = SshWrapper( |
|
75 | 61 | command, connection_info, mode, |
|
76 | user, user_id, key_id, shell, ini_path, env) | |
|
62 | user, user_id, key_id, shell, ini_path, settings, env) | |
|
77 | 63 | except Exception: |
|
78 | 64 | log.exception('Failed to execute SshWrapper') |
|
79 | 65 | sys.exit(-5) |
@@ -20,7 +20,7 b' import os' | |||
|
20 | 20 | import pytest |
|
21 | 21 | import configparser |
|
22 | 22 | |
|
23 | from rhodecode.apps.ssh_support.lib.ssh_wrapper import SshWrapper | |
|
23 | from rhodecode.apps.ssh_support.lib.ssh_wrapper_v1 import SshWrapper | |
|
24 | 24 | from rhodecode.lib.utils2 import AttributeDict |
|
25 | 25 | |
|
26 | 26 | |
@@ -52,7 +52,10 b' def dummy_env():' | |||
|
52 | 52 | |
|
53 | 53 | |
|
54 | 54 | def plain_dummy_user(): |
|
55 | return AttributeDict( |
|
55 | return AttributeDict( | |
|
56 | user_id=1, | |
|
57 | username='test_user' | |
|
58 | ) | |
|
56 | 59 | |
|
57 | 60 | |
|
58 | 61 | @pytest.fixture() |
@@ -65,4 +68,4 b' def ssh_wrapper(app, dummy_conf_file, du' | |||
|
65 | 68 | conn_info = '127.0.0.1 22 10.0.0.1 443' |
|
66 | 69 | return SshWrapper( |
|
67 | 70 | 'random command', conn_info, 'auto', 'admin', '1', key_id='1', |
|
68 | shell=False, ini_path=dummy_conf_file, env=dummy_env) |
71 | shell=False, ini_path=dummy_conf_file, settings={}, env=dummy_env) |
@@ -25,6 +25,7 b' from rhodecode.apps.ssh_support.lib.back' | |||
|
25 | 25 | from rhodecode.apps.ssh_support.tests.conftest import plain_dummy_env, plain_dummy_user |
|
26 | 26 | from rhodecode.lib.ext_json import json |
|
27 | 27 | |
|
28 | ||
|
28 | 29 | class GitServerCreator(object): |
|
29 | 30 | root = '/tmp/repo/path/' |
|
30 | 31 | git_path = '/usr/local/bin/git' |
@@ -39,10 +40,7 b' class GitServerCreator(object):' | |||
|
39 | 40 | user = plain_dummy_user() |
|
40 | 41 | |
|
41 | 42 | def __init__(self): |
|
42 | def config_get(part, key): | |
|
43 | return self.config_data.get(part, {}).get(key) | |
|
44 | self.config_mock = mock.Mock() | |
|
45 | self.config_mock.get = mock.Mock(side_effect=config_get) | |
|
43 | pass | |
|
46 | 44 | |
|
47 | 45 | def create(self, **kwargs): |
|
48 | 46 | parameters = { |
@@ -54,7 +52,7 b' class GitServerCreator(object):' | |||
|
54 | 52 | 'user_permissions': { |
|
55 | 53 | self.repo_name: 'repository.admin' |
|
56 | 54 | }, |
|
57 | ' |
|
55 | 'settings': self.config_data['app:main'], | |
|
58 | 56 | 'env': plain_dummy_env() |
|
59 | 57 | } |
|
60 | 58 | parameters.update(kwargs) |
@@ -142,7 +140,7 b' class TestGitServer(object):' | |||
|
142 | 140 | 'server_url': None, |
|
143 | 141 | 'hooks': ['push', 'pull'], |
|
144 | 142 | 'is_shadow_repo': False, |
|
145 | 'hooks_module': 'rhodecode.lib.hook |
|
143 | 'hooks_module': 'rhodecode.lib.hook_daemon.hook_module', | |
|
146 | 144 | 'check_branch_perms': False, |
|
147 | 145 | 'detect_force_push': False, |
|
148 | 146 | 'user_agent': u'git/ssh-user-agent', |
@@ -38,10 +38,7 b' class MercurialServerCreator(object):' | |||
|
38 | 38 | user = plain_dummy_user() |
|
39 | 39 | |
|
40 | 40 | def __init__(self): |
|
41 | def config_get(part, key): | |
|
42 | return self.config_data.get(part, {}).get(key) | |
|
43 | self.config_mock = mock.Mock() | |
|
44 | self.config_mock.get = mock.Mock(side_effect=config_get) | |
|
41 | pass | |
|
45 | 42 | |
|
46 | 43 | def create(self, **kwargs): |
|
47 | 44 | parameters = { |
@@ -52,7 +49,7 b' class MercurialServerCreator(object):' | |||
|
52 | 49 | 'user_permissions': { |
|
53 | 50 | 'test_hg': 'repository.admin' |
|
54 | 51 | }, |
|
55 | ' |
|
52 | 'settings': self.config_data['app:main'], | |
|
56 | 53 | 'env': plain_dummy_env() |
|
57 | 54 | } |
|
58 | 55 | parameters.update(kwargs) |
@@ -36,10 +36,7 b' class SubversionServerCreator(object):' | |||
|
36 | 36 | user = plain_dummy_user() |
|
37 | 37 | |
|
38 | 38 | def __init__(self): |
|
39 | def config_get(part, key): | |
|
40 | return self.config_data.get(part, {}).get(key) | |
|
41 | self.config_mock = mock.Mock() | |
|
42 | self.config_mock.get = mock.Mock(side_effect=config_get) | |
|
39 | pass | |
|
43 | 40 | |
|
44 | 41 | def create(self, **kwargs): |
|
45 | 42 | parameters = { |
@@ -50,7 +47,7 b' class SubversionServerCreator(object):' | |||
|
50 | 47 | 'user_permissions': { |
|
51 | 48 | self.repo_name: 'repository.admin' |
|
52 | 49 | }, |
|
53 | ' | |
|
50 | 'settings': self.config_data['app:main'], | |
|
54 | 51 | 'env': plain_dummy_env() |
|
55 | 52 | } |
|
56 | 53 | |
@@ -65,6 +62,7 b' def svn_server(app):' | |||
|
65 | 62 | |
|
66 | 63 | |
|
67 | 64 | class TestSubversionServer(object): |
|
65 | ||
|
68 | 66 | def test_command(self, svn_server): |
|
69 | 67 | server = svn_server.create() |
|
70 | 68 | expected_command = [ |
@@ -28,10 +28,6 b' class TestSSHWrapper(object):' | |||
|
28 | 28 | permissions={}, branch_permissions={}) |
|
29 | 29 | assert str(exc_info.value) == 'Unrecognised VCS: microsoft-tfs' |
|
30 | 30 | |
|
31 | def test_parse_config(self, ssh_wrapper): | |
|
32 | config = ssh_wrapper.parse_config(ssh_wrapper.ini_path) | |
|
33 | assert config | |
|
34 | ||
|
35 | 31 | def test_get_connection_info(self, ssh_wrapper): |
|
36 | 32 | conn_info = ssh_wrapper.get_connection_info() |
|
37 | 33 | assert {'client_ip': '127.0.0.1', |
@@ -22,7 +22,7 b' import os' | |||
|
22 | 22 | from pyramid.renderers import render |
|
23 | 23 | |
|
24 | 24 | from rhodecode.events import trigger |
|
25 | from rhodecode.lib.utils import get_rhodecode_realm, get_rhodecode_ | |
|
25 | from rhodecode.lib.utils import get_rhodecode_realm, get_rhodecode_repo_store_path | |
|
26 | 26 | from rhodecode.lib.utils2 import str2bool |
|
27 | 27 | from rhodecode.model.db import RepoGroup |
|
28 | 28 | |
@@ -38,7 +38,7 b' def write_mod_dav_svn_config(settings):' | |||
|
38 | 38 | file_path = settings[config_keys.config_file_path] |
|
39 | 39 | config = _render_mod_dav_svn_config( |
|
40 | 40 | use_ssl=use_ssl, |
|
41 | parent_path_root=get_rhodecode_ | |
|
41 | parent_path_root=get_rhodecode_repo_store_path(), | |
|
42 | 42 | list_parent_path=settings[config_keys.list_parent_path], |
|
43 | 43 | location_root=settings[config_keys.location_root], |
|
44 | 44 | repo_groups=RepoGroup.get_all_repo_groups(), |
@@ -389,11 +389,7 b' class RhodeCodeAuthPluginBase(object):' | |||
|
389 | 389 | log.debug( |
|
390 | 390 | 'Trying to fetch user `%s` from RhodeCode database', username) |
|
391 | 391 | if username: |
|
392 | user = User.get_by_username(username) | |
|
393 | if not user: | |
|
394 | log.debug('User not found, fallback to fetch user in ' | |
|
395 | 'case insensitive mode') | |
|
396 | user = User.get_by_username(username, case_insensitive=True) | |
|
392 | user = User.get_by_username_or_primary_email(username) | |
|
397 | 393 | else: |
|
398 | 394 | log.debug('provided username:`%s` is empty skipping...', username) |
|
399 | 395 | if not user: |
@@ -31,7 +31,7 b' import urllib.parse' | |||
|
31 | 31 | from rhodecode.translation import _ |
|
32 | 32 | from rhodecode.authentication.base import ( |
|
33 | 33 | RhodeCodeExternalAuthPlugin, hybrid_property) |
|
34 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
|
34 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase, TwoFactorAuthnPluginSettingsSchemaMixin | |
|
35 | 35 | from rhodecode.authentication.routes import AuthnPluginResourceBase |
|
36 | 36 | from rhodecode.lib.colander_utils import strip_whitespace |
|
37 | 37 | from rhodecode.lib.ext_json import json, formatted_json |
@@ -53,7 +53,7 b' class CrowdAuthnResource(AuthnPluginReso' | |||
|
53 | 53 | pass |
|
54 | 54 | |
|
55 | 55 | |
|
56 | class CrowdSettingsSchema(AuthnPluginSettingsSchemaBase): | |
|
56 | class CrowdSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin, AuthnPluginSettingsSchemaBase): | |
|
57 | 57 | host = colander.SchemaNode( |
|
58 | 58 | colander.String(), |
|
59 | 59 | default='127.0.0.1', |
@@ -33,7 +33,7 b' import urllib.error' | |||
|
33 | 33 | from rhodecode.translation import _ |
|
34 | 34 | from rhodecode.authentication.base import ( |
|
35 | 35 | RhodeCodeExternalAuthPlugin, hybrid_property) |
|
36 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
|
36 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase, TwoFactorAuthnPluginSettingsSchemaMixin | |
|
37 | 37 | from rhodecode.authentication.routes import AuthnPluginResourceBase |
|
38 | 38 | from rhodecode.lib.colander_utils import strip_whitespace |
|
39 | 39 | from rhodecode.model.db import User |
@@ -55,7 +55,7 b' class JasigCasAuthnResource(AuthnPluginR' | |||
|
55 | 55 | pass |
|
56 | 56 | |
|
57 | 57 | |
|
58 | class JasigCasSettingsSchema(AuthnPluginSettingsSchemaBase): | |
|
58 | class JasigCasSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin, AuthnPluginSettingsSchemaBase): | |
|
59 | 59 | service_url = colander.SchemaNode( |
|
60 | 60 | colander.String(), |
|
61 | 61 | default='https://domain.com/cas/v1/tickets', |
@@ -27,7 +27,7 b' import colander' | |||
|
27 | 27 | from rhodecode.translation import _ |
|
28 | 28 | from rhodecode.authentication.base import ( |
|
29 | 29 | RhodeCodeExternalAuthPlugin, AuthLdapBase, hybrid_property) |
|
30 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
|
30 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase, TwoFactorAuthnPluginSettingsSchemaMixin | |
|
31 | 31 | from rhodecode.authentication.routes import AuthnPluginResourceBase |
|
32 | 32 | from rhodecode.lib.colander_utils import strip_whitespace |
|
33 | 33 | from rhodecode.lib.exceptions import ( |
@@ -245,7 +245,7 b' class AuthLdap(AuthLdapBase):' | |||
|
245 | 245 | return dn, user_attrs |
|
246 | 246 | |
|
247 | 247 | |
|
248 | class LdapSettingsSchema(AuthnPluginSettingsSchemaBase): | |
|
248 | class LdapSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin, AuthnPluginSettingsSchemaBase): | |
|
249 | 249 | tls_kind_choices = ['PLAIN', 'LDAPS', 'START_TLS'] |
|
250 | 250 | tls_reqcert_choices = ['NEVER', 'ALLOW', 'TRY', 'DEMAND', 'HARD'] |
|
251 | 251 | search_scope_choices = ['BASE', 'ONELEVEL', 'SUBTREE'] |
@@ -31,7 +31,7 b' import socket' | |||
|
31 | 31 | from rhodecode.translation import _ |
|
32 | 32 | from rhodecode.authentication.base import ( |
|
33 | 33 | RhodeCodeExternalAuthPlugin, hybrid_property) |
|
34 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
|
34 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase, TwoFactorAuthnPluginSettingsSchemaMixin | |
|
35 | 35 | from rhodecode.authentication.routes import AuthnPluginResourceBase |
|
36 | 36 | from rhodecode.lib.colander_utils import strip_whitespace |
|
37 | 37 | |
@@ -51,7 +51,7 b' class PamAuthnResource(AuthnPluginResour' | |||
|
51 | 51 | pass |
|
52 | 52 | |
|
53 | 53 | |
|
54 | class PamSettingsSchema(AuthnPluginSettingsSchemaBase): | |
|
54 | class PamSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin, AuthnPluginSettingsSchemaBase): | |
|
55 | 55 | service = colander.SchemaNode( |
|
56 | 56 | colander.String(), |
|
57 | 57 | default='login', |
@@ -27,7 +27,7 b' import colander' | |||
|
27 | 27 | from rhodecode.translation import _ |
|
28 | 28 | from rhodecode.lib.utils2 import safe_bytes |
|
29 | 29 | from rhodecode.model.db import User |
|
30 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase | |
|
30 | from rhodecode.authentication.schema import AuthnPluginSettingsSchemaBase, TwoFactorAuthnPluginSettingsSchemaMixin | |
|
31 | 31 | from rhodecode.authentication.base import ( |
|
32 | 32 | RhodeCodeAuthPluginBase, hybrid_property, HTTP_TYPE, VCS_TYPE) |
|
33 | 33 | from rhodecode.authentication.routes import AuthnPluginResourceBase |
@@ -169,7 +169,7 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP' | |||
|
169 | 169 | extra={"action": "user_auth_ok", "auth_module": "auth_rhodecode_anon", "username": userobj.username}) |
|
170 | 170 | return user_attrs |
|
171 | 171 | |
|
172 | elif userobj.username == username and password_match: | |
|
172 | elif (userobj.username == username or userobj.email == username) and password_match: | |
|
173 | 173 | log.info('user `%s` authenticated correctly', userobj.username, |
|
174 | 174 | extra={"action": "user_auth_ok", "auth_module": "auth_rhodecode", "username": userobj.username}) |
|
175 | 175 | return user_attrs |
@@ -182,8 +182,7 b' class RhodeCodeAuthPlugin(RhodeCodeAuthP' | |||
|
182 | 182 | return None |
|
183 | 183 | |
|
184 | 184 | |
|
185 | class RhodeCodeSettingsSchema(AuthnPluginSettingsSchemaBase): | |
|
186 | ||
|
185 | class RhodeCodeSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin, AuthnPluginSettingsSchemaBase): | |
|
187 | 186 | auth_restriction_choices = [ |
|
188 | 187 | (RhodeCodeAuthPlugin.AUTH_RESTRICTION_NONE, 'All users'), |
|
189 | 188 | (RhodeCodeAuthPlugin.AUTH_RESTRICTION_SUPER_ADMIN, 'Super admins only'), |
@@ -48,3 +48,17 b' class AuthnPluginSettingsSchemaBase(cola' | |||
|
48 | 48 | validator=colander.Range(min=0, max=None), |
|
49 | 49 | widget='int', |
|
50 | 50 | ) |
|
51 | ||
|
52 | ||
|
53 | class TwoFactorAuthnPluginSettingsSchemaMixin(colander.MappingSchema): | |
|
54 | """ | |
|
55 | Mixin for extending plugins with two-factor authentication option. | |
|
56 | """ | |
|
57 | global_2fa = colander.SchemaNode( | |
|
58 | colander.Bool(), | |
|
59 | default=False, | |
|
60 | description=_('Force all users to use two factor authentication with this plugin.'), | |
|
61 | missing=False, | |
|
62 | title=_('enforce 2FA for users'), | |
|
63 | widget='bool', | |
|
64 | ) |
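The mixin above contributes a single `global_2fa` boolean node to each plugin schema that lists it before `AuthnPluginSettingsSchemaBase` in its bases. A rough stdlib-only sketch of that composition pattern (plain dicts stand in for colander SchemaNodes; all names here are illustrative, not the real colander API):

```python
# Toy sketch of schema composition via a mixin: fields declared on the mixin
# are merged with the base schema's fields through the MRO.

class AuthnPluginSettingsSchemaBase:
    fields = {"auth_cache_ttl": 0}

class TwoFactorAuthnPluginSettingsSchemaMixin:
    fields = {"global_2fa": False}

class LdapSettingsSchema(TwoFactorAuthnPluginSettingsSchemaMixin,
                         AuthnPluginSettingsSchemaBase):
    @classmethod
    def all_fields(cls):
        merged = {}
        # walk the MRO base-first so more specific classes override
        for klass in reversed(cls.__mro__):
            merged.update(getattr(klass, "fields", {}))
        return merged

print(sorted(LdapSettingsSchema.all_fields()))  # ['auth_cache_ttl', 'global_2fa']
```

In the real code colander collects SchemaNodes declaratively the same way, which is why one mixin line per plugin is enough to surface the 2FA toggle in every settings form.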
@@ -46,8 +46,7 b' def load_pyramid_environment(global_conf' | |||
|
46 | 46 | # If this is a test run we prepare the test environment like |
|
47 | 47 | # creating a test database, test search index and test repositories. |
|
48 | 48 | # This has to be done before the database connection is initialized. |
|
49 | if | |
|
50 | rhodecode.is_test = True | |
|
49 | if rhodecode.is_test: | |
|
51 | 50 | rhodecode.disable_error_handler = True |
|
52 | 51 | from rhodecode import authentication |
|
53 | 52 | authentication.plugin_default_auth_ttl = 0 |
@@ -81,7 +80,6 b' def load_pyramid_environment(global_conf' | |||
|
81 | 80 | rhodecode.PYRAMID_SETTINGS = settings_merged |
|
82 | 81 | rhodecode.CONFIG = settings_merged |
|
83 | 82 | rhodecode.CONFIG['default_user_id'] = utils.get_default_user_id() |
|
84 | rhodecode.CONFIG['default_base_path'] = utils.get_default_base_path() | |
|
85 | 83 | |
|
86 | 84 | if vcs_server_enabled: |
|
87 | 85 | connect_vcs(vcs_server_uri, utils.get_vcs_server_protocol(settings)) |
@@ -19,14 +19,13 b'' | |||
|
19 | 19 | import os |
|
20 | 20 | import sys |
|
21 | 21 | import collections |
|
22 | import tempfile | |
|
22 | ||
|
23 | 23 | import time |
|
24 | 24 | import logging.config |
|
25 | 25 | |
|
26 | 26 | from paste.gzipper import make_gzip_middleware |
|
27 | 27 | import pyramid.events |
|
28 | 28 | from pyramid.wsgi import wsgiapp |
|
29 | from pyramid.authorization import ACLAuthorizationPolicy | |
|
30 | 29 | from pyramid.config import Configurator |
|
31 | 30 | from pyramid.settings import asbool, aslist |
|
32 | 31 | from pyramid.httpexceptions import ( |
@@ -35,11 +34,11 b' from pyramid.renderers import render_to_' | |||
|
35 | 34 | |
|
36 | 35 | from rhodecode.model import meta |
|
37 | 36 | from rhodecode.config import patches |
|
38 | from rhodecode.config import utils as config_utils | |
|
39 | from rhodecode.config.settings_maker import SettingsMaker | |
|
37 | ||
|
40 | 38 | from rhodecode.config.environment import load_pyramid_environment |
|
41 | 39 | |
|
42 | 40 | import rhodecode.events |
|
41 | from rhodecode.config.config_maker import sanitize_settings_and_apply_defaults | |
|
43 | 42 | from rhodecode.lib.middleware.vcs import VCSMiddleware |
|
44 | 43 | from rhodecode.lib.request import Request |
|
45 | 44 | from rhodecode.lib.vcs import VCSCommunicationError |
@@ -327,7 +326,7 b' def includeme(config, auth_resources=Non' | |||
|
327 | 326 | config.include('pyramid_mako') |
|
328 | 327 | config.include('rhodecode.lib.rc_beaker') |
|
329 | 328 | config.include('rhodecode.lib.rc_cache') |
|
330 | config.include('rhodecode.lib. | |
|
329 | config.include('rhodecode.lib.archive_cache') | |
|
331 | 330 | |
|
332 | 331 | config.include('rhodecode.apps._base.navigation') |
|
333 | 332 | config.include('rhodecode.apps._base.subscribers') |
@@ -465,173 +464,3 b' def wrap_app_in_wsgi_middlewares(pyramid' | |||
|
465 | 464 | log.debug('Request processing finalized: %.4fs', total) |
|
466 | 465 | |
|
467 | 466 | return pyramid_app_with_cleanup |
|
468 | ||
|
469 | ||
|
470 | def sanitize_settings_and_apply_defaults(global_config, settings): | |
|
471 | """ | |
|
472 | Applies settings defaults and does all type conversion. | |
|
473 | ||
|
474 | We would move all settings parsing and preparation into this place, so that | |
|
475 | we have only one place left which deals with this part. The remaining parts | |
|
476 | of the application would start to rely fully on well prepared settings. | |
|
477 | ||
|
478 | This piece would later be split up per topic to avoid a big fat monster | |
|
479 | function. | |
|
480 | """ | |
|
481 | jn = os.path.join | |
|
482 | ||
|
483 | global_settings_maker = SettingsMaker(global_config) | |
|
484 | global_settings_maker.make_setting('debug', default=False, parser='bool') | |
|
485 | debug_enabled = asbool(global_config.get('debug')) | |
|
486 | ||
|
487 | settings_maker = SettingsMaker(settings) | |
|
488 | ||
|
489 | settings_maker.make_setting( | |
|
490 | 'logging.autoconfigure', | |
|
491 | default=False, | |
|
492 | parser='bool') | |
|
493 | ||
|
494 | logging_conf = jn(os.path.dirname(global_config.get('__file__')), 'logging.ini') | |
|
495 | settings_maker.enable_logging(logging_conf, level='INFO' if debug_enabled else 'DEBUG') | |
|
496 | ||
|
497 | # Default includes, possible to change as a user | |
|
498 | pyramid_includes = settings_maker.make_setting('pyramid.includes', [], parser='list:newline') | |
|
499 | log.debug( | |
|
500 | "Using the following pyramid.includes: %s", | |
|
501 | pyramid_includes) | |
|
502 | ||
|
503 | settings_maker.make_setting('rhodecode.edition', 'Community Edition') | |
|
504 | settings_maker.make_setting('rhodecode.edition_id', 'CE') | |
|
505 | ||
|
506 | if 'mako.default_filters' not in settings: | |
|
507 | # set custom default filters if we don't have it defined | |
|
508 | settings['mako.imports'] = 'from rhodecode.lib.base import h_filter' | |
|
509 | settings['mako.default_filters'] = 'h_filter' | |
|
510 | ||
|
511 | if 'mako.directories' not in settings: | |
|
512 | mako_directories = settings.setdefault('mako.directories', [ | |
|
513 | # Base templates of the original application | |
|
514 | 'rhodecode:templates', | |
|
515 | ]) | |
|
516 | log.debug( | |
|
517 | "Using the following Mako template directories: %s", | |
|
518 | mako_directories) | |
|
519 | ||
|
520 | # NOTE(marcink): fix redis requirement for schema of connection since 3.X | |
|
521 | if 'beaker.session.type' in settings and settings['beaker.session.type'] == 'ext:redis': | |
|
522 | raw_url = settings['beaker.session.url'] | |
|
523 | if not raw_url.startswith(('redis://', 'rediss://', 'unix://')): | |
|
524 | settings['beaker.session.url'] = 'redis://' + raw_url | |
|
525 | ||
|
526 | settings_maker.make_setting('__file__', global_config.get('__file__')) | |
|
527 | ||
|
528 | # TODO: johbo: Re-think this, usually the call to config.include | |
|
529 | # should allow to pass in a prefix. | |
|
530 | settings_maker.make_setting('rhodecode.api.url', '/_admin/api') | |
|
531 | ||
|
532 | # Sanitize generic settings. | |
|
533 | settings_maker.make_setting('default_encoding', 'UTF-8', parser='list') | |
|
534 | settings_maker.make_setting('is_test', False, parser='bool') | |
|
535 | settings_maker.make_setting('gzip_responses', False, parser='bool') | |
|
536 | ||
|
537 | # statsd | |
|
538 | settings_maker.make_setting('statsd.enabled', False, parser='bool') | |
|
539 | settings_maker.make_setting('statsd.statsd_host', 'statsd-exporter', parser='string') | |
|
540 | settings_maker.make_setting('statsd.statsd_port', 9125, parser='int') | |
|
541 | settings_maker.make_setting('statsd.statsd_prefix', '') | |
|
542 | settings_maker.make_setting('statsd.statsd_ipv6', False, parser='bool') | |
|
543 | ||
|
544 | settings_maker.make_setting('vcs.svn.compatible_version', '') | |
|
545 | settings_maker.make_setting('vcs.hooks.protocol', 'http') | |
|
546 | settings_maker.make_setting('vcs.hooks.host', '*') | |
|
547 | settings_maker.make_setting('vcs.scm_app_implementation', 'http') | |
|
548 | settings_maker.make_setting('vcs.server', '') | |
|
549 | settings_maker.make_setting('vcs.server.protocol', 'http') | |
|
550 | settings_maker.make_setting('vcs.server.enable', 'true', parser='bool') | |
|
551 | settings_maker.make_setting('startup.import_repos', 'false', parser='bool') | |
|
552 | settings_maker.make_setting('vcs.hooks.direct_calls', 'false', parser='bool') | |
|
553 | settings_maker.make_setting('vcs.start_server', 'false', parser='bool') | |
|
554 | settings_maker.make_setting('vcs.backends', 'hg, git, svn', parser='list') | |
|
555 | settings_maker.make_setting('vcs.connection_timeout', 3600, parser='int') | |
|
556 | ||
|
557 | settings_maker.make_setting('vcs.methods.cache', True, parser='bool') | |
|
558 | ||
|
559 | # Support legacy values of vcs.scm_app_implementation. Legacy | |
|
560 | # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http', or | |
|
561 | # disabled since 4.13 'vcsserver.scm_app' which is now mapped to 'http'. | |
|
562 | scm_app_impl = settings['vcs.scm_app_implementation'] | |
|
563 | if scm_app_impl in ['rhodecode.lib.middleware.utils.scm_app_http', 'vcsserver.scm_app']: | |
|
564 | settings['vcs.scm_app_implementation'] = 'http' | |
|
565 | ||
|
566 | settings_maker.make_setting('appenlight', False, parser='bool') | |
|
567 | ||
|
568 | temp_store = tempfile.gettempdir() | |
|
569 | tmp_cache_dir = jn(temp_store, 'rc_cache') | |
|
570 | ||
|
571 | # save default, cache dir, and use it for all backends later. | |
|
572 | default_cache_dir = settings_maker.make_setting( | |
|
573 | 'cache_dir', | |
|
574 | default=tmp_cache_dir, default_when_empty=True, | |
|
575 | parser='dir:ensured') | |
|
576 | ||
|
577 | # exception store cache | |
|
578 | settings_maker.make_setting( | |
|
579 | 'exception_tracker.store_path', | |
|
580 | default=jn(default_cache_dir, 'exc_store'), default_when_empty=True, | |
|
581 | parser='dir:ensured' | |
|
582 | ) | |
|
583 | ||
|
584 | settings_maker.make_setting( | |
|
585 | 'celerybeat-schedule.path', | |
|
586 | default=jn(default_cache_dir, 'celerybeat_schedule', 'celerybeat-schedule.db'), default_when_empty=True, | |
|
587 | parser='file:ensured' | |
|
588 | ) | |
|
589 | ||
|
590 | settings_maker.make_setting('exception_tracker.send_email', False, parser='bool') | |
|
591 | settings_maker.make_setting('exception_tracker.email_prefix', '[RHODECODE ERROR]', default_when_empty=True) | |
|
592 | ||
|
593 | # sessions, ensure file since no-value is memory | |
|
594 | settings_maker.make_setting('beaker.session.type', 'file') | |
|
595 | settings_maker.make_setting('beaker.session.data_dir', jn(default_cache_dir, 'session_data')) | |
|
596 | ||
|
597 | # cache_general | |
|
598 | settings_maker.make_setting('rc_cache.cache_general.backend', 'dogpile.cache.rc.file_namespace') | |
|
599 | settings_maker.make_setting('rc_cache.cache_general.expiration_time', 60 * 60 * 12, parser='int') | |
|
600 | settings_maker.make_setting('rc_cache.cache_general.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_general.db')) | |
|
601 | ||
|
602 | # cache_perms | |
|
603 | settings_maker.make_setting('rc_cache.cache_perms.backend', 'dogpile.cache.rc.file_namespace') | |
|
604 | settings_maker.make_setting('rc_cache.cache_perms.expiration_time', 60 * 60, parser='int') | |
|
605 | settings_maker.make_setting('rc_cache.cache_perms.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_perms_db')) | |
|
606 | ||
|
607 | # cache_repo | |
|
608 | settings_maker.make_setting('rc_cache.cache_repo.backend', 'dogpile.cache.rc.file_namespace') | |
|
609 | settings_maker.make_setting('rc_cache.cache_repo.expiration_time', 60 * 60 * 24 * 30, parser='int') | |
|
610 | settings_maker.make_setting('rc_cache.cache_repo.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_repo_db')) | |
|
611 | ||
|
612 | # cache_license | |
|
613 | settings_maker.make_setting('rc_cache.cache_license.backend', 'dogpile.cache.rc.file_namespace') | |
|
614 | settings_maker.make_setting('rc_cache.cache_license.expiration_time', 60 * 5, parser='int') | |
|
615 | settings_maker.make_setting('rc_cache.cache_license.arguments.filename', jn(default_cache_dir, 'rhodecode_cache_license_db')) | |
|
616 | ||
|
617 | # cache_repo_longterm memory, 96H | |
|
618 | settings_maker.make_setting('rc_cache.cache_repo_longterm.backend', 'dogpile.cache.rc.memory_lru') | |
|
619 | settings_maker.make_setting('rc_cache.cache_repo_longterm.expiration_time', 345600, parser='int') | |
|
620 | settings_maker.make_setting('rc_cache.cache_repo_longterm.max_size', 10000, parser='int') | |
|
621 | ||
|
622 | # sql_cache_short | |
|
623 | settings_maker.make_setting('rc_cache.sql_cache_short.backend', 'dogpile.cache.rc.memory_lru') | |
|
624 | settings_maker.make_setting('rc_cache.sql_cache_short.expiration_time', 30, parser='int') | |
|
625 | settings_maker.make_setting('rc_cache.sql_cache_short.max_size', 10000, parser='int') | |
|
626 | ||
|
627 | # archive_cache | |
|
628 | settings_maker.make_setting('archive_cache.store_dir', jn(default_cache_dir, 'archive_cache'), default_when_empty=True,) | |
|
629 | settings_maker.make_setting('archive_cache.cache_size_gb', 10, parser='float') | |
|
630 | settings_maker.make_setting('archive_cache.cache_shards', 10, parser='int') | |
|
631 | ||
|
632 | settings_maker.env_expand() | |
|
633 | ||
|
634 | # configure instance id | |
|
635 | config_utils.set_instance_id(settings) | |
|
636 | ||
|
637 | return settings |
@@ -23,6 +23,7 b' import functools' | |||
|
23 | 23 | import logging |
|
24 | 24 | import tempfile |
|
25 | 25 | import logging.config |
|
26 | ||
|
26 | 27 | from rhodecode.lib.type_utils import str2bool, aslist |
|
27 | 28 | |
|
28 | 29 | log = logging.getLogger(__name__) |
@@ -34,13 +35,16 b' set_keys = {' | |||
|
34 | 35 | } |
|
35 | 36 | |
|
36 | 37 | |
|
37 | class SettingsMaker(object): | |
|
38 | class SettingsMaker: | |
|
38 | 39 | |
|
39 | 40 | def __init__(self, app_settings): |
|
40 | 41 | self.settings = app_settings |
|
41 | 42 | |
|
42 | 43 | @classmethod |
|
43 | 44 | def _bool_func(cls, input_val): |
|
45 | if isinstance(input_val, bytes): | |
|
46 | # decode to str | |
|
47 | input_val = input_val.decode('utf8') | |
|
44 | 48 | return str2bool(input_val) |
|
45 | 49 | |
|
46 | 50 | @classmethod |
@@ -62,11 +66,24 b' class SettingsMaker(object):' | |||
|
62 | 66 | return input_val |
|
63 | 67 | |
|
64 | 68 | @classmethod |
|
69 | def _string_no_quote_func(cls, input_val, lower=True): | |
|
70 | """ | |
|
71 | Special case string function that detects if value is set to empty quote string | |
|
72 | e.g. | |
|
73 | ||
|
74 | core.binar_dir = "" | |
|
75 | """ | |
|
76 | ||
|
77 | input_val = cls._string_func(input_val, lower=lower) | |
|
78 | if input_val in ['""', "''"]: | |
|
79 | return '' | |
|
80 | ||
|
81 | @classmethod | |
|
65 | 82 | def _dir_func(cls, input_val, ensure_dir=False, mode=0o755): |
|
66 | 83 | |
|
67 | 84 | # ensure we have our dir created |
|
68 | 85 | if not os.path.isdir(input_val) and ensure_dir: |
|
69 | os.makedirs(input_val, mode=mode) | |
|
86 | os.makedirs(input_val, mode=mode, exist_ok=True) | |
|
70 | 87 | |
|
71 | 88 | if not os.path.isdir(input_val): |
|
72 | 89 | raise Exception(f'Dir at {input_val} does not exist') |
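The switch to `exist_ok=True` makes `_dir_func` safe when several processes bootstrap the same cache directory at once; a quick sketch of the difference (paths here are throwaway temp dirs):

```python
import os
import tempfile

# exist_ok=True makes directory creation idempotent: a second call (e.g. from a
# racing worker that lost the check-then-create race) is a no-op instead of
# raising FileExistsError.
path = os.path.join(tempfile.mkdtemp(), "rc_cache")
os.makedirs(path, mode=0o755, exist_ok=True)
os.makedirs(path, mode=0o755, exist_ok=True)  # no FileExistsError

assert os.path.isdir(path)
```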
@@ -150,6 +167,7 b' class SettingsMaker(object):' | |||
|
150 | 167 | 'list:newline': functools.partial(self._list_func, sep='/n'), |
|
151 | 168 | 'list:spacesep': functools.partial(self._list_func, sep=' '), |
|
152 | 169 | 'string': functools.partial(self._string_func, lower=lower), |
|
170 | 'string:noquote': functools.partial(self._string_no_quote_func, lower=lower), | |
|
153 | 171 | 'dir': self._dir_func, |
|
154 | 172 | 'dir:ensured': functools.partial(self._dir_func, ensure_dir=True), |
|
155 | 173 | 'file': self._file_path_func, |
@@ -19,7 +19,7 b'' | |||
|
19 | 19 | import os |
|
20 | 20 | import platform |
|
21 | 21 | |
|
22 | from rhodecode.model import init_model | |
|
22 | DEFAULT_USER = 'default' | |
|
23 | 23 | |
|
24 | 24 | |
|
25 | 25 | def configure_vcs(config): |
@@ -44,6 +44,7 b' def configure_vcs(config):' | |||
|
44 | 44 | |
|
45 | 45 | def initialize_database(config): |
|
46 | 46 | from rhodecode.lib.utils2 import engine_from_config, get_encryption_key |
|
47 | from rhodecode.model import init_model | |
|
47 | 48 | engine = engine_from_config(config, 'sqlalchemy.db1.') |
|
48 | 49 | init_model(engine, encryption_key=get_encryption_key(config)) |
|
49 | 50 | |
@@ -93,25 +94,17 b' def set_instance_id(config):' | |||
|
93 | 94 | |
|
94 | 95 | |
|
95 | 96 | def get_default_user_id(): |
|
96 | DEFAULT_USER = 'default' | |
|
97 | 97 | from sqlalchemy import text |
|
98 | 98 | from rhodecode.model import meta |
|
99 | 99 | |
|
100 | 100 | engine = meta.get_engine() |
|
101 | 101 | with meta.SA_Session(engine) as session: |
|
102 | result = session.execute(text("SELECT user_id from users where username = :uname"), {'uname': DEFAULT_USER}) | |
|
103 | user_id = result.first()[0] | |
|
102 | result = session.execute(text( | |
|
103 | "SELECT user_id from users where username = :uname" | |
|
104 | ), {'uname': DEFAULT_USER}) | |
|
105 | user = result.first() | |
|
106 | if not user: | |
|
107 | raise ValueError('Unable to retrieve default user data from DB') | |
|
108 | user_id = user[0] | |
|
104 | 109 | |
|
105 | 110 | return user_id |
|
106 | ||
|
107 | ||
|
108 | def get_default_base_path(): | |
|
109 | from sqlalchemy import text | |
|
110 | from rhodecode.model import meta | |
|
111 | ||
|
112 | engine = meta.get_engine() | |
|
113 | with meta.SA_Session(engine) as session: | |
|
114 | result = session.execute(text("SELECT ui_value from rhodecode_ui where ui_key = '/'")) | |
|
115 | base_path = result.first()[0] | |
|
116 | ||
|
117 | return base_path |
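The hardened lookup above boils down to: never unpack `result.first()` before checking it for `None`. A stdlib `sqlite3` sketch of the same pattern (table and column names mirror the query above, but this is an illustration, not the actual RhodeCode model layer):

```python
import sqlite3

DEFAULT_USER = 'default'

def get_default_user_id(conn):
    # fetchone() returns None when no row matches; unpacking it blindly
    # would raise TypeError, so fail with a clear error instead.
    row = conn.execute(
        "SELECT user_id FROM users WHERE username = ?", (DEFAULT_USER,)
    ).fetchone()
    if not row:
        raise ValueError('Unable to retrieve default user data from DB')
    return row[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER, username TEXT)")

try:
    get_default_user_id(conn)          # empty table -> clear error
except ValueError as err:
    print(err)

conn.execute("INSERT INTO users VALUES (1, 'default')")
print(get_default_user_id(conn))
```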
@@ -15,13 +15,13 b'' | |||
|
15 | 15 | # This program is dual-licensed. If you wish to learn more about the |
|
16 | 16 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | ||
|
18 | 19 | import logging |
|
19 | 20 | import datetime |
|
20 | import typing | |
|
21 | 21 | |
|
22 | 22 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
23 | from pyramid.threadlocal import get_current_request | |
|
24 | 23 | |
|
24 | from rhodecode.lib.pyramid_utils import get_current_request | |
|
25 | 25 | from rhodecode.lib.utils2 import AttributeDict |
|
26 | 26 | |
|
27 | 27 | |
@@ -41,8 +41,9 b' class RhodecodeEvent(object):' | |||
|
41 | 41 | name = "RhodeCodeEvent" |
|
42 | 42 | no_url_set = '<no server_url available>' |
|
43 | 43 | |
|
44 | def __init__(self, request=None): | |
|
44 | def __init__(self, request=None, actor=None): | |
|
45 | 45 | self._request = request |
|
46 | self._actor = actor | |
|
46 | 47 | self.utc_timestamp = datetime.datetime.utcnow() |
|
47 | 48 | |
|
48 | 49 | def __repr__(self): |
@@ -72,16 +73,24 b' class RhodecodeEvent(object):' | |||
|
72 | 73 | |
|
73 | 74 | @property |
|
74 | 75 | def actor(self): |
|
76 | from rhodecode.lib.auth import AuthUser | |
|
77 | ||
|
78 | # if an explicit actor is specified, use this | |
|
79 | if self._actor: | |
|
80 | return self._actor | |
|
81 | ||
|
75 | 82 | auth_user = self.auth_user |
|
76 | if auth_user: | |
|
83 | log.debug('Got integration actor: %s', auth_user) | |
|
84 | if isinstance(auth_user, AuthUser): | |
|
77 | 85 | instance = auth_user.get_instance() |
|
86 | # we can't find this DB user... | |
|
78 | 87 | if not instance: |
|
79 | 88 | return AttributeDict(dict( |
|
80 | 89 | username=auth_user.username, |
|
81 | 90 | user_id=auth_user.user_id, |
|
82 | 91 | )) |
|
83 | return instance | |
|
84 | ||
|
92 | elif auth_user: | |
|
93 | return auth_user | |
|
85 | 94 | return SYSTEM_USER |
|
86 | 95 | |
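The reworked `actor` property is a fallback chain: explicit `actor` argument, then the DB instance resolved from an `AuthUser`, then a plain mapping when the DB user is gone, then the raw `auth_user`, and finally `SYSTEM_USER`. A condensed stand-alone sketch (stub classes; names mirror the diff but are illustrative):

```python
SYSTEM_USER = "system"

class AuthUser:
    def __init__(self, username, user_id, instance=None):
        self.username = username
        self.user_id = user_id
        self._instance = instance

    def get_instance(self):
        # in RhodeCode this resolves the DB User row; None if it vanished
        return self._instance

class Event:
    def __init__(self, auth_user=None, actor=None):
        self.auth_user = auth_user
        self._actor = actor

    @property
    def actor(self):
        if self._actor:                      # explicit actor always wins
            return self._actor
        auth_user = self.auth_user
        if isinstance(auth_user, AuthUser):
            instance = auth_user.get_instance()
            if not instance:                 # DB user gone: degrade to a mapping
                return {"username": auth_user.username,
                        "user_id": auth_user.user_id}
            return instance
        elif auth_user:
            return auth_user
        return SYSTEM_USER

print(Event().actor)                                # system
print(Event(auth_user=AuthUser("bob", 7)).actor)    # plain mapping fallback
print(Event(actor="svc").actor)                     # svc
```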
|
87 | 96 | @property |
@@ -129,3 +138,4 b' class FtsBuild(RhodecodeEvent):' | |||
|
129 | 138 | """ |
|
130 | 139 | name = 'fts-build' |
|
131 | 140 | display_name = 'Start FTS Build' |
|
141 |
@@ -156,11 +156,11 b' class RepoEvent(RhodeCodeIntegrationEven' | |||
|
156 | 156 | Base class for events acting on a repository. |
|
157 | 157 | """ |
|
158 | 158 | |
|
159 | def __init__(self, repo): | |
|
159 | def __init__(self, repo, actor=None): | |
|
160 | 160 | """ |
|
161 | 161 | :param repo: a :class:`Repository` instance |
|
162 | 162 | """ |
|
163 | super().__init__() | |
|
163 | super().__init__(actor=actor) | |
|
164 | 164 | self.repo = repo |
|
165 | 165 | |
|
166 | 166 | def as_dict(self): |
@@ -11,9 +11,9 b' import importlib' | |||
|
11 | 11 | from inspect import istraceback |
|
12 | 12 | |
|
13 | 13 | from collections import OrderedDict |
|
14 | from rhodecode.lib.logging_formatter import _inject_req_id, ExceptionAwareFormatter | |
|
15 | from rhodecode.lib.ext_json import sjson as json | |
|
16 | 14 | |
|
15 | from ...logging_formatter import _inject_req_id, ExceptionAwareFormatter | |
|
16 | from ...ext_json import sjson as json | |
|
17 | 17 | |
|
18 | 18 | ZERO = timedelta(0) |
|
19 | 19 | HOUR = timedelta(hours=1) |
@@ -78,7 +78,7 b' class JsonEncoder(json.JSONEncoder):' | |||
|
78 | 78 | return str(obj) |
|
79 | 79 | |
|
80 | 80 | try: |
|
81 | return super( | |
|
81 | return super().default(obj) | |
|
82 | 82 | |
|
83 | 83 | except TypeError: |
|
84 | 84 | try: |
@@ -194,7 +194,7 b' class JsonFormatter(ExceptionAwareFormat' | |||
|
194 | 194 | |
|
195 | 195 | def serialize_log_record(self, log_record): |
|
196 | 196 | """Returns the final representation of the log record.""" |
|
197 | return " | |
|
197 | return "{}{}".format(self.prefix, self.jsonify_log_record(log_record)) | |
|
198 | 198 | |
|
199 | 199 | def format(self, record): |
|
200 | 200 | """Formats a log record and serializes to json""" |
@@ -102,7 +102,7 b' class NotExpirable(RuntimeError):' | |||
|
102 | 102 | pass |
|
103 | 103 | |
|
104 | 104 | |
|
105 | class Lock(object): | |
|
105 | class Lock: | |
|
106 | 106 | """ |
|
107 | 107 | A Lock context manager implemented via redis SETNX/BLPOP. |
|
108 | 108 | """ |
@@ -111,11 +111,12 b' class Lock(object):' | |||
|
111 | 111 | extend_script = None |
|
112 | 112 | reset_script = None |
|
113 | 113 | reset_all_script = None |
|
114 | blocking = None | |
|
114 | 115 | |
|
115 | 116 | _lock_renewal_interval: float |
|
116 | 117 | _lock_renewal_thread: Union[threading.Thread, None] |
|
117 | 118 | |
|
118 | def __init__(self, redis_client, name, expire=None, id=None, auto_renewal=False, strict=True, signal_expire=1000): | |
|
119 | def __init__(self, redis_client, name, expire=None, id=None, auto_renewal=False, strict=True, signal_expire=1000, blocking=True): | |
|
119 | 120 | """ |
|
120 | 121 | :param redis_client: |
|
121 | 122 | An instance of :class:`~StrictRedis`. |
@@ -143,6 +144,9 b' class Lock(object):' | |||
|
143 | 144 | If set ``True`` then the ``redis_client`` needs to be an instance of ``redis.StrictRedis``. |
|
144 | 145 | :param signal_expire: |
|
145 | 146 | Advanced option to override signal list expiration in milliseconds. Increase it for very slow clients. Default: ``1000``. |
|
147 | :param blocking: | |
|
148 | Boolean value specifying whether lock should be blocking or not. | |
|
149 | Used in `__enter__` method. | |
|
146 | 150 | """ |
|
147 | 151 | if strict and not isinstance(redis_client, StrictRedis): |
|
148 | 152 | raise ValueError("redis_client must be instance of StrictRedis. " |
@@ -179,6 +183,8 b' class Lock(object):' | |||
|
179 | 183 | else None) |
|
180 | 184 | self._lock_renewal_thread = None |
|
181 | 185 | |
|
186 | self.blocking = blocking | |
|
187 | ||
|
182 | 188 | self.register_scripts(redis_client) |
|
183 | 189 | |
|
184 | 190 | @classmethod |
@@ -342,9 +348,11 b' class Lock(object):' | |||
|
342 | 348 | loggers["refresh.exit"].debug("Renewal thread for Lock(%r) exited.", self._name) |
|
343 | 349 | |
|
344 | 350 | def __enter__(self): |
|
345 | acquired = self.acquire(blocking=True) | 
|
351 | acquired = self.acquire(blocking=self.blocking) | 
|
346 | 352 | if not acquired: |
|
353 | if self.blocking: | |
|
347 | 354 | raise AssertionError(f"Lock({self._name}) wasn't acquired, but blocking=True was used!") |
|
355 | raise NotAcquired(f"Lock({self._name}) is not acquired or it already expired.") | |
|
348 | 356 | return self |
|
349 | 357 | |
|
350 | 358 | def __exit__(self, exc_type=None, exc_value=None, traceback=None): |
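The new `blocking` flag changes what `__enter__` raises when acquisition fails: an `AssertionError` for a blocking lock (which should never fail silently) versus a `NotAcquired` signal for a non-blocking one. A minimal standalone sketch of that control flow (the `acquire` stub and class names here are illustrative, not the real redis-backed Lock):

```python
class NotAcquired(RuntimeError):
    pass


class LockSketch:
    """Illustrates the __enter__ semantics added in this change."""

    def __init__(self, name, blocking=True, acquire_result=False):
        self._name = name
        self.blocking = blocking
        self._acquire_result = acquire_result  # stub for the real redis SETNX/BLPOP

    def acquire(self, blocking=True):
        return self._acquire_result

    def __enter__(self):
        acquired = self.acquire(blocking=self.blocking)
        if not acquired:
            if self.blocking:
                # a blocking acquire that returns False indicates a logic error
                raise AssertionError(f"Lock({self._name}) wasn't acquired, but blocking=True was used!")
            # non-blocking mode signals contention with a dedicated exception
            raise NotAcquired(f"Lock({self._name}) is not acquired or it already expired.")
        return self

    def __exit__(self, *exc_info):
        pass
```

Callers using `blocking=False` can now treat lock contention as an expected, catchable condition instead of a hard assertion.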
@@ -1,5 +1,3 b'' | |||
|
1 | ||
|
2 | ||
|
3 | 1 |
|
|
4 | 2 | |
|
5 | 3 | from .stream import TCPStatsClient, UnixSocketStatsClient # noqa |
@@ -26,9 +24,10 b' def client_from_config(configuration, pr' | |||
|
26 | 24 | from pyramid.settings import asbool |
|
27 | 25 | |
|
28 | 26 | _config = statsd_config(configuration, prefix) |
|
27 | statsd_flag = _config.get('enabled') | |
|
29 | 28 | statsd_enabled = asbool(_config.pop('enabled', False)) |
|
30 | 29 | if not statsd_enabled: |
|
31 | log.debug('statsd client not enabled by statsd.enabled = flag, skipping...') | |
|
30 | log.debug('statsd client not enabled by statsd.enabled = %s flag, skipping...', statsd_flag) | |
|
32 | 31 | return |
|
33 | 32 | |
|
34 | 33 | host = _config.pop('statsd_host', HOST) |
@@ -1,5 +1,3 b'' | |||
|
1 | ||
|
2 | ||
|
3 | 1 |
|
|
4 | 2 | import random |
|
5 | 3 | from collections import deque |
@@ -31,7 +29,7 b' def normalize_tags(tag_list):' | |||
|
31 | 29 | return _normalize_tags_with_cache(tuple(tag_list)) |
|
32 | 30 | |
|
33 | 31 | |
|
34 | class StatsClientBase(object): | 
|
32 | class StatsClientBase: | 
|
35 | 33 | """A Base class for various statsd clients.""" |
|
36 | 34 | |
|
37 | 35 | def close(self): |
@@ -73,7 +71,7 b' class StatsClientBase(object):' | |||
|
73 | 71 | |
|
74 | 72 | def incr(self, stat, count=1, rate=1, tags=None): |
|
75 | 73 | """Increment a stat by `count`.""" |
|
76 | self._send_stat(stat, '%s|c' % count, rate, tags) | 
|
74 | self._send_stat(stat, f'{count}|c', rate, tags) | 
|
77 | 75 | |
|
78 | 76 | def decr(self, stat, count=1, rate=1, tags=None): |
|
79 | 77 | """Decrement a stat by `count`.""" |
@@ -87,18 +85,18 b' class StatsClientBase(object):' | |||
|
87 | 85 | return |
|
88 | 86 | with self.pipeline() as pipe: |
|
89 | 87 | pipe._send_stat(stat, '0|g', 1) |
|
90 | pipe._send_stat(stat, '%s|g' % value, 1) | 
|
88 | pipe._send_stat(stat, f'{value}|g', 1) | 
|
91 | 89 | else: |
|
92 | 90 | prefix = '+' if delta and value >= 0 else '' |
|
93 | self._send_stat(stat, '%s%s|g' % (prefix, value), rate, tags) | 
|
91 | self._send_stat(stat, f'{prefix}{value}|g', rate, tags) | 
|
94 | 92 | |
|
95 | 93 | def set(self, stat, value, rate=1): |
|
96 | 94 | """Set a set value.""" |
|
97 | self._send_stat(stat, '%s|s' % value, rate) | 
|
95 | self._send_stat(stat, f'{value}|s', rate) | 
|
98 | 96 | |
|
99 | 97 | def histogram(self, stat, value, rate=1, tags=None): |
|
100 | 98 | """Set a histogram""" |
|
101 | self._send_stat(stat, '%s|h' % value, rate, tags) | 
|
99 | self._send_stat(stat, f'{value}|h', rate, tags) | 
|
102 | 100 | |
|
103 | 101 | def _send_stat(self, stat, value, rate, tags=None): |
|
104 | 102 | self._after(self._prepare(stat, value, rate, tags)) |
@@ -110,10 +108,10 b' class StatsClientBase(object):' | |||
|
110 | 108 | if rate < 1: |
|
111 | 109 | if random.random() > rate: |
|
112 | 110 | return |
|
113 | value = '%s|@%s' % (value, rate) | 
|
111 | value = f'{value}|@{rate}' | 
|
114 | 112 | |
|
115 | 113 | if self._prefix: |
|
116 | stat = '%s.%s' % (self._prefix, stat) | 
|
114 | stat = f'{self._prefix}.{stat}' | 
|
117 | 115 | |
|
118 | 116 | res = '%s:%s%s' % ( |
|
119 | 117 | stat, |
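The f-string refactors above all build statsd wire-format payloads of the shape `<stat>:<value>|<type>[|@<rate>]`. A small standalone helper showing the same formatting rules (illustrative only; a deterministic variant without the random sampling gate the real `_prepare` applies):

```python
def prepare_stat(stat: str, value: str, rate: float = 1.0, prefix: str = '') -> str:
    """Format a statsd payload: optional sample rate suffix and stat prefix."""
    if rate < 1:
        # sampled metrics carry the rate so the server can scale counts back up
        value = f'{value}|@{rate}'
    if prefix:
        stat = f'{prefix}.{stat}'
    return f'{stat}:{value}'
```

With this, a counter increment at 50% sampling serializes to e.g. `requests:1|c|@0.5`.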
@@ -1,5 +1,3 b'' | |||
|
1 | ||
|
2 | ||
|
3 | 1 |
|
|
4 | 2 | from time import perf_counter as time_now |
|
5 | 3 | |
@@ -11,7 +9,7 b' def safe_wraps(wrapper, *args, **kwargs)' | |||
|
11 | 9 | return functools.wraps(wrapper, *args, **kwargs) |
|
12 | 10 | |
|
13 | 11 | |
|
14 | class Timer(object): | 
|
12 | class Timer: | 
|
15 | 13 | """A context manager/decorator for statsd.timing().""" |
|
16 | 14 | |
|
17 | 15 | def __init__(self, client, stat, rate=1, tags=None, use_decimals=True, auto_send=True): |
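The `Timer` above wraps `statsd.timing()` behind a context manager. A minimal standalone version of the same idea, with a stubbed client so it runs in isolation (names and the list-based client are illustrative, not the real statsd client):

```python
from time import perf_counter as time_now


class TimerSketch:
    """Context manager that reports elapsed milliseconds to a client stub."""

    def __init__(self, client, stat, rate=1):
        self.client = client
        self.stat = stat
        self.rate = rate
        self.ms = None

    def __enter__(self):
        self._start = time_now()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        elapsed = time_now() - self._start
        self.ms = elapsed * 1000.0
        # the real Timer forwards to client.timing(stat, ms, rate)
        self.client.append((self.stat, self.ms, self.rate))


sent = []
with TimerSketch(sent, 'view.render'):
    sum(range(1000))  # some timed work
```

The same object can back a decorator as well, which is why the real class is documented as "a context manager/decorator".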
@@ -1,5 +1,3 b'' | |||
|
1 | ||
|
2 | ||
|
3 | 1 |
|
|
4 | 2 | |
|
5 | 3 | from .base import StatsClientBase, PipelineBase |
@@ -8,7 +6,7 b' from .base import StatsClientBase, Pipel' | |||
|
8 | 6 | class Pipeline(PipelineBase): |
|
9 | 7 | |
|
10 | 8 | def __init__(self, client): |
|
11 | super(Pipeline, self).__init__(client) | 
|
9 | super().__init__(client) | 
|
12 | 10 | self._maxudpsize = client._maxudpsize |
|
13 | 11 | |
|
14 | 12 | def _send(self): |
@@ -258,8 +258,7 b' class ActionParser(object):' | |||
|
258 | 258 | commit = repo.get_commit(commit_id=commit_id) |
|
259 | 259 | commits.append(commit) |
|
260 | 260 | except CommitDoesNotExistError: |
|
261 | log.error( | |
|
262 | 'cannot find commit id %s in this repository', | |
|
261 | log.error('cannot find commit id %s in this repository', | |
|
263 | 262 | commit_id) |
|
264 | 263 | commits.append(commit_id) |
|
265 | 264 | continue |
@@ -1,4 +1,4 b'' | |||
|
1 | # Copyright (C) 2015-2023 RhodeCode GmbH | 
|
1 | # Copyright (C) 2015-2024 RhodeCode GmbH | 
|
2 | 2 | # |
|
3 | 3 | # This program is free software: you can redistribute it and/or modify |
|
4 | 4 | # it under the terms of the GNU Affero General Public License, version 3 |
@@ -16,73 +16,162 b'' | |||
|
16 | 16 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | 18 | |
|
19 | import codecs | |
|
20 | import hashlib | |
|
19 | 21 | import logging |
|
20 | 22 | import os |
|
21 | import diskcache | |
|
22 | from diskcache import RLock | |
|
23 | import typing | |
|
24 | ||
|
25 | import fsspec | |
|
26 | ||
|
27 | from .base import BaseCache, BaseShard | |
|
28 | from ..utils import ShardFileReader, NOT_GIVEN | |
|
29 | from ...type_utils import str2bool | |
|
23 | 30 | |
|
24 | 31 | log = logging.getLogger(__name__) |
|
25 | 32 | |
|
26 | cache_meta = None | |
|
33 | ||
|
34 | class FileSystemShard(BaseShard): | |
|
35 | ||
|
36 | def __init__(self, index, directory, directory_folder, fs, **settings): | |
|
37 | self._index: int = index | |
|
38 | self._directory: str = directory | |
|
39 | self._directory_folder: str = directory_folder | |
|
40 | self.storage_type: str = 'directory' | |
|
27 | 41 | |
|
42 | self.fs = fs | |
|
43 | ||
|
44 | @property | |
|
45 | def directory(self) -> str: | |
|
46 | """Cache directory final path.""" | |
|
47 | return os.path.join(self._directory, self._directory_folder) | |
|
48 | ||
|
49 | def _get_keyfile(self, archive_key) -> tuple[str, str]: | |
|
50 | key_file: str = f'{archive_key}.{self.key_suffix}' | |
|
51 | return key_file, os.path.join(self.directory, key_file) | |
|
28 | 52 | |
|
29 | class ReentrantLock(RLock): | |
|
30 | def __enter__(self): | |
|
31 | reentrant_lock_key = self._key | |
|
53 | def _get_writer(self, path, mode): | |
|
54 | for count in range(1, 11): | |
|
55 | try: | |
|
56 | # Another cache may have deleted the directory before | |
|
57 | # the file could be opened. | |
|
58 | return self.fs.open(path, mode) | |
|
59 | except OSError: | |
|
60 | if count == 10: | |
|
61 | # Give up after 10 tries to open the file. | |
|
62 | raise | |
|
63 | continue | |
|
32 | 64 | |
|
33 | log.debug('Acquire ReentrantLock(key=%s) for archive cache generation...', reentrant_lock_key) | |
|
34 | #self.acquire() | |
|
35 | log.debug('Lock for key=%s acquired', reentrant_lock_key) | |
|
65 | def _write_file(self, full_path, iterator, mode): | |
|
66 | ||
|
67 | # ensure dir exists | |
|
68 | destination, _ = os.path.split(full_path) | |
|
69 | if not self.fs.exists(destination): | |
|
70 | self.fs.makedirs(destination) | |
|
71 | ||
|
72 | writer = self._get_writer(full_path, mode) | |
|
36 | 73 | |
|
37 | def __exit__(self, *exc_info): | |
|
38 | #self.release() | |
|
39 | pass | |
|
74 | digest = hashlib.sha256() | |
|
75 | with writer: | |
|
76 | size = 0 | |
|
77 | for chunk in iterator: | |
|
78 | size += len(chunk) | |
|
79 | digest.update(chunk) | |
|
80 | writer.write(chunk) | |
|
81 | writer.flush() | |
|
82 | # Get the file descriptor | |
|
83 | fd = writer.fileno() | |
|
40 | 84 | |
|
85 | # Sync the file descriptor to disk, helps with NFS cases... | |
|
86 | os.fsync(fd) | |
|
87 | sha256 = digest.hexdigest() | |
|
88 | log.debug('written new archive cache under %s, sha256: %s', full_path, sha256) | |
|
89 | return size, sha256 | |
|
41 | 90 | |
|
42 | def get_archival_config(config): | |
|
91 | def store(self, key, value_reader, metadata: dict | None = None): | |
|
92 | return self._store(key, value_reader, metadata, mode='xb') | |
|
43 | 93 | |
|
44 | final_config = { | |
|
45 | 'archive_cache.eviction_policy': 'least-frequently-used' | |
|
46 | } | |
|
94 | def fetch(self, key, retry=NOT_GIVEN, | |
|
95 | retry_attempts=NOT_GIVEN, retry_backoff=1, **kwargs) -> tuple[ShardFileReader, dict]: | |
|
96 | return self._fetch(key, retry, retry_attempts, retry_backoff) | |
|
97 | ||
|
98 | def remove(self, key): | |
|
99 | return self._remove(key) | |
|
100 | ||
|
101 | def random_filename(self): | |
|
102 | """Return filename and full-path tuple for file storage. | |
|
47 | 103 |
|
|
48 | for k, v in config.items(): | |
|
49 | if k.startswith('archive_cache'): | |
|
50 | final_config[k] = v | |
|
104 | Filename will be a randomly generated 28 character hexadecimal string | |
|
105 | with ".archive_cache" suffixed. Two levels of sub-directories will be used to | |
|
106 | reduce the size of directories. On older filesystems, lookups in | |
|
107 | directories with many files may be slow. | |
|
108 | """ | |
|
109 | ||
|
110 | hex_name = codecs.encode(os.urandom(16), 'hex').decode('utf-8') | |
|
51 | 111 | |
|
52 | return final_config | |
|
112 | archive_name = hex_name[4:] + '.archive_cache' | |
|
113 | filename = f"{hex_name[:2]}/{hex_name[2:4]}/{archive_name}" | |
|
114 | ||
|
115 | full_path = os.path.join(self.directory, filename) | |
|
116 | return archive_name, full_path | |
|
117 | ||
|
118 | def __repr__(self): | |
|
119 | return f'{self.__class__.__name__}(index={self._index}, dir={self.directory})' | |
|
53 | 120 | |
|
54 | 121 | |
|
55 | def get_archival_cache_store(config): | |
|
122 | class FileSystemFanoutCache(BaseCache): | |
|
123 | shard_name: str = 'shard_{:03d}' | |
|
124 | shard_cls = FileSystemShard | |
|
56 | 125 | |
|
57 | global cache_meta | |
|
58 | if cache_meta is not None: | |
|
59 | return cache_meta | |
|
126 | def __init__(self, locking_url, **settings): | |
|
127 | """ | |
|
128 | Initialize file system cache instance. | |
|
129 | ||
|
130 | :param str locking_url: redis url for a lock | |
|
131 | :param settings: settings dict | |
|
60 | 132 |
|
|
61 | config = get_archival_config(config) | |
|
133 | """ | |
|
134 | self._locking_url = locking_url | |
|
135 | self._config = settings | |
|
136 | cache_dir = self.get_conf('archive_cache.filesystem.store_dir') | |
|
137 | directory = str(cache_dir) | |
|
138 | directory = os.path.expanduser(directory) | |
|
139 | directory = os.path.expandvars(directory) | |
|
140 | self._directory = directory | |
|
141 | self._storage_path = directory # common path for all from BaseCache | |
|
62 | 142 | |
|
63 | archive_cache_dir = config['archive_cache.store_dir'] | |
|
64 | archive_cache_size_gb = config['archive_cache.cache_size_gb'] | |
|
65 | archive_cache_shards = config['archive_cache.cache_shards'] | |
|
66 | archive_cache_eviction_policy = config['archive_cache.eviction_policy'] | |
|
143 | self._shard_count = int(self.get_conf('archive_cache.filesystem.cache_shards', pop=True)) | |
|
144 | if self._shard_count < 1: | |
|
145 | raise ValueError('cache_shards must be 1 or more') | |
|
67 | 146 | |
|
68 | log.debug('Initializing archival cache instance under %s', archive_cache_dir) | |
|
147 | self._eviction_policy = self.get_conf('archive_cache.filesystem.eviction_policy', pop=True) | |
|
148 | self._cache_size_limit = self.gb_to_bytes(int(self.get_conf('archive_cache.filesystem.cache_size_gb'))) | |
|
69 | 149 | |
|
70 | # check if it's ok to write, and re-create the archive cache | |
|
71 | if not os.path.isdir(archive_cache_dir): | |
|
72 | os.makedirs(archive_cache_dir, exist_ok=True) | |
|
150 | self.retry = str2bool(self.get_conf('archive_cache.filesystem.retry', pop=True)) | |
|
151 | self.retry_attempts = int(self.get_conf('archive_cache.filesystem.retry_attempts', pop=True)) | |
|
152 | self.retry_backoff = int(self.get_conf('archive_cache.filesystem.retry_backoff', pop=True)) | |
|
153 | ||
|
154 | log.debug('Initializing %s archival cache instance', self) | |
|
155 | fs = fsspec.filesystem('file') | |
|
156 | # check if it's ok to write, and re-create the archive cache main dir | |
|
157 | # A directory is the virtual equivalent of a physical file cabinet: | 
|
158 | # a container for organizing digital data. | 
|
159 | # It can hold files as well as other, | 
|
160 | # nested subdirectories. | 
|
161 | if not fs.exists(self._directory): | |
|
162 | fs.makedirs(self._directory, exist_ok=True) | |
|
73 | 163 | |
|
74 | d_cache = diskcache.FanoutCache( | |
|
75 | archive_cache_dir, shards=archive_cache_shards, | |
|
76 | cull_limit=0, # manual eviction required | |
|
77 | size_limit=archive_cache_size_gb * 1024 * 1024 * 1024, | |
|
78 | eviction_policy=archive_cache_eviction_policy, | |
|
79 | timeout=30 | |
|
164 | self._shards = tuple( | |
|
165 | self.shard_cls( | |
|
166 | index=num, | |
|
167 | directory=directory, | |
|
168 | directory_folder=self.shard_name.format(num), | |
|
169 | fs=fs, | |
|
170 | **settings, | |
|
80 | 171 | ) |
|
81 | cache_meta = d_cache | |
|
82 | return cache_meta | |
|
83 | ||
|
172 | for num in range(self._shard_count) | |
|
173 | ) | |
|
174 | self._hash = self._shards[0].hash | |
|
84 | 175 | |
|
85 | def includeme(config): | |
|
86 | # init our cache at start | |
|
87 | settings = config.get_settings() | |
|
88 | get_archival_cache_store(settings) | |
|
176 | def _get_size(self, shard, archive_path): | |
|
177 | return os.stat(archive_path).st_size |
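The `random_filename` scheme above spreads cache entries across two levels of two-hex-character directories so no single directory accumulates too many files. A standalone sketch of the same layout logic (the base directory argument is illustrative):

```python
import codecs
import os


def random_archive_filename(base_dir: str) -> tuple[str, str]:
    """Return (archive_name, full_path) using two sharding directory levels."""
    # 16 random bytes -> 32 hex chars; the first 4 chars become two dir levels
    hex_name = codecs.encode(os.urandom(16), 'hex').decode('utf-8')
    archive_name = hex_name[4:] + '.archive_cache'
    filename = f"{hex_name[:2]}/{hex_name[2:4]}/{archive_name}"
    return archive_name, os.path.join(base_dir, filename)
```

The remaining 28 hex characters form the file name itself, matching the docstring of `random_filename`.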
@@ -1688,7 +1688,7 b' def get_csrf_token(session, force_new=Fa' | |||
|
1688 | 1688 | |
|
1689 | 1689 | |
|
1690 | 1690 | def get_request(perm_class_instance): |
|
1691 | from pyramid.threadlocal import get_current_request | 
|
1691 | from rhodecode.lib.pyramid_utils import get_current_request | 
|
1692 | 1692 | pyramid_request = get_current_request() |
|
1693 | 1693 | return pyramid_request |
|
1694 | 1694 |
@@ -347,8 +347,6 b' def attach_context_attributes(context, r' | |||
|
347 | 347 | context.ssh_key_generator_enabled = str2bool( |
|
348 | 348 | config.get('ssh.enable_ui_key_generator', 'true')) |
|
349 | 349 | |
|
350 | context.visual.allow_repo_location_change = str2bool( | |
|
351 | config.get('allow_repo_location_change', True)) | |
|
352 | 350 | context.visual.allow_custom_hooks_settings = str2bool( |
|
353 | 351 | config.get('allow_custom_hooks_settings', True)) |
|
354 | 352 | context.debug_style = str2bool(config.get('debug_style', False)) |
@@ -567,7 +565,7 b' def add_events_routes(config):' | |||
|
567 | 565 | |
|
568 | 566 | |
|
569 | 567 | def bootstrap_config(request, registry_name='RcTestRegistry'): |
|
570 | from rhodecode.config.middleware import sanitize_settings_and_apply_defaults | 
|
568 | from rhodecode.config.config_maker import sanitize_settings_and_apply_defaults | 
|
571 | 569 | import pyramid.testing |
|
572 | 570 | registry = pyramid.testing.Registry(registry_name) |
|
573 | 571 | |
@@ -580,7 +578,7 b' def bootstrap_config(request, registry_n' | |||
|
580 | 578 | config.include('pyramid_mako') |
|
581 | 579 | config.include('rhodecode.lib.rc_beaker') |
|
582 | 580 | config.include('rhodecode.lib.rc_cache') |
|
583 | config.include('rhodecode.lib.rc_cache.archive_cache') | 
|
581 | config.include('rhodecode.lib.archive_cache') | 
|
584 | 582 | add_events_routes(config) |
|
585 | 583 | |
|
586 | 584 | return config |
@@ -193,6 +193,7 b' def create_repo(form_data, cur_user):' | |||
|
193 | 193 | enable_downloads=enable_downloads, |
|
194 | 194 | state=state |
|
195 | 195 | ) |
|
196 | ||
|
196 | 197 | Session().commit() |
|
197 | 198 | |
|
198 | 199 | # now create this repo on Filesystem |
@@ -402,6 +403,11 b' def sync_last_update_for_objects(*args, ' | |||
|
402 | 403 | |
|
403 | 404 | |
|
404 | 405 | @async_task(ignore_result=True, base=RequestContextTask) |
|
406 | def test_celery_exception(msg): | |
|
407 | raise Exception(f'Test exception: {msg}') | |
|
408 | ||
|
409 | ||
|
410 | @async_task(ignore_result=True, base=RequestContextTask) | |
|
405 | 411 | def sync_last_update(*args, **kwargs): |
|
406 | 412 | sync_last_update_for_objects(*args, **kwargs) |
|
407 | 413 |
@@ -201,7 +201,7 b' class DbManage(object):' | |||
|
201 | 201 | f'version {curr_version} to version {__dbversion__}') |
|
202 | 202 | |
|
203 | 203 | # CALL THE PROPER ORDER OF STEPS TO PERFORM FULL UPGRADE |
|
204 | _step = None | 
|
204 | final_step = 'latest' | 
|
205 | 205 | for step in upgrade_steps: |
|
206 | 206 | notify(f'performing upgrade step {step}') |
|
207 | 207 | time.sleep(0.5) |
@@ -210,10 +210,10 b' class DbManage(object):' | |||
|
210 | 210 | self.sa.rollback() |
|
211 | 211 | notify(f'schema upgrade for step {step} completed') |
|
212 | 212 | |
|
213 | _step = step | |
|
213 | final_step = step | |
|
214 | 214 | |
|
215 | 215 | self.run_post_migration_tasks() |
|
216 | notify(f'upgrade to version {step} successful') | |
|
216 | notify(f'upgrade to version {final_step} successful') | |
|
217 | 217 | |
|
218 | 218 | def fix_repo_paths(self): |
|
219 | 219 | """ |
@@ -1,3 +1,21 b'' | |||
|
1 | # Copyright (C) 2011-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
1 | 19 | from rhodecode.lib.str_utils import safe_bytes |
|
2 | 20 | from rhodecode.lib.encrypt import encrypt_data, validate_and_decrypt_data |
|
3 | 21 | from rhodecode.lib.encrypt2 import Encryptor |
@@ -9,6 +27,10 b' def get_default_algo():' | |||
|
9 | 27 | import rhodecode |
|
10 | 28 | return rhodecode.CONFIG.get('rhodecode.encrypted_values.algorithm') or 'aes' |
|
11 | 29 | |
|
30 | def get_strict_mode(): | |
|
31 | import rhodecode | |
|
32 | return rhodecode.ConfigGet().get_bool('rhodecode.encrypted_values.strict') or False | |
|
33 | ||
|
12 | 34 | |
|
13 | 35 | def encrypt_value(value: bytes, enc_key: bytes, algo: str = ''): |
|
14 | 36 | if not algo: |
@@ -29,16 +51,21 b' def encrypt_value(value: bytes, enc_key:' | |||
|
29 | 51 | return value |
|
30 | 52 | |
|
31 | 53 | |
|
32 | def decrypt_value(value: bytes, enc_key: bytes, algo: str = '', strict_mode: bool = False): | 
|
54 | def decrypt_value(value: bytes, enc_key: bytes, algo: str = '', strict_mode: bool | None = None): | 
|
55 | ||
|
56 | if strict_mode is None: | |
|
57 | # we use the config value rather than an explicit True/False | 
|
58 | strict_mode = get_strict_mode() | |
|
59 | ||
|
60 | enc_key = safe_bytes(enc_key) | |
|
61 | value = safe_bytes(value) | |
|
33 | 62 | |
|
34 | 63 | if not algo: |
|
35 | 64 | # not explicit algo, just use what's set by config |
|
36 | algo = get_default_algo() | |
|
65 | algo = Encryptor.detect_enc_algo(value) or get_default_algo() | |
|
37 | 66 | if algo not in ALLOWED_ALGOS: |
|
38 | 67 | raise ValueError(f'Bad encryption algorithm, should be {ALLOWED_ALGOS}, got: {algo}') |
|
39 | 68 | |
|
40 | enc_key = safe_bytes(enc_key) | |
|
41 | value = safe_bytes(value) | |
|
42 | 69 | safe = not strict_mode |
|
43 | 70 | |
|
44 | 71 | if algo == 'aes': |
@@ -43,6 +43,7 b' class InvalidDecryptedValue(str):' | |||
|
43 | 43 | content = f'<{cls.__name__}({content[:16]}...)>' |
|
44 | 44 | return str.__new__(cls, content) |
|
45 | 45 | |
|
46 | ||
|
46 | 47 | KEY_FORMAT = b'enc$aes_hmac${1}' |
|
47 | 48 | |
|
48 | 49 |
@@ -23,8 +23,25 b' class InvalidDecryptedValue(str):' | |||
|
23 | 23 | |
|
24 | 24 | class Encryptor(object): |
|
25 | 25 | key_format = b'enc2$salt:{1}$data:{2}' |
|
26 | ||
|
26 | 27 | pref_len = 5 # salt:, data: |
|
27 | 28 | |
|
29 | @classmethod | |
|
30 | def detect_enc_algo(cls, enc_data: bytes): | |
|
31 | parts = enc_data.split(b'$', 3) | |
|
32 | ||
|
33 | if b'enc$aes_hmac$' in enc_data: | |
|
34 | # we expect this data is encrypted, so validate the header | |
|
35 | if len(parts) != 3: | |
|
36 | raise ValueError(f'Encrypted Data has invalid format, expected {cls.key_format}, got `{parts}`') | |
|
37 | return 'aes' | |
|
38 | elif b'enc2$salt' in enc_data: | |
|
39 | # we expect this data is encrypted, so validate the header | |
|
40 | if len(parts) != 3: | |
|
41 | raise ValueError(f'Encrypted Data has invalid format, expected {cls.key_format}, got `{parts}`') | |
|
42 | return 'fernet' | |
|
43 | return None | |
|
44 | ||
|
28 | 45 | def __init__(self, enc_key: bytes): |
|
29 | 46 | self.enc_key = enc_key |
|
30 | 47 | |
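The new `detect_enc_algo` classmethod sniffs which scheme produced a ciphertext from its header prefix, letting `decrypt_value` pick the right algorithm without explicit configuration. A standalone copy of that logic (simplified, module-level function instead of a classmethod) behaves like this:

```python
def detect_enc_algo(enc_data: bytes):
    """Return 'aes', 'fernet', or None based on the ciphertext header."""
    parts = enc_data.split(b'$', 3)

    if b'enc$aes_hmac$' in enc_data:
        # we expect this data is encrypted, so validate the header
        if len(parts) != 3:
            raise ValueError(f'Encrypted Data has invalid format, got `{parts}`')
        return 'aes'
    elif b'enc2$salt' in enc_data:
        if len(parts) != 3:
            raise ValueError(f'Encrypted Data has invalid format, got `{parts}`')
        return 'fernet'
    return None
```

A `None` result means the value carries no recognized header, and the caller falls back to the configured default algorithm.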
@@ -52,7 +69,7 b' class Encryptor(object):' | |||
|
52 | 69 | def _get_parts(self, enc_data): |
|
53 | 70 | parts = enc_data.split(b'$', 3) |
|
54 | 71 | if len(parts) != 3: |
|
55 | raise ValueError(f'Encrypted Data has invalid format, expected {self.key_format}, got {parts}') | |
|
72 | raise ValueError(f'Encrypted Data has invalid format, expected {self.key_format}, got `{parts}`') | |
|
56 | 73 | prefix, salt, enc_data = parts |
|
57 | 74 | |
|
58 | 75 | try: |
@@ -144,6 +144,10 b' class NotAllowedToCreateUserError(Except' | |||
|
144 | 144 | pass |
|
145 | 145 | |
|
146 | 146 | |
|
147 | class DuplicateUpdateUserError(Exception): | |
|
148 | pass | |
|
149 | ||
|
150 | ||
|
147 | 151 | class RepositoryCreationError(Exception): |
|
148 | 152 | pass |
|
149 | 153 |
@@ -74,6 +74,7 b' from webhelpers2.html.tags import (' | |||
|
74 | 74 | |
|
75 | 75 | from webhelpers2.number import format_byte_size |
|
76 | 76 | # python3.11 backport fixes for webhelpers2 |
|
77 | from rhodecode import ConfigGet | |
|
77 | 78 | from rhodecode.lib._vendor.webhelpers_backports import raw_select |
|
78 | 79 | |
|
79 | 80 | from rhodecode.lib.action_parser import action_parser |
@@ -916,9 +917,7 b' def get_repo_type_by_name(repo_name):' | |||
|
916 | 917 | |
|
917 | 918 | def is_svn_without_proxy(repository): |
|
918 | 919 | if is_svn(repository): |
|
919 | from rhodecode.model.settings import VcsSettingsModel | |
|
920 | conf = VcsSettingsModel().get_ui_settings_as_config_obj() | |
|
921 | return not str2bool(conf.get('vcs_svn_proxy', 'http_requests_enabled')) | |
|
920 | return not ConfigGet().get_bool('vcs.svn.proxy.enabled') | |
|
922 | 921 | return False |
|
923 | 922 | |
|
924 | 923 | |
@@ -2197,3 +2196,35 b' class IssuesRegistry(object):' | |||
|
2197 | 2196 | @property |
|
2198 | 2197 | def issues_unique_count(self): |
|
2199 | 2198 | return len(set(i['id'] for i in self.issues)) |
|
2199 | ||
|
2200 | ||
|
2201 | def get_directory_statistics(start_path): | |
|
2202 | """ | |
|
2203 | total_files, total_size, directory_stats = get_directory_statistics(start_path) | |
|
2204 | ||
|
2205 | print(f"Directory statistics for: {start_path}\n") | |
|
2206 | print(f"Total files: {total_files}") | |
|
2207 | print(f"Total size: {format_size(total_size)}\n") | |
|
2208 | ||
|
2209 | :param start_path: root directory to scan | 
|
2210 | :return: tuple of (total_files, total_size, directory_stats) | 
|
2211 | """ | |
|
2212 | ||
|
2213 | total_files = 0 | |
|
2214 | total_size = 0 | |
|
2215 | directory_stats = {} | |
|
2216 | ||
|
2217 | for dir_path, dir_names, file_names in os.walk(start_path): | |
|
2218 | dir_size = 0 | |
|
2219 | file_count = len(file_names) | |
|
2220 | ||
|
2221 | for fname in file_names: | |
|
2222 | filepath = os.path.join(dir_path, fname) | |
|
2223 | file_size = os.path.getsize(filepath) | |
|
2224 | dir_size += file_size | |
|
2225 | ||
|
2226 | directory_stats[dir_path] = {'file_count': file_count, 'size': dir_size} | |
|
2227 | total_files += file_count | |
|
2228 | total_size += dir_size | |
|
2229 | ||
|
2230 | return total_files, total_size, directory_stats |
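The helper added above can be exercised end-to-end. Below is a self-contained copy with a small temp-directory demo (same semantics as the diff: per-directory file count and byte size, plus grand totals):

```python
import os
import tempfile


def get_directory_statistics(start_path):
    """Walk start_path; return (total_files, total_size, per-directory stats)."""
    total_files = 0
    total_size = 0
    directory_stats = {}

    for dir_path, dir_names, file_names in os.walk(start_path):
        dir_size = 0
        file_count = len(file_names)
        for fname in file_names:
            dir_size += os.path.getsize(os.path.join(dir_path, fname))
        directory_stats[dir_path] = {'file_count': file_count, 'size': dir_size}
        total_files += file_count
        total_size += dir_size

    return total_files, total_size, directory_stats


with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, 'a.txt'), 'wb') as f:
        f.write(b'123')
    sub = os.path.join(root, 'sub')
    os.makedirs(sub)
    with open(os.path.join(sub, 'b.txt'), 'wb') as f:
        f.write(b'12345')
    files, size, stats = get_directory_statistics(root)
```

Note that each directory's size counts only its own files, not those of its subdirectories; only the grand totals aggregate across the whole tree.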
@@ -17,7 +17,6 b'' | |||
|
17 | 17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
18 | 18 | |
|
19 | 19 | import webob |
|
20 | from pyramid.threadlocal import get_current_request | |
|
21 | 20 | |
|
22 | 21 | from rhodecode import events |
|
23 | 22 | from rhodecode.lib import hooks_base |
@@ -33,6 +32,7 b' def _supports_repo_type(repo_type):' | |||
|
33 | 32 | def _get_vcs_operation_context(username, repo_name, repo_type, action): |
|
34 | 33 | # NOTE(dan): import loop |
|
35 | 34 | from rhodecode.lib.base import vcs_operation_context |
|
35 | from rhodecode.lib.pyramid_utils import get_current_request | |
|
36 | 36 | |
|
37 | 37 | check_locking = action in ('pull', 'push') |
|
38 | 38 |
@@ -141,7 +141,7 b' class ColorFormatter(ExceptionAwareForma' | |||
|
141 | 141 | """ |
|
142 | 142 | Changes record's levelname to use with COLORS enum |
|
143 | 143 | """ |
|
144 | def_record = super(ColorFormatter, self).format(record) | 
|
144 | def_record = super().format(record) | 
|
145 | 145 | |
|
146 | 146 | levelname = record.levelname |
|
147 | 147 | start = COLOR_SEQ % (COLORS[levelname]) |
@@ -17,7 +17,8 b'' | |||
|
17 | 17 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
18 | 18 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | 19 | |
|
20 | import base64 | 
|
20 | import re | 
|
21 | import os | |
|
21 | 22 | import logging |
|
22 | 23 | import urllib.request |
|
23 | 24 | import urllib.parse |
@@ -27,15 +28,11 b' import urllib.parse' | |||
|
27 | 28 | import requests |
|
28 | 29 | from pyramid.httpexceptions import HTTPNotAcceptable |
|
29 | 30 | |
|
30 |
from rhodecode |
|
|
31 | from rhodecode import ConfigGet | |
|
31 | 32 | from rhodecode.lib.middleware import simplevcs |
|
32 | 33 | from rhodecode.lib.middleware.utils import get_path_info |
|
33 | 34 | from rhodecode.lib.utils import is_valid_repo |
|
34 | from rhodecode.lib.str_utils import safe_str, safe_bytes | 
|
|
35 | from rhodecode.lib.type_utils import str2bool | |
|
36 | from rhodecode.lib.ext_json import json | |
|
37 | from rhodecode.lib.hooks_daemon import store_txn_id_data | |
|
38 | ||
|
35 | from rhodecode.lib.str_utils import safe_str | |
|
39 | 36 | |
|
40 | 37 | log = logging.getLogger(__name__) |
|
41 | 38 | |
@@ -54,37 +51,20 b' class SimpleSvnApp(object):' | |||
|
54 | 51 | request_headers = self._get_request_headers(environ) |
|
55 | 52 | data_io = environ['wsgi.input'] |
|
56 | 53 | req_method: str = environ['REQUEST_METHOD'] |
|
57 | has_content_length = 'CONTENT_LENGTH' in environ | |
|
54 | has_content_length: bool = 'CONTENT_LENGTH' in environ | |
|
58 | 55 | |
|
59 | 56 | path_info = self._get_url( |
|
60 | 57 | self.config.get('subversion_http_server_url', ''), get_path_info(environ)) |
|
61 | 58 | transfer_encoding = environ.get('HTTP_TRANSFER_ENCODING', '') |
|
62 | log.debug('Handling: %s method via `%s`', req_method, path_info) | |
|
59 | log.debug('Handling: %s method via `%s` has_content_length:%s', req_method, path_info, has_content_length) | |
|
63 | 60 | |
|
64 | 61 | # stream control flag, based on request and content type... |
|
65 | 62 | stream = False |
|
66 | ||
|
67 | 63 | if req_method in ['MKCOL'] or has_content_length: |
|
68 | data_processed = False | |
|
69 | # read chunk to check if we have txn-with-props | |
|
70 | initial_data: bytes = data_io.read(1024) | |
|
71 | if initial_data.startswith(b'(create-txn-with-props'): | |
|
72 | data_io = initial_data + data_io.read() | |
|
73 | # store on-the-fly our rc_extra using svn revision properties | |
|
74 | # those can be read later on in hooks executed so we have a way | |
|
75 | # to pass in the data into svn hooks | |
|
76 | rc_data = base64.urlsafe_b64encode(json.dumps(self.rc_extras)) | |
|
77 | rc_data_len = str(len(rc_data)) | |
|
78 | # header defines data length, and serialized data | |
|
79 | skel = b' rc-scm-extras %b %b' % (safe_bytes(rc_data_len), safe_bytes(rc_data)) | |
|
80 | data_io = data_io[:-2] + skel + b'))' | |
|
81 | data_processed = True | |
|
82 | ||
|
83 | if not data_processed: | |
|
84 | 64 |
|
|
85 | 65 |
|
|
86 | 66 |
|
|
87 |
|
|
|
67 | data_io = data_io.read() | |
|
88 | 68 | |
|
89 | 69 | if req_method in ['GET', 'PUT'] or transfer_encoding == 'chunked': |
|
90 | 70 | # NOTE(marcink): when getting/uploading files, we want to STREAM content |
@@ -101,6 +81,7 b' class SimpleSvnApp(object):' | |||
|
101 | 81 | stream=stream |
|
102 | 82 | ) |
|
103 | 83 | if req_method in ['HEAD', 'DELETE']: |
|
84 | # NOTE(marcink): HEAD might be deprecated for SVN 1.14+ protocol | |
|
104 | 85 | del call_kwargs['data'] |
|
105 | 86 | |
|
106 | 87 | try: |
@@ -120,14 +101,6 b' class SimpleSvnApp(object):' | |||
|
120 | 101 | log.debug('got response code: %s', response.status_code) |
|
121 | 102 | |
|
122 | 103 | response_headers = self._get_response_headers(response.headers) |
|
123 | ||
|
124 | if response.headers.get('SVN-Txn-name'): | |
|
125 | svn_tx_id = response.headers.get('SVN-Txn-name') | |
|
126 | txn_id = rc_cache.utils.compute_key_from_params( | |
|
127 | self.config['repository'], svn_tx_id) | |
|
128 | port = safe_int(self.rc_extras['hooks_uri'].split(':')[-1]) | |
|
129 | store_txn_id_data(txn_id, {'port': port}) | |
|
130 | ||
|
131 | 104 | start_response(f'{response.status_code} {response.reason}', response_headers) |
|
132 | 105 | return response.iter_content(chunk_size=1024) |
|
133 | 106 | |
@@ -137,6 +110,20 b' class SimpleSvnApp(object):' | |||
|
137 | 110 | url_path = urllib.parse.quote(url_path, safe="/:=~+!$,;'") |
|
138 | 111 | return url_path |
|
139 | 112 | |
|
113 | def _get_txn_id(self, environ): | |
|
114 | url = environ['RAW_URI'] | |
|
115 | ||
|
116 | # Define the regex pattern | |
|
117 | pattern = r'/txr/([^/]+)/' | |
|
118 | ||
|
119 | # Search for the pattern in the URL | |
|
120 | match = re.search(pattern, url) | |
|
121 | ||
|
122 | # Check if a match is found and extract the captured group | |
|
123 | if match: | |
|
124 | txn_id = match.group(1) | |
|
125 | return txn_id | |
|
126 | ||
|
140 | 127 | def _get_request_headers(self, environ): |
|
141 | 128 | headers = {} |
|
142 | 129 | whitelist = { |
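The new `_get_txn_id` pulls the SVN transaction id out of a `txr` URL with a regex. The extraction logic can be exercised standalone like this (a module-level copy; the sample URLs are illustrative):

```python
import re


def extract_svn_txn_id(url: str):
    """Pull the transaction id out of an SVN HTTP 'txr' URL, if present."""
    # the txn id sits between '/txr/' and the next '/'
    match = re.search(r'/txr/([^/]+)/', url)
    if match:
        return match.group(1)
    return None
```

As in the diff's method, a URL without a `/txr/<id>/` segment yields `None` (the original returns implicitly when no match is found).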
@@ -182,10 +169,39 b' class DisabledSimpleSvnApp(object):' | |||
|
182 | 169 | |
|
183 | 170 | |
|
184 | 171 | class SimpleSvn(simplevcs.SimpleVCS): |
|
172 | """ | |
|
173 | details: https://svn.apache.org/repos/asf/subversion/trunk/notes/http-and-webdav/webdav-protocol | |
|
174 | ||
|
175 | Read Commands : (OPTIONS, PROPFIND, GET, REPORT) | |
|
176 | ||
|
177 | GET: fetch info about resources | |
|
178 | PROPFIND: Used to retrieve properties of resources. | |
|
179 | REPORT: Used for specialized queries to the repository. E.g History etc... | |
|
180 | OPTIONS: when sent to an SVN server, the server responds with information about the available HTTP | |
|
181 | methods and other server capabilities. | |
|
182 | ||
|
183 | Write Commands : (MKACTIVITY, PROPPATCH, PUT, CHECKOUT, MKCOL, MOVE, | |
|
184 | -------------- COPY, DELETE, LOCK, UNLOCK, MERGE) | |
|
185 | ||
|
186 | With the exception of LOCK/UNLOCK, every write command performs some | |
|
187 | sort of DeltaV commit operation. In DeltaV, a commit always starts | |
|
188 | by creating a transaction (MKACTIVITY), applies a log message | |
|
189 | (PROPPATCH), does some other write methods, and then ends by | |
|
190 | committing the transaction (MERGE). If the MERGE fails, the client | |
|
191 | may try to remove the transaction with a DELETE. | |
|
192 | ||
|
193 | PROPPATCH: Used to set and/or remove properties on resources. | |
|
194 | MKCOL: Creates a new collection (directory). | |
|
195 | DELETE: Removes a resource. | |
|
196 | COPY and MOVE: Used for copying and moving resources. | |
|
197 | MERGE: Used to merge changes from different branches. | |
|
198 | CHECKOUT, CHECKIN, UNCHECKOUT: DeltaV methods for managing working resources and versions. | |
|
199 | """ | |
|
185 | 200 | |
|
186 | 201 | SCM = 'svn' |
|
187 | 202 | READ_ONLY_COMMANDS = ('OPTIONS', 'PROPFIND', 'GET', 'REPORT') |
|
188 | DEFAULT_HTTP_SERVER = 'http://localhost:8090' | |
|
203 | WRITE_COMMANDS = ('MERGE', 'POST', 'PUT', 'COPY', 'MOVE', 'DELETE', 'MKCOL') | |
|
204 | DEFAULT_HTTP_SERVER = 'http://svn:8090' | |
|
189 | 205 | |
|
190 | 206 | def _get_repository_name(self, environ): |
|
191 | 207 | """ |
@@ -218,10 +234,10 b' class SimpleSvn(simplevcs.SimpleVCS):' | |||
|
218 | 234 | else 'push') |
|
219 | 235 | |
|
220 | 236 | def _should_use_callback_daemon(self, extras, environ, action): |
|
221 | # only MERGE command triggers hooks, so we don't want to start | |
|
237 | # only PUT & MERGE command triggers hooks, so we don't want to start | |
|
222 | 238 | # hooks server too many times. POST however starts the svn transaction |
|
223 | 239 | # so we also need to run the init of callback daemon of POST |
|
224 | if environ['REQUEST_METHOD'] in | |
|
240 | if environ['REQUEST_METHOD'] not in self.READ_ONLY_COMMANDS: | |
|
225 | 241 | return True |
|
226 | 242 | return False |
|
227 | 243 | |
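The rewritten check treats every non-read command as a potential write. A standalone sketch of the same predicate, using the `READ_ONLY_COMMANDS` tuple from this hunk:

```python
READ_ONLY_COMMANDS = ('OPTIONS', 'PROPFIND', 'GET', 'REPORT')

def should_use_callback_daemon(request_method: str) -> bool:
    # anything outside the read-only set may write, so it needs the hooks daemon
    return request_method not in READ_ONLY_COMMANDS

assert should_use_callback_daemon('MERGE') is True
assert should_use_callback_daemon('PUT') is True
assert should_use_callback_daemon('GET') is False
```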
@@ -232,12 +248,10 b' class SimpleSvn(simplevcs.SimpleVCS):' | |||
|
232 | 248 | return DisabledSimpleSvnApp(config) |
|
233 | 249 | |
|
234 | 250 | def _is_svn_enabled(self): |
|
235 | conf = self.repo_vcs_config | |
|
236 | return str2bool(conf.get('vcs_svn_proxy', 'http_requests_enabled')) | |
|
251 | return ConfigGet().get_bool('vcs.svn.proxy.enabled') | |
|
237 | 252 | |
|
238 | 253 | def _create_config(self, extras, repo_name, scheme='http'): |
|
239 | conf = self.repo_vcs_config | |
|
240 | server_url = conf.get('vcs_svn_proxy', 'http_server_url') | |
|
254 | server_url = ConfigGet().get_str('vcs.svn.proxy.host') | |
|
241 | 255 | server_url = server_url or self.DEFAULT_HTTP_SERVER |
|
242 | 256 | |
|
243 | 257 | extras['subversion_http_server_url'] = server_url |
@@ -25,11 +25,9 b" It's implemented with basic auth functio" | |||
|
25 | 25 | |
|
26 | 26 | import os |
|
27 | 27 | import re |
|
28 | import io | |
|
29 | 28 | import logging |
|
30 | 29 | import importlib |
|
31 | 30 | from functools import wraps |
|
32 | from lxml import etree | |
|
33 | 31 | |
|
34 | 32 | import time |
|
35 | 33 | from paste.httpheaders import REMOTE_USER, AUTH_TYPE |
@@ -41,14 +39,15 b' from zope.cachedescriptors.property impo' | |||
|
41 | 39 | import rhodecode |
|
42 | 40 | from rhodecode.authentication.base import authenticate, VCS_TYPE, loadplugin |
|
43 | 41 | from rhodecode.lib import rc_cache |
|
42 | from rhodecode.lib.svn_txn_utils import store_txn_id_data | |
|
44 | 43 | from rhodecode.lib.auth import AuthUser, HasPermissionAnyMiddleware |
|
45 | 44 | from rhodecode.lib.base import ( |
|
46 | 45 | BasicAuth, get_ip_addr, get_user_agent, vcs_operation_context) |
|
47 | 46 | from rhodecode.lib.exceptions import (UserCreationError, NotAllowedToCreateUserError) |
|
48 | from rhodecode.lib.hook | |
|
47 | from rhodecode.lib.hook_daemon.base import prepare_callback_daemon | |
|
49 | 48 | from rhodecode.lib.middleware import appenlight |
|
50 | 49 | from rhodecode.lib.middleware.utils import scm_app_http |
|
51 | from rhodecode.lib.str_utils import safe_bytes | |
|
50 | from rhodecode.lib.str_utils import safe_bytes, safe_int | |
|
52 | 51 | from rhodecode.lib.utils import is_valid_repo, SLUG_RE |
|
53 | 52 | from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool |
|
54 | 53 | from rhodecode.lib.vcs.conf import settings as vcs_settings |
@@ -63,29 +62,6 b' from rhodecode.model.settings import Set' | |||
|
63 | 62 | log = logging.getLogger(__name__) |
|
64 | 63 | |
|
65 | 64 | |
|
66 | def extract_svn_txn_id(acl_repo_name, data: bytes): | |
|
67 | """ | |
|
68 | Helper method for extraction of svn txn_id from submitted XML data during | |
|
69 | POST operations | |
|
70 | """ | |
|
71 | ||
|
72 | try: | |
|
73 | root = etree.fromstring(data) | |
|
74 | pat = re.compile(r'/txn/(?P<txn_id>.*)') | |
|
75 | for el in root: | |
|
76 | if el.tag == '{DAV:}source': | |
|
77 | for sub_el in el: | |
|
78 | if sub_el.tag == '{DAV:}href': | |
|
79 | match = pat.search(sub_el.text) | |
|
80 | if match: | |
|
81 | svn_tx_id = match.groupdict()['txn_id'] | |
|
82 | txn_id = rc_cache.utils.compute_key_from_params( | |
|
83 | acl_repo_name, svn_tx_id) | |
|
84 | return txn_id | |
|
85 | except Exception: | |
|
86 | log.exception('Failed to extract txn_id') | |
|
87 | ||
|
88 | ||
|
89 | 65 | def initialize_generator(factory): |
|
90 | 66 | """ |
|
91 | 67 | Initializes the returned generator by draining its first element. |
@@ -156,17 +132,10 b' class SimpleVCS(object):' | |||
|
156 | 132 | |
|
157 | 133 | @property |
|
158 | 134 | def base_path(self): |
|
159 | settings_path = self. | |
|
160 | ||
|
161 | if not settings_path: | |
|
162 | settings_path = self.global_vcs_config.get(*VcsSettingsModel.PATH_SETTING) | |
|
135 | settings_path = self.config.get('repo_store.path') | |
|
163 | 136 | |
|
164 | 137 | if not settings_path: |
|
165 | # try, maybe we passed in explicitly as config option | |
|
166 | settings_path = self.config.get('base_path') | |
|
167 | ||
|
168 | if not settings_path: | |
|
169 | raise ValueError('FATAL: base_path is empty') | |
|
138 | raise ValueError('FATAL: repo_store.path is empty') | |
|
170 | 139 | return settings_path |
|
171 | 140 | |
|
172 | 141 | def set_repo_names(self, environ): |
@@ -475,7 +444,6 b' class SimpleVCS(object):' | |||
|
475 | 444 | log.debug('Not enough credentials to access repo: `%s` ' |
|
476 | 445 | 'repository as anonymous user', self.acl_repo_name) |
|
477 | 446 | |
|
478 | ||
|
479 | 447 | username = None |
|
480 | 448 | # ============================================================== |
|
481 | 449 | # DEFAULT PERM FAILED OR ANONYMOUS ACCESS IS DISABLED SO WE |
@@ -589,6 +557,24 b' class SimpleVCS(object):' | |||
|
589 | 557 | return self._generate_vcs_response( |
|
590 | 558 | environ, start_response, repo_path, extras, action) |
|
591 | 559 | |
|
560 | def _get_txn_id(self, environ): | |
|
561 | ||
|
562 | for k in ['RAW_URI', 'HTTP_DESTINATION']: | |
|
563 | url = environ.get(k) | |
|
564 | if not url: | |
|
565 | continue | |
|
566 | ||
|
567 | # regex to search for svn-txn-id | |
|
568 | pattern = r'/!svn/txr/([^/]+)/' | |
|
569 | ||
|
570 | # Search for the pattern in the URL | |
|
571 | match = re.search(pattern, url) | |
|
572 | ||
|
573 | # Check if a match is found and extract the captured group | |
|
574 | if match: | |
|
575 | txn_id = match.group(1) | |
|
576 | return txn_id | |
|
577 | ||
|
592 | 578 | @initialize_generator |
|
593 | 579 | def _generate_vcs_response( |
|
594 | 580 | self, environ, start_response, repo_path, extras, action): |
@@ -600,28 +586,23 b' class SimpleVCS(object):' | |||
|
600 | 586 | also handles the locking exceptions which will be triggered when |
|
601 | 587 | the first chunk is produced by the underlying WSGI application. |
|
602 | 588 | """ |
|
603 | ||
|
604 |
|
605 | if 'CONTENT_LENGTH' in environ and environ['REQUEST_METHOD'] == 'MERGE': | |
|
606 | # case for SVN, we want to re-use the callback daemon port | |
|
607 | # so we use the txn_id, for this we peek the body, and still save | |
|
608 | # it as wsgi.input | |
|
609 | ||
|
610 | stream = environ['wsgi.input'] | |
|
611 | ||
|
612 | if isinstance(stream, io.BytesIO): | |
|
613 | data: bytes = stream.getvalue() | |
|
614 | elif hasattr(stream, 'buf'): # most likely gunicorn.http.body.Body | |
|
615 | data: bytes = stream.buf.getvalue() | |
|
616 | else: | |
|
617 | # fallback to the crudest way, copy the iterator | |
|
618 | data = safe_bytes(stream.read()) | |
|
619 | environ['wsgi.input'] = io.BytesIO(data) | |
|
620 | ||
|
621 | txn_id = extract_svn_txn_id(self.acl_repo_name, data) | |
|
589 | svn_txn_id = '' | |
|
590 | if action == 'push': | |
|
591 | svn_txn_id = self._get_txn_id(environ) | |
|
622 | 592 | |
|
623 | 593 | callback_daemon, extras = self._prepare_callback_daemon( |
|
624 | extras, environ, action, txn_id=txn_id) | |
|
594 | extras, environ, action, txn_id=svn_txn_id) | |
|
595 | ||
|
596 | if svn_txn_id: | |
|
597 | ||
|
598 | port = safe_int(extras['hooks_uri'].split(':')[-1]) | |
|
599 | txn_id_data = extras.copy() | |
|
600 | txn_id_data.update({'port': port}) | |
|
601 | txn_id_data.update({'req_method': environ['REQUEST_METHOD']}) | |
|
602 | ||
|
603 | full_repo_path = repo_path | |
|
604 | store_txn_id_data(full_repo_path, svn_txn_id, txn_id_data) | |
|
605 | ||
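The `txn_id_data` record above bundles the callback daemon port with the request context. A sketch of that assembly (the helper name is mine; `hooks_uri` is assumed to be a `host:port` string as in the hunk):

```python
def build_txn_id_data(extras: dict, request_method: str) -> dict:
    # the callback daemon listens on host:port; keep only the port number
    port = int(extras['hooks_uri'].split(':')[-1])
    data = extras.copy()
    data.update({'port': port, 'req_method': request_method})
    return data

data = build_txn_id_data({'hooks_uri': '127.0.0.1:9988'}, 'MERGE')
assert data['port'] == 9988
assert data['req_method'] == 'MERGE'
```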
|
625 | 606 | log.debug('HOOKS extras is %s', extras) |
|
626 | 607 | |
|
627 | 608 | http_scheme = self._get_http_scheme(environ) |
@@ -684,6 +665,7 b' class SimpleVCS(object):' | |||
|
684 | 665 | |
|
685 | 666 | def _prepare_callback_daemon(self, extras, environ, action, txn_id=None): |
|
686 | 667 | protocol = vcs_settings.HOOKS_PROTOCOL |
|
668 | ||
|
687 | 669 | if not self._should_use_callback_daemon(extras, environ, action): |
|
688 | 670 | # disable callback daemon for actions that don't require it |
|
689 | 671 | protocol = 'local' |
@@ -26,6 +26,7 b' import urllib.parse' | |||
|
26 | 26 | from webob.exc import HTTPNotFound |
|
27 | 27 | |
|
28 | 28 | import rhodecode |
|
29 | from rhodecode.apps._base import ADMIN_PREFIX | |
|
29 | 30 | from rhodecode.lib.middleware.utils import get_path_info |
|
30 | 31 | from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled |
|
31 | 32 | from rhodecode.lib.middleware.simplegit import SimpleGit, GIT_PROTO_PAT |
@@ -164,14 +165,18 b' def detect_vcs_request(environ, backends' | |||
|
164 | 165 | # login |
|
165 | 166 | "_admin/login", |
|
166 | 167 | |
|
168 | # 2fa | |
|
169 | f"{ADMIN_PREFIX}/check_2fa", | |
|
170 | f"{ADMIN_PREFIX}/setup_2fa", | |
|
171 | ||
|
167 | 172 | # _admin/api is safe too |
|
168 | ' | |
|
173 | f'{ADMIN_PREFIX}/api', | |
|
169 | 174 | |
|
170 | 175 | # _admin/gist is safe too |
|
171 | ' | |
|
176 | f'{ADMIN_PREFIX}/gists++', | |
|
172 | 177 | |
|
173 | 178 | # _admin/my_account is safe too |
|
174 | ' | |
|
179 | f'{ADMIN_PREFIX}/my_account++', | |
|
175 | 180 | |
|
176 | 181 | # static files no detection |
|
177 | 182 | '_static++', |
@@ -180,11 +185,11 b' def detect_vcs_request(environ, backends' | |||
|
180 | 185 | '_debug_toolbar++', |
|
181 | 186 | |
|
182 | 187 | # skip ops ping, status |
|
183 | ' | |
|
184 | ' | |
|
188 | f'{ADMIN_PREFIX}/ops/ping', | |
|
189 | f'{ADMIN_PREFIX}/ops/status', | |
|
185 | 190 | |
|
186 | 191 | # full channelstream connect should be VCS skipped |
|
187 | ' | |
|
192 | f'{ADMIN_PREFIX}/channelstream/connect', | |
|
188 | 193 | |
|
189 | 194 | '++/repo_creating_check' |
|
190 | 195 | ] |
@@ -41,6 +41,12 b' or reset some user/system settings.' | |||
|
41 | 41 | """ |
|
42 | 42 | |
|
43 | 43 | |
|
44 | def import_all_from_module(module_name): | |
|
45 | import importlib | |
|
46 | module = importlib.import_module(module_name) | |
|
47 | globals().update({k: v for k, v in module.__dict__.items() if not k.startswith('_')}) | |
|
48 | ||
|
49 | ||
|
44 | 50 | def ipython_shell_runner(env, help): |
|
45 | 51 | |
|
46 | 52 | # imports, used in ipython shell |
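The new `import_all_from_module` helper updates `globals()` to emulate a star import inside the shell runner. A testable variant that writes into an explicit namespace dict instead of `globals()` (that change is mine, for isolation):

```python
import importlib

def import_all_from_module(module_name: str, namespace: dict) -> None:
    # copy the module's public attributes into the target namespace,
    # emulating `from module_name import *`
    module = importlib.import_module(module_name)
    namespace.update(
        {k: v for k, v in module.__dict__.items() if not k.startswith('_')})

ns = {}
import_all_from_module('math', ns)
assert 'pi' in ns and 'sqrt' in ns
assert not any(k.startswith('_') for k in ns)
```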
@@ -50,7 +56,7 b' def ipython_shell_runner(env, help):' | |||
|
50 | 56 | import shutil |
|
51 | 57 | import datetime |
|
52 | 58 | from rhodecode.model import user, user_group, repo, repo_group |
|
53 |
|
59 | import_all_from_module('rhodecode.model.db') | |
|
54 | 60 | |
|
55 | 61 | try: |
|
56 | 62 | import IPython |
@@ -19,40 +19,35 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | import os |
|
22 | import configparser | |
|
22 | ||
|
23 | 23 | from pyramid.paster import bootstrap as pyramid_bootstrap, setup_logging # pragma: no cover |
|
24 | ||
|
25 | from rhodecode.lib.request import Request | |
|
26 | ||
|
27 | ||
|
28 | def get_config(ini_path, **kwargs): | |
|
29 | parser = configparser.ConfigParser(**kwargs) | |
|
30 | parser.read(ini_path) | |
|
31 | return parser | |
|
32 | ||
|
33 | ||
|
34 | def get_app_config(ini_path): | |
|
35 | from paste.deploy.loadwsgi import appconfig | |
|
36 | return appconfig(f'config:{ini_path}', relative_to=os.getcwd()) | |
|
24 | from pyramid.threadlocal import get_current_request as pyramid_current_request | |
|
37 | 25 | |
|
38 | 26 | |
|
39 | 27 | def bootstrap(config_uri, options=None, env=None): |
|
28 | from rhodecode.config.utils import DEFAULT_USER | |
|
29 | from rhodecode.lib.config_utils import get_app_config_lightweight | |
|
40 | 30 | from rhodecode.lib.utils2 import AttributeDict |
|
31 | from rhodecode.lib.request import Request | |
|
41 | 32 | |
|
42 | 33 | if env: |
|
43 | 34 | os.environ.update(env) |
|
44 | 35 | |
|
45 | config = get_config(config_uri) | |
|
46 | base_url = 'http://rhodecode.local' | |
|
47 | try: | |
|
48 | base_url = config.get('app:main', 'app.base_url') | |
|
49 | except (configparser.NoSectionError, configparser.NoOptionError): | |
|
50 | pass | |
|
36 | config = get_app_config_lightweight(config_uri) | |
|
37 | base_url = config['app.base_url'] | |
|
51 | 38 | |
|
52 | 39 | request = Request.blank('/', base_url=base_url) |
|
53 | 40 | # fake inject a running user for bootstrap request ! |
|
54 | request.user = AttributeDict({'username': | |
|
41 | request.user = AttributeDict({'username': DEFAULT_USER, | |
|
55 | 42 | 'user_id': 1, |
|
56 | 43 | 'ip_addr': '127.0.0.1'}) |
|
57 | 44 | return pyramid_bootstrap(config_uri, request=request, options=options) |
|
58 | 45 | |
|
46 | ||
|
47 | def get_current_request(): | |
|
48 | pyramid_req = pyramid_current_request() | |
|
49 | if not pyramid_req: | |
|
50 | # maybe we're in celery context and need to get the PYRAMID_REQUEST | |
|
51 | from rhodecode.lib.celerylib.loader import celery_app | |
|
52 | pyramid_req = celery_app.conf['PYRAMID_REQUEST'] | |
|
53 | return pyramid_req |
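The removed `get_config()` was a thin `ConfigParser` wrapper around the `.ini` file; the replacement `get_app_config_lightweight` reads `app.base_url` as a flat key. A sketch of the old approach, reading from a string for testability (the section/key names mirror the removed code):

```python
import configparser

def get_config_from_string(ini_text: str, **kwargs) -> configparser.ConfigParser:
    # same idea as the removed get_config(), but parsing from a string
    parser = configparser.ConfigParser(**kwargs)
    parser.read_string(ini_text)
    return parser

cfg = get_config_from_string(
    "[app:main]\napp.base_url = http://rhodecode.local\n")
assert cfg.get('app:main', 'app.base_url') == 'http://rhodecode.local'
```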
@@ -20,7 +20,8 b' import logging' | |||
|
20 | 20 | import click |
|
21 | 21 | import pyramid.paster |
|
22 | 22 | |
|
23 | from rhodecode.lib.pyramid_utils import bootstrap | |
|
23 | from rhodecode.lib.pyramid_utils import bootstrap | |
|
24 | from rhodecode.lib.config_utils import get_app_config | |
|
24 | 25 | from rhodecode.lib.db_manage import DbManage |
|
25 | 26 | from rhodecode.lib.utils2 import get_encryption_key |
|
26 | 27 | from rhodecode.model.db import Session |
@@ -25,6 +25,9 b' class StatsdClientClass(Singleton):' | |||
|
25 | 25 | statsd_client = None |
|
26 | 26 | statsd = None |
|
27 | 27 | |
|
28 | def __repr__(self): | |
|
29 | return f"{self.__class__}(statsd={self.statsd})" | |
|
30 | ||
|
28 | 31 | def __getattribute__(self, name): |
|
29 | 32 | |
|
30 | 33 | if name.startswith("statsd"): |
@@ -167,3 +167,17 b' def convert_special_chars(str_) -> str:' | |||
|
167 | 167 | value = safe_str(str_) |
|
168 | 168 | converted_value = unidecode(value) |
|
169 | 169 | return converted_value |
|
170 | ||
|
171 | ||
|
172 | def splitnewlines(text: bytes): | |
|
173 | """ | |
|
174 | like splitlines, but only split on newlines. | |
|
175 | """ | |
|
176 | ||
|
177 | lines = [_l + b'\n' for _l in text.split(b'\n')] | |
|
178 | if lines: | |
|
179 | if lines[-1] == b'\n': | |
|
180 | lines.pop() | |
|
181 | else: | |
|
182 | lines[-1] = lines[-1][:-1] | |
|
183 | return lines |
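The new `splitnewlines` helper is self-contained; a copy with a few behavioural checks showing why the annotation fix needed it for files with no trailing newline:

```python
def splitnewlines(text: bytes) -> list:
    # like bytes.splitlines, but split only on b'\n' and keep it attached
    lines = [line + b'\n' for line in text.split(b'\n')]
    if lines:
        if lines[-1] == b'\n':
            lines.pop()                 # input ended with a newline
        else:
            lines[-1] = lines[-1][:-1]  # last line had no newline: undo the pad
    return lines

assert splitnewlines(b'a\nb\n') == [b'a\n', b'b\n']
assert splitnewlines(b'a\nb') == [b'a\n', b'b']
assert splitnewlines(b'') == []
```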
@@ -100,7 +100,7 b' def get_cert_path(ini_path):' | |||
|
100 | 100 | default = '/etc/ssl/certs/ca-certificates.crt' |
|
101 | 101 | control_ca_bundle = os.path.join( |
|
102 | 102 | os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(ini_path)))), |
|
103 | '.rccontrol-profile/etc/ca-bundle.crt') | |
|
103 | '/etc/ssl/certs/ca-certificates.crt') | |
|
104 | 104 | if os.path.isfile(control_ca_bundle): |
|
105 | 105 | default = control_ca_bundle |
|
106 | 106 | |
@@ -323,7 +323,7 b' def cpu():' | |||
|
323 | 323 | value['cpu_count'] = psutil.cpu_count() |
|
324 | 324 | |
|
325 | 325 | human_value = value.copy() |
|
326 | human_value['text'] = '{ | |
|
326 | human_value['text'] = f'{value["cpu_count"]} cores at {value["cpu"]} %' | |
|
327 | 327 | |
|
328 | 328 | return SysInfoRes(value=value, state=state, human_value=human_value) |
|
329 | 329 | |
@@ -331,8 +331,8 b' def cpu():' | |||
|
331 | 331 | @register_sysinfo |
|
332 | 332 | def storage(): |
|
333 | 333 | from rhodecode.lib.helpers import format_byte_size_binary |
|
334 | from rhodecode. | |
|
335 | path = VcsSettingsModel().get_repos_location() | |
|
334 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
335 | path = get_rhodecode_repo_store_path() | |
|
336 | 336 | |
|
337 | 337 | value = dict(percent=0, used=0, total=0, path=path, text='') |
|
338 | 338 | state = STATE_OK_DEFAULT |
@@ -364,8 +364,8 b' def storage():' | |||
|
364 | 364 | |
|
365 | 365 | @register_sysinfo |
|
366 | 366 | def storage_inodes(): |
|
367 | from rhodecode. | |
|
368 | path = VcsSettingsModel().get_repos_location() | |
|
367 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
368 | path = get_rhodecode_repo_store_path() | |
|
369 | 369 | |
|
370 | 370 | value = dict(percent=0.0, free=0, used=0, total=0, path=path, text='') |
|
371 | 371 | state = STATE_OK_DEFAULT |
@@ -398,32 +398,24 b' def storage_inodes():' | |||
|
398 | 398 | @register_sysinfo |
|
399 | 399 | def storage_archives(): |
|
400 | 400 | import rhodecode |
|
401 | from rhodecode.lib.utils import safe_str | |
|
402 | 401 | from rhodecode.lib.helpers import format_byte_size_binary |
|
402 | from rhodecode.lib.archive_cache import get_archival_cache_store | |
|
403 | 403 | |
|
404 | msg = 'Archive cache storage is controlled by ' \ | |
|
405 | 'archive_cache.store_dir=/path/to/cache option in the .ini file' | |
|
406 | path = safe_str(rhodecode.CONFIG.get('archive_cache.store_dir', msg)) | |
|
404 | storage_type = rhodecode.ConfigGet().get_str('archive_cache.backend.type') | |
|
407 | 405 | |
|
408 | value = dict(percent=0, used=0, total=0, items=0, path= | |
|
406 | value = dict(percent=0, used=0, total=0, items=0, path='', text='', type=storage_type) | |
|
409 | 407 | state = STATE_OK_DEFAULT |
|
410 | 408 | try: |
|
411 | items_count = 0 | |
|
412 | used = 0 | |
|
413 | for root, dirs, files in os.walk(path): | |
|
414 | if root == path: | |
|
415 | items_count = len(dirs) | |
|
409 | d_cache = get_archival_cache_store(config=rhodecode.CONFIG) | |
|
416 | 410 | |
|
417 | for f in files: | |
|
418 | try: | |
|
419 | used += os.path.getsize(os.path.join(root, f)) | |
|
420 | except OSError: | |
|
421 | pass | |
|
411 | total_files, total_size, _directory_stats = d_cache.get_statistics() | |
|
412 | ||
|
422 | 413 | value.update({ |
|
423 | 414 | 'percent': 100, |
|
424 | 'used': | |
|
425 | 'total': | |
|
426 | 'items': | |
|
415 | 'used': total_size, | |
|
416 | 'total': total_size, | |
|
417 | 'items': total_files, | |
|
418 | 'path': d_cache.storage_path | |
|
427 | 419 | }) |
|
428 | 420 | |
|
429 | 421 | except Exception as e: |
@@ -442,33 +434,23 b' def storage_archives():' | |||
|
442 | 434 | @register_sysinfo |
|
443 | 435 | def storage_gist(): |
|
444 | 436 | from rhodecode.model.gist import GIST_STORE_LOC |
|
445 | from rhodecode.model.settings import VcsSettingsModel | |
|
446 | from rhodecode.lib.utils import safe_str | |
|
447 | from rhodecode.lib.helpers import format_byte_size_binary | |
|
437 | from rhodecode.lib.utils import safe_str, get_rhodecode_repo_store_path | |
|
438 | from rhodecode.lib.helpers import format_byte_size_binary, get_directory_statistics | |
|
439 | ||
|
448 | 440 | path = safe_str(os.path.join( |
|
449 |
|
441 | get_rhodecode_repo_store_path(), GIST_STORE_LOC)) | |
|
450 | 442 | |
|
451 | 443 | # gist storage |
|
452 | 444 | value = dict(percent=0, used=0, total=0, items=0, path=path, text='') |
|
453 | 445 | state = STATE_OK_DEFAULT |
|
454 | 446 | |
|
455 | 447 | try: |
|
456 | items_count = 0 | |
|
457 | used = 0 | |
|
458 | for root, dirs, files in os.walk(path): | |
|
459 | if root == path: | |
|
460 | items_count = len(dirs) | |
|
461 | ||
|
462 | for f in files: | |
|
463 | try: | |
|
464 | used += os.path.getsize(os.path.join(root, f)) | |
|
465 | except OSError: | |
|
466 | pass | |
|
448 | total_files, total_size, _directory_stats = get_directory_statistics(path) | |
|
467 | 449 | value.update({ |
|
468 | 450 | 'percent': 100, |
|
469 | 'used': | |
|
470 | 'total': | |
|
471 | 'items': | |
|
451 | 'used': total_size, | |
|
452 | 'total': total_size, | |
|
453 | 'items': total_files | |
|
472 | 454 | }) |
|
473 | 455 | except Exception as e: |
|
474 | 456 | log.exception('failed to fetch gist storage items') |
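Both storage panels replace an inline `os.walk` loop with `get_directory_statistics` from `rhodecode.lib.helpers`. A standalone sketch mirroring the removed loop, with an assumed `(files, size)` return shape (the real helper also returns per-directory stats):

```python
import os
import tempfile
from pathlib import Path

def directory_statistics(path: str):
    # walk the tree, counting files and summing their sizes
    total_files, total_size = 0, 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total_size += os.path.getsize(os.path.join(root, name))
                total_files += 1
            except OSError:
                pass
    return total_files, total_size

with tempfile.TemporaryDirectory() as tmp:
    Path(tmp, 'a.txt').write_bytes(b'abc')
    Path(tmp, 'b.txt').write_bytes(b'defg')
    assert directory_statistics(tmp) == (2, 7)
```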
@@ -21,6 +21,7 b' Utilities library for RhodeCode' | |||
|
21 | 21 | """ |
|
22 | 22 | |
|
23 | 23 | import datetime |
|
24 | ||
|
24 | 25 | import decorator |
|
25 | 26 | import logging |
|
26 | 27 | import os |
@@ -31,7 +32,7 b' import socket' | |||
|
31 | 32 | import tempfile |
|
32 | 33 | import traceback |
|
33 | 34 | import tarfile |
|
34 | import warnings | |
|
35 | ||
|
35 | 36 | from functools import wraps |
|
36 | 37 | from os.path import join as jn |
|
37 | 38 | |
@@ -471,14 +472,14 b' def get_rhodecode_realm():' | |||
|
471 | 472 | return safe_str(realm.app_settings_value) |
|
472 | 473 | |
|
473 | 474 | |
|
474 | def get_rhodecode_ | |
|
475 | def get_rhodecode_repo_store_path(): | |
|
475 | 476 | """ |
|
476 | 477 | Returns the base path. The base path is the filesystem path which points |
|
477 | 478 | to the repository store. |
|
478 | 479 | """ |
|
479 | 480 | |
|
480 | 481 | import rhodecode |
|
481 | return rhodecode.CONFIG[' | |
|
482 | return rhodecode.CONFIG['repo_store.path'] | |
|
482 | 483 | |
|
483 | 484 | |
|
484 | 485 | def map_groups(path): |
@@ -33,7 +33,8 b' from rhodecode.lib.vcs.exceptions import' | |||
|
33 | 33 | |
|
34 | 34 | __all__ = [ |
|
35 | 35 | 'get_vcs_instance', 'get_backend', |
|
36 | 'VCSError', 'RepositoryError', 'CommitError', 'VCSCommunicationError' | |
|
36 | 'VCSError', 'RepositoryError', 'CommitError', 'VCSCommunicationError', | |
|
37 | 'CurlSession', 'CurlResponse' | |
|
37 | 38 | ] |
|
38 | 39 | |
|
39 | 40 | log = logging.getLogger(__name__) |
@@ -135,7 +136,12 b' class CurlSession(object):' | |||
|
135 | 136 | curl.setopt(curl.FOLLOWLOCATION, allow_redirects) |
|
136 | 137 | curl.setopt(curl.WRITEDATA, response_buffer) |
|
137 | 138 | curl.setopt(curl.HTTPHEADER, headers_list) |
|
139 | ||
|
140 | try: | |
|
138 | 141 | curl.perform() |
|
142 | except pycurl.error as exc: | |
|
143 | log.error('Failed to call endpoint url: {} using pycurl'.format(url)) | |
|
144 | raise | |
|
139 | 145 | |
|
140 | 146 | status_code = curl.getinfo(pycurl.HTTP_CODE) |
|
141 | 147 | content_type = curl.getinfo(pycurl.CONTENT_TYPE) |
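The new `try/except` around `curl.perform()` logs the failing URL before re-raising. The pattern, sketched without a pycurl dependency (the wrapper name and callable argument are illustrative):

```python
import logging

log = logging.getLogger(__name__)

def perform_with_logging(perform, url: str):
    # log the failing endpoint before re-raising, as the new except block does
    try:
        return perform()
    except Exception:
        log.error('Failed to call endpoint url: %s using pycurl', url)
        raise

assert perform_with_logging(lambda: 'ok', 'http://vcsserver:9900') == 'ok'
```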
@@ -326,6 +326,9 b' class GitRepository(BaseRepository):' | |||
|
326 | 326 | def _get_branches(self): |
|
327 | 327 | return self._get_refs_entries(prefix='refs/heads/', strip_prefix=True) |
|
328 | 328 | |
|
329 | def delete_branch(self, branch_name): | |
|
330 | return self._remote.delete_branch(branch_name) | |
|
331 | ||
|
329 | 332 | @CachedProperty |
|
330 | 333 | def branches(self): |
|
331 | 334 | return self._get_branches() |
@@ -1037,6 +1040,8 b' class GitRepository(BaseRepository):' | |||
|
1037 | 1040 | pr_branch, self.path, target_ref.name, enable_hooks=True, |
|
1038 | 1041 | rc_scm_data=self.config.get('rhodecode', 'RC_SCM_DATA')) |
|
1039 | 1042 | merge_succeeded = True |
|
1043 | if close_branch and source_ref.name != target_ref.name and not dry_run and source_ref.type == 'branch': | |
|
1044 | self.delete_branch(source_ref.name) | |
|
1040 | 1045 | except RepositoryError: |
|
1041 | 1046 | log.exception( |
|
1042 | 1047 | 'Failure when doing local push from the shadow ' |
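The new branch-deletion guard combines four conditions; extracted as a standalone predicate (dict-shaped refs stand in for the real `Reference` objects):

```python
def should_delete_source_branch(close_branch, source_ref, target_ref, dry_run):
    # delete the source branch only for a real (non-dry-run) close of a
    # branch ref that is not the target branch itself
    return (close_branch
            and source_ref['name'] != target_ref['name']
            and not dry_run
            and source_ref['type'] == 'branch')

feature = {'name': 'feature', 'type': 'branch'}
main = {'name': 'main', 'type': 'branch'}
assert should_delete_source_branch(True, feature, main, dry_run=False) is True
assert should_delete_source_branch(True, feature, main, dry_run=True) is False
assert should_delete_source_branch(True, main, main, dry_run=False) is False
```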
@@ -35,6 +35,7 b' from rhodecode.lib.datelib import (' | |||
|
35 | 35 | from rhodecode.lib.str_utils import safe_str |
|
36 | 36 | from rhodecode.lib.utils2 import CachedProperty |
|
37 | 37 | from rhodecode.lib.vcs import connection, exceptions |
|
38 | from rhodecode.lib.vcs.conf import settings as vcs_settings | |
|
38 | 39 | from rhodecode.lib.vcs.backends.base import ( |
|
39 | 40 | BaseRepository, CollectionGenerator, Config, MergeResponse, |
|
40 | 41 | MergeFailureReason, Reference, BasePathPermissionChecker) |
@@ -722,7 +723,12 b' class MercurialRepository(BaseRepository' | |||
|
722 | 723 | commit needs to be pushed. |
|
723 | 724 | """ |
|
724 | 725 | self._update(source_ref.commit_id) |
|
725 | message = close_message or f"Closing branch: `{source_ref.name}`" | |
|
726 | message = (close_message or vcs_settings.HG_CLOSE_BRANCH_MESSAGE_TMPL).format( | |
|
727 | user_name=user_name, | |
|
728 | user_email=user_email, | |
|
729 | target_ref_name=target_ref.name, | |
|
730 | source_ref_name=source_ref.name | |
|
731 | ) | |
|
726 | 732 | try: |
|
727 | 733 | self._remote.commit( |
|
728 | 734 | message=safe_str(message), |
@@ -58,6 +58,9 b' MERGE_MESSAGE_TMPL = (' | |||
|
58 | 58 | MERGE_DRY_RUN_MESSAGE = 'dry_run_merge_message_from_rhodecode' |
|
59 | 59 | MERGE_DRY_RUN_USER = 'Dry-Run User' |
|
60 | 60 | MERGE_DRY_RUN_EMAIL = 'dry-run-merge@rhodecode.com' |
|
61 | HG_CLOSE_BRANCH_MESSAGE_TMPL = ( | |
|
62 | 'Closing branch: `{source_ref_name}`' | |
|
63 | ) | |
|
61 | 64 | |
|
62 | 65 | |
|
63 | 66 | def available_aliases(): |
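The close message is now built from `HG_CLOSE_BRANCH_MESSAGE_TMPL` via `str.format`. Note that `format` silently ignores keyword arguments the template does not reference, which is why the call site can always pass the full set of fields:

```python
HG_CLOSE_BRANCH_MESSAGE_TMPL = 'Closing branch: `{source_ref_name}`'

# a richer template (e.g. one using user_name) can be swapped in
# without changing this call site
msg = HG_CLOSE_BRANCH_MESSAGE_TMPL.format(
    user_name='dev',
    user_email='dev@example.com',
    target_ref_name='default',
    source_ref_name='feature')
assert msg == 'Closing branch: `feature`'
```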
@@ -146,6 +146,10 b' class CommandError(VCSError):' | |||
|
146 | 146 | pass |
|
147 | 147 | |
|
148 | 148 | |
|
149 | class ImproperlyConfiguredError(Exception): | |
|
150 | pass | |
|
151 | ||
|
152 | ||
|
149 | 153 | class UnhandledException(VCSError): |
|
150 | 154 | """ |
|
151 | 155 | Signals that something unexpected went wrong. |
@@ -21,6 +21,7 b' import logging' | |||
|
21 | 21 | |
|
22 | 22 | import rhodecode |
|
23 | 23 | from rhodecode.model import meta, db |
|
24 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
24 | 25 | from rhodecode.lib.utils2 import obfuscate_url_pw, get_encryption_key |
|
25 | 26 | |
|
26 | 27 | log = logging.getLogger(__name__) |
@@ -138,3 +139,11 b' class BaseModel(object):' | |||
|
138 | 139 | Returns all instances of what is defined in `cls` class variable |
|
139 | 140 | """ |
|
140 | 141 | return cls.cls.getAll() |
|
142 | ||
|
143 | @property | |
|
144 | def repos_path(self): | |
|
145 | """ | |
|
146 | Gets the repositories root path from *ini file | |
|
147 | """ | |
|
148 | ||
|
149 | return get_rhodecode_repo_store_path() |
@@ -33,9 +33,10 b' import functools' | |||
|
33 | 33 | import traceback |
|
34 | 34 | import collections |
|
35 | 35 | |
|
36 | import pyotp | |
|
36 | 37 | from sqlalchemy import ( |
|
37 | 38 | or_, and_, not_, func, cast, TypeDecorator, event, select, |
|
38 | true, false, null, | |
|
39 | true, false, null, union_all, | |
|
39 | 40 | Index, Sequence, UniqueConstraint, ForeignKey, CheckConstraint, Column, |
|
40 | 41 | Boolean, String, Unicode, UnicodeText, DateTime, Integer, LargeBinary, |
|
41 | 42 | Text, Float, PickleType, BigInteger) |
@@ -51,6 +52,7 b' from zope.cachedescriptors.property impo' | |||
|
51 | 52 | from pyramid.threadlocal import get_current_request |
|
52 | 53 | from webhelpers2.text import remove_formatting |
|
53 | 54 | |
|
55 | from rhodecode import ConfigGet | |
|
54 | 56 | from rhodecode.lib.str_utils import safe_bytes |
|
55 | 57 | from rhodecode.translation import _ |
|
56 | 58 | from rhodecode.lib.vcs import get_vcs_instance, VCSError |
@@ -126,6 +128,11 b' def _hash_key(k):' | |||
|
126 | 128 | return sha1_safe(k) |
|
127 | 129 | |
|
128 | 130 | |
|
131 | def description_escaper(desc): | |
|
132 | from rhodecode.lib import helpers as h | |
|
133 | return h.escape(desc) | |
|
134 | ||
|
135 | ||
|
129 | 136 | def in_filter_generator(qry, items, limit=500): |
|
130 | 137 | """ |
|
131 | 138 | Splits IN() into multiple with OR |
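`in_filter_generator` splits one large `IN()` into several OR-ed clauses of at most `limit` items. The slicing at its core, as a sketch (the helper name is mine; the real function yields SQLAlchemy `in_()` expressions):

```python
def chunked(items, limit=500):
    # break a long sequence into limit-sized slices so each SQL IN(...)
    # clause stays below backend parameter limits
    return [items[i:i + limit] for i in range(0, len(items), limit)] or [[]]

chunks = chunked(list(range(1200)), limit=500)
assert [len(c) for c in chunks] == [500, 500, 200]
assert chunked([]) == [[]]
```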
@@ -197,9 +204,7 b' class EncryptedTextValue(TypeDecorator):' | |||
|
197 | 204 | if not value: |
|
198 | 205 | return value |
|
199 | 206 | |
|
200 | enc_strict_mode = rhodecode.ConfigGet().get_bool('rhodecode.encrypted_values.strict', missing=True) | |
|
201 | ||
|
202 | bytes_val = enc_utils.decrypt_value(value, enc_key=ENCRYPTION_KEY, strict_mode=enc_strict_mode) | |
|
207 | bytes_val = enc_utils.decrypt_value(value, enc_key=ENCRYPTION_KEY) | |
|
203 | 208 | |
|
204 | 209 | return safe_str(bytes_val) |
|
205 | 210 | |
@@ -586,6 +591,7 b' class User(Base, BaseModel):' | |||
|
586 | 591 | DEFAULT_USER = 'default' |
|
587 | 592 | DEFAULT_USER_EMAIL = 'anonymous@rhodecode.org' |
|
588 | 593 | DEFAULT_GRAVATAR_URL = 'https://secure.gravatar.com/avatar/{md5email}?d=identicon&s={size}' |
|
594 | RECOVERY_CODES_COUNT = 10 | |
|
589 | 595 | |
|
590 | 596 | user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) |
|
591 | 597 | username = Column("username", String(255), nullable=True, unique=None, default=None) |
@@ -662,16 +668,14 b' class User(Base, BaseModel):' | |||
|
662 | 668 | |
|
663 | 669 | @hybrid_property |
|
664 | 670 | def first_name(self): |
|
665 | from rhodecode.lib import helpers as h | |
|
666 | 671 | if self.name: |
|
667 | return | |
|
672 | return description_escaper(self.name) | |
|
668 | 673 | return self.name |
|
669 | 674 | |
|
670 | 675 | @hybrid_property |
|
671 | 676 | def last_name(self): |
|
672 | from rhodecode.lib import helpers as h | |
|
673 | 677 | if self.lastname: |
|
674 | return | |
|
678 | return description_escaper(self.lastname) | |
|
675 | 679 | return self.lastname |
|
676 | 680 | |
|
677 | 681 | @hybrid_property |
@@ -793,16 +797,148 b' class User(Base, BaseModel):' | |||
|
793 | 797 | Session.commit() |
|
794 | 798 | return artifact_token.api_key |
|
795 | 799 | |
|
796 | @classmethod | |
|
797 | def get(cls, user_id, cache=False): | |
|
798 | if not user_id: | |
|
799 | return | |
|
800 | ||
|
801 | user = cls.query() | |
|
802 | if cache: | |
|
803 | user = user.options( | |
|
804 | FromCache("sql_cache_short", f"get_users_{user_id}")) | |
|
805 | return user.get(user_id) | |
|
800 | def is_totp_valid(self, received_code, secret): | |
|
801 | totp = pyotp.TOTP(secret) | |
|
802 | return totp.verify(received_code) | |
|
803 | ||
|
804 | def is_2fa_recovery_code_valid(self, received_code, secret): | |
|
805 | encrypted_recovery_codes = self.user_data.get('recovery_codes_2fa', []) | |
|
806 | recovery_codes = self.get_2fa_recovery_codes() | |
|
807 | if received_code in recovery_codes: | |
|
808 | encrypted_recovery_codes.pop(recovery_codes.index(received_code)) | |
|
809 | self.update_userdata(recovery_codes_2fa=encrypted_recovery_codes) | |
|
810 | return True | |
|
811 | return False | |
|
812 | ||
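`is_totp_valid` delegates to `pyotp.TOTP(secret).verify(code)`. What that does under the hood is RFC 6238 TOTP; a stdlib-only sketch (the base32 secret is a random example, not a real key):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp_now(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    # RFC 6238: HMAC-SHA1 over the current time step, dynamically truncated
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval
    digest = hmac.new(key, struct.pack('>Q', counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack('>I', digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

code = totp_now('JBSWY3DPEHPK3PXP')
assert len(code) == 6 and code.isdigit()
```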
|
813 | @hybrid_property | |
|
814 | def has_forced_2fa(self): | |
|
815 | """ | |
|
816 | Checks if 2fa was forced for current user | |
|
817 | """ | |
|
818 | from rhodecode.model.settings import SettingsModel | |
|
819 | if value := SettingsModel().get_setting_by_name(f'auth_{self.extern_type}_global_2fa'): | |
|
820 | return value.app_settings_value | |
|
821 | return False | |
|
822 | ||
|
823 | @hybrid_property | |
|
824 | def has_enabled_2fa(self): | |
|
825 | """ | |
|
826 | Checks if user enabled 2fa | |
|
827 | """ | |
|
828 | if value := self.has_forced_2fa: | |
|
829 | return value | |
|
830 | return self.user_data.get('enabled_2fa', False) | |
|
831 | ||
|
832 | @has_enabled_2fa.setter | |
|
833 | def has_enabled_2fa(self, val): | |
|
834 | val = str2bool(val) | |
|
835 | self.update_userdata(enabled_2fa=val) | |
|
836 | if not val: | |
|
837 | # NOTE: setting to false we clear the user_data to not store any 2fa artifacts | |
|
838 | self.update_userdata(secret_2fa=None, recovery_codes_2fa=[], check_2fa=False) | |
|
839 | Session().commit() | |
|
840 | ||
|
841 | @hybrid_property | |
|
842 | def check_2fa_required(self): | |
|
843 | """ | |
|
844 | Check if check 2fa flag is set for this user | |
|
845 | """ | |
|
846 | value = self.user_data.get('check_2fa', False) | |
|
847 | return value | |
|
848 | ||
|
849 | @check_2fa_required.setter | |
|
850 | def check_2fa_required(self, val): | |
|
851 | val = str2bool(val) | |
|
852 | self.update_userdata(check_2fa=val) | |
|
853 | Session().commit() | |
|
854 | ||
|
855 | @hybrid_property | |
|
856 | def has_seen_2fa_codes(self): | |
|
857 | """ | |
|
858 | get the flag about if user has seen 2fa recovery codes | |
|
859 | """ | |
|
860 | value = self.user_data.get('recovery_codes_2fa_seen', False) | |
|
861 | return value | |
|
862 | ||
|
863 | @has_seen_2fa_codes.setter | |
|
864 | def has_seen_2fa_codes(self, val): | |
|
865 | val = str2bool(val) | |
|
866 | self.update_userdata(recovery_codes_2fa_seen=val) | |
|
867 | Session().commit() | |
|
868 | ||
|
869 | @hybrid_property | |
|
870 | def needs_2fa_configure(self): | |
|
871 | """ | |
|
872 | Determines if setup2fa has completed for this user. Means he has all needed data for 2fa to work. | |
|
873 | ||
|
874 | Currently this is 2fa enabled and secret exists | |
|
875 | """ | |
|
876 | if self.has_enabled_2fa: | |
|
877 | return not self.user_data.get('secret_2fa') | |
|
878 | return False | |
|
879 | ||
|
880 | def init_2fa_recovery_codes(self, persist=True, force=False): | |
|
881 | """ | |
|
882 | Creates 2fa recovery codes | |
|
883 | """ | |
|
884 | recovery_codes = self.user_data.get('recovery_codes_2fa', []) | |
|
885 | encrypted_codes = [] | |
|
886 | if not recovery_codes or force: | |
|
887 | for _ in range(self.RECOVERY_CODES_COUNT): | |
|
888 | recovery_code = pyotp.random_base32() | |
|
889 | recovery_codes.append(recovery_code) | |
|
890 | encrypted_code = enc_utils.encrypt_value(safe_bytes(recovery_code), enc_key=ENCRYPTION_KEY) | |
|
891 | encrypted_codes.append(safe_str(encrypted_code)) | |
|
892 | if persist: | |
|
893 | self.update_userdata(recovery_codes_2fa=encrypted_codes, recovery_codes_2fa_seen=False) | |
|
894 | return recovery_codes | |
|
895 | # User should not check the same recovery codes more than once | |
|
896 | return [] | |
|
897 | ||
|
898 | def get_2fa_recovery_codes(self): | |
|
899 | encrypted_recovery_codes = self.user_data.get('recovery_codes_2fa', []) | |
|
900 | ||
|
901 | recovery_codes = list(map( | |
|
902 | lambda val: safe_str( | |
|
903 | enc_utils.decrypt_value( | |
|
904 | val, | |
|
905 | enc_key=ENCRYPTION_KEY | |
|
906 | )), | |
|
907 | encrypted_recovery_codes)) | |
|
908 | return recovery_codes | |
|
909 | ||
|
910 | def init_secret_2fa(self, persist=True, force=False): | |
|
911 | secret_2fa = self.user_data.get('secret_2fa') | |
|
912 | if not secret_2fa or force: | |
|
913 | secret = pyotp.random_base32() | |
|
914 | if persist: | |
|
915 | self.update_userdata(secret_2fa=safe_str(enc_utils.encrypt_value(safe_bytes(secret), enc_key=ENCRYPTION_KEY))) | |
|
916 | return secret | |
|
917 | return '' | |
|
918 | ||
|
919 | @hybrid_property | |
|
920 | def secret_2fa(self) -> str: | |
|
921 | """ | |
|
922 | get stored secret for 2fa | |
|
923 | """ | |
|
924 | secret_2fa = self.user_data.get('secret_2fa') | |
|
925 | if secret_2fa: | |
|
926 | return safe_str( | |
|
927 | enc_utils.decrypt_value(secret_2fa, enc_key=ENCRYPTION_KEY)) | |
|
928 | return '' | |
|
929 | ||
|
930 | @secret_2fa.setter | |
|
931 | def secret_2fa(self, value: str) -> None: | |
|
932 | encrypted_value = enc_utils.encrypt_value(safe_bytes(value), enc_key=ENCRYPTION_KEY) | |
|
933 | self.update_userdata(secret_2fa=safe_str(encrypted_value)) | |
|
934 | ||
|
935 | def regenerate_2fa_recovery_codes(self): | |
|
936 | """ | |
|
937 | Regenerates 2fa recovery codes upon request | |
|
938 | """ | |
|
939 | new_recovery_codes = self.init_2fa_recovery_codes(force=True) | |
|
940 | Session().commit() | |
|
941 | return new_recovery_codes | |
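`is_totp_valid` delegates the check to `pyotp.TOTP(secret).verify(received_code)`. For reference, the value pyotp verifies is the RFC 6238 TOTP construction (HMAC-SHA1 over a 30-second counter with dynamic truncation), which can be sketched with only the standard library; this is an illustration of what the library computes, not the code used above:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890" at T=59 -> "94287082"
rfc_secret = base64.b32encode(b"12345678901234567890").decode()
assert totp(rfc_secret, at=59, digits=8) == "94287082"
```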
|
806 | 942 | |
|
807 | 943 | @classmethod |
|
808 | 944 | def extra_valid_auth_tokens(cls, user, role=None): |
@@ -930,13 +1066,24 b' class User(Base, BaseModel):' | |||
|
930 | 1066 | @user_data.setter |
|
931 | 1067 | def user_data(self, val): |
|
932 | 1068 | if not isinstance(val, dict): |
|
933 | raise Exception('user_data must be dict, got %s' % type(val)) | |

1069 | raise Exception(f'user_data must be dict, got {type(val)}') | |
|
934 | 1070 | try: |
|
935 | 1071 | self._user_data = safe_bytes(json.dumps(val)) |
|
936 | 1072 | except Exception: |
|
937 | 1073 | log.error(traceback.format_exc()) |
|
938 | 1074 | |
|
939 | 1075 | @classmethod |
|
1076 | def get(cls, user_id, cache=False): | |
|
1077 | if not user_id: | |
|
1078 | return | |
|
1079 | ||
|
1080 | user = cls.query() | |
|
1081 | if cache: | |
|
1082 | user = user.options( | |
|
1083 | FromCache("sql_cache_short", f"get_users_{user_id}")) | |
|
1084 | return user.get(user_id) | |
|
1085 | ||
|
1086 | @classmethod | |
|
940 | 1087 | def get_by_username(cls, username, case_insensitive=False, |
|
941 | 1088 | cache=False): |
|
942 | 1089 | |
@@ -954,6 +1101,12 b' class User(Base, BaseModel):' | |||
|
954 | 1101 | return cls.execute(q).scalar_one_or_none() |
|
955 | 1102 | |
|
956 | 1103 | @classmethod |
|
1104 | def get_by_username_or_primary_email(cls, user_identifier): | |
|
1105 | qs = union_all(cls.select().where(func.lower(cls.username) == func.lower(user_identifier)), | |
|
1106 | cls.select().where(func.lower(cls.email) == func.lower(user_identifier))) | |
|
1107 | return cls.execute(cls.select(User).from_statement(qs)).scalar_one_or_none() | |
|
1108 | ||
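The new `get_by_username_or_primary_email` unions two case-insensitive lookups, one on `username` and one on `email`, and returns at most one user. The same query shape in plain SQL (sqlite is used here purely for illustration, not RhodeCode's actual backend):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, username TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Alice', 'alice@example.com')")

# One identifier, checked against both columns, case-insensitively
query = """
    SELECT user_id FROM users WHERE lower(username) = lower(?)
    UNION ALL
    SELECT user_id FROM users WHERE lower(email) = lower(?)
"""

def lookup(identifier):
    row = conn.execute(query, (identifier, identifier)).fetchone()
    return row[0] if row else None

assert lookup("ALICE") == 1               # username match, case-insensitive
assert lookup("Alice@Example.COM") == 1   # primary email match
assert lookup("nobody") is None
```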
|
1109 | @classmethod | |
|
957 | 1110 | def get_by_auth_token(cls, auth_token, cache=False): |
|
958 | 1111 | |
|
959 | 1112 | q = cls.select(User)\ |
@@ -1218,8 +1371,7 b' class UserApiKeys(Base, BaseModel):' | |||
|
1218 | 1371 | |
|
1219 | 1372 | @hybrid_property |
|
1220 | 1373 | def description_safe(self): |
|
1221 | from rhodecode.lib import helpers as h | |
|
1222 | return h.escape(self.description) | |
|
1374 | return description_escaper(self.description) | |
|
1223 | 1375 | |
|
1224 | 1376 | @property |
|
1225 | 1377 | def expired(self): |
@@ -1322,8 +1474,7 b' class UserIpMap(Base, BaseModel):' | |||
|
1322 | 1474 | |
|
1323 | 1475 | @hybrid_property |
|
1324 | 1476 | def description_safe(self): |
|
1325 | from rhodecode.lib import helpers as h | |
|
1326 | return h.escape(self.description) | |
|
1477 | return description_escaper(self.description) | |
|
1327 | 1478 | |
|
1328 | 1479 | @classmethod |
|
1329 | 1480 | def _get_ip_range(cls, ip_addr): |
@@ -1461,8 +1612,7 b' class UserGroup(Base, BaseModel):' | |||
|
1461 | 1612 | |
|
1462 | 1613 | @hybrid_property |
|
1463 | 1614 | def description_safe(self): |
|
1464 | from rhodecode.lib import helpers as h | |
|
1465 | return h.escape(self.user_group_description) | |
|
1615 | return description_escaper(self.user_group_description) | |
|
1466 | 1616 | |
|
1467 | 1617 | @hybrid_property |
|
1468 | 1618 | def group_data(self): |
@@ -1514,7 +1664,7 b' class UserGroup(Base, BaseModel):' | |||
|
1514 | 1664 | user_group = cls.query() |
|
1515 | 1665 | if cache: |
|
1516 | 1666 | user_group = user_group.options( |
|
1517 | FromCache("sql_cache_short", "get_users_group_%s" % user_group_id)) | |

1667 | FromCache("sql_cache_short", f"get_users_group_{user_group_id}")) | |
|
1518 | 1668 | return user_group.get(user_group_id) |
|
1519 | 1669 | |
|
1520 | 1670 | def permissions(self, with_admins=True, with_owner=True, |
@@ -1806,8 +1956,7 b' class Repository(Base, BaseModel):' | |||
|
1806 | 1956 | |
|
1807 | 1957 | @hybrid_property |
|
1808 | 1958 | def description_safe(self): |
|
1809 | from rhodecode.lib import helpers as h | |
|
1810 | return h.escape(self.description) | |
|
1959 | return description_escaper(self.description) | |
|
1811 | 1960 | |
|
1812 | 1961 | @hybrid_property |
|
1813 | 1962 | def landing_rev(self): |
@@ -1908,7 +2057,7 b' class Repository(Base, BaseModel):' | |||
|
1908 | 2057 | if val: |
|
1909 | 2058 | return val |
|
1910 | 2059 | else: |
|
1911 | cache_key = "get_repo_by_name_%s" % _hash_key(repo_name) | |

2060 | cache_key = f"get_repo_by_name_{_hash_key(repo_name)}" | |
|
1912 | 2061 | q = q.options( |
|
1913 | 2062 | FromCache("sql_cache_short", cache_key)) |
|
1914 | 2063 | |
@@ -1942,8 +2091,8 b' class Repository(Base, BaseModel):' | |||
|
1942 | 2091 | |
|
1943 | 2092 | :param cls: |
|
1944 | 2093 | """ |
|
1945 | from rhodecode.lib.utils import get_rhodecode_base_path | |

1946 | return get_rhodecode_base_path() | |

2094 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
2095 | return get_rhodecode_repo_store_path() | |
|
1947 | 2096 | |
|
1948 | 2097 | @classmethod |
|
1949 | 2098 | def get_all_repos(cls, user_id=Optional(None), group_id=Optional(None), |
@@ -2009,16 +2158,13 b' class Repository(Base, BaseModel):' | |||
|
2009 | 2158 | def groups_and_repo(self): |
|
2010 | 2159 | return self.groups_with_parents, self |
|
2011 | 2160 | |
|
2012 | @LazyProperty | |

2161 | @property | |
|
2013 | 2162 | def repo_path(self): |
|
2014 | 2163 | """ |
|
2015 | 2164 | Returns base full path for that repository means where it actually |
|
2016 | 2165 | exists on a filesystem |
|
2017 | 2166 | """ |
|
2018 | q = Session().query(RhodeCodeUi).filter( | |
|
2019 | RhodeCodeUi.ui_key == self.NAME_SEP) | |
|
2020 | q = q.options(FromCache("sql_cache_short", "repository_repo_path")) | |
|
2021 | return q.one().ui_value | |
|
2167 | return self.base_path() | |
|
2022 | 2168 | |
|
2023 | 2169 | @property |
|
2024 | 2170 | def repo_full_path(self): |
@@ -2768,8 +2914,7 b' class RepoGroup(Base, BaseModel):' | |||
|
2768 | 2914 | |
|
2769 | 2915 | @hybrid_property |
|
2770 | 2916 | def description_safe(self): |
|
2771 | from rhodecode.lib import helpers as h | |
|
2772 | return h.escape(self.group_description) | |
|
2917 | return description_escaper(self.group_description) | |
|
2773 | 2918 | |
|
2774 | 2919 | @classmethod |
|
2775 | 2920 | def hash_repo_group_name(cls, repo_group_name): |
@@ -4271,8 +4416,7 b' class _PullRequestBase(BaseModel):' | |||
|
4271 | 4416 | |
|
4272 | 4417 | @hybrid_property |
|
4273 | 4418 | def description_safe(self): |
|
4274 | from rhodecode.lib import helpers as h | |
|
4275 | return h.escape(self.description) | |
|
4419 | return description_escaper(self.description) | |
|
4276 | 4420 | |
|
4277 | 4421 | @hybrid_property |
|
4278 | 4422 | def revisions(self): |
@@ -4438,6 +4582,12 b' class PullRequest(Base, _PullRequestBase' | |||
|
4438 | 4582 | else: |
|
4439 | 4583 | return f'<DB:PullRequest at {id(self)!r}>' |
|
4440 | 4584 | |
|
4585 | def __str__(self): | |
|
4586 | if self.pull_request_id: | |
|
4587 | return f'#{self.pull_request_id}' | |
|
4588 | else: | |
|
4589 | return f'#{id(self)!r}' | |
|
4590 | ||
|
4441 | 4591 | reviewers = relationship('PullRequestReviewers', cascade="all, delete-orphan", back_populates='pull_request') |
|
4442 | 4592 | statuses = relationship('ChangesetStatus', cascade="all, delete-orphan", back_populates='pull_request') |
|
4443 | 4593 | comments = relationship('ChangesetComment', cascade="all, delete-orphan", back_populates='pull_request') |
@@ -4874,8 +5024,7 b' class Gist(Base, BaseModel):' | |||
|
4874 | 5024 | |
|
4875 | 5025 | @hybrid_property |
|
4876 | 5026 | def description_safe(self): |
|
4877 | from rhodecode.lib import helpers as h | |
|
4878 | return h.escape(self.gist_description) | |
|
5027 | return description_escaper(self.gist_description) | |
|
4879 | 5028 | |
|
4880 | 5029 | @classmethod |
|
4881 | 5030 | def get_or_404(cls, id_): |
@@ -4903,10 +5052,9 b' class Gist(Base, BaseModel):' | |||
|
4903 | 5052 | :param cls: |
|
4904 | 5053 | """ |
|
4905 | 5054 | from rhodecode.model.gist import GIST_STORE_LOC |
|
4906 | q = Session().query(RhodeCodeUi)\ | |
|
4907 | .filter(RhodeCodeUi.ui_key == URL_SEP) | |
|
4908 | q = q.options(FromCache("sql_cache_short", "repository_repo_path")) | |
|
4909 | return os.path.join(q.one().ui_value, GIST_STORE_LOC) | |
|
5055 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
5056 | repo_store_path = get_rhodecode_repo_store_path() | |
|
5057 | return os.path.join(repo_store_path, GIST_STORE_LOC) | |
|
4910 | 5058 | |
|
4911 | 5059 | def get_api_data(self): |
|
4912 | 5060 | """ |
@@ -4928,8 +5076,7 b' class Gist(Base, BaseModel):' | |||
|
4928 | 5076 | return data |
|
4929 | 5077 | |
|
4930 | 5078 | def __json__(self): |
|
4931 | data = dict( | |
|
4932 | ) | |
|
5079 | data = dict() | |
|
4933 | 5080 | data.update(self.get_api_data()) |
|
4934 | 5081 | return data |
|
4935 | 5082 | # SCM functions |
@@ -5334,8 +5481,7 b' class ScheduleEntry(Base, BaseModel):' | |||
|
5334 | 5481 | @schedule_type.setter |
|
5335 | 5482 | def schedule_type(self, val): |
|
5336 | 5483 | if val not in self.schedule_types: |
|
5337 | raise ValueError('Value must be on of `{}` and got `{}`'.format( | |

5338 | val, self.schedule_type)) | |
|
5484 | raise ValueError(f'Value must be on of `{val}` and got `{self.schedule_type}`') | |
|
5339 | 5485 | |
|
5340 | 5486 | self._schedule_type = val |
|
5341 | 5487 | |
@@ -5343,21 +5489,25 b' class ScheduleEntry(Base, BaseModel):' | |||
|
5343 | 5489 | def get_uid(cls, obj): |
|
5344 | 5490 | args = obj.task_args |
|
5345 | 5491 | kwargs = obj.task_kwargs |
|
5492 | ||
|
5346 | 5493 | if isinstance(args, JsonRaw): |
|
5347 | 5494 | try: |
|
5348 | args = json.loads(args) | |
|
5495 | args = json.loads(str(args)) | |
|
5349 | 5496 | except ValueError: |
|
5497 | log.exception('json.loads of args failed...') | |
|
5350 | 5498 | args = tuple() |
|
5351 | 5499 | |
|
5352 | 5500 | if isinstance(kwargs, JsonRaw): |
|
5353 | 5501 | try: |
|
5354 | kwargs = json.loads(kwargs) | |
|
5502 | kwargs = json.loads(str(kwargs)) | |
|
5355 | 5503 | except ValueError: |
|
5504 | log.exception('json.loads of kwargs failed...') | |
|
5356 | 5505 | kwargs = dict() |
|
5357 | 5506 | |
|
5358 | 5507 | dot_notation = obj.task_dot_notation |
|
5359 | val = '.'.join(map(safe_str, [ | |
|
5360 | sorted(dot_notation), args, sorted(kwargs.items())])) | |
|
5508 | val = '.'.join(map(safe_str, [dot_notation, args, sorted(kwargs.items())])) | |
|
5509 | log.debug('calculating task uid using id:`%s`', val) | |
|
5510 | ||
|
5361 | 5511 | return sha1(safe_bytes(val)) |
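`get_uid` now builds the identity string from the raw dot-notation, args, and kwargs, and hashes it; sorting `kwargs.items()` keeps the uid independent of keyword order so the same task always maps to the same uid. A stdlib sketch of the idea (function name and inputs are illustrative):

```python
import hashlib

def task_uid(dot_notation, args, kwargs):
    # Sorting kwargs makes the uid stable regardless of keyword argument order
    val = ".".join(map(str, [dot_notation, args, sorted(kwargs.items())]))
    return hashlib.sha1(val.encode("utf-8")).hexdigest()

uid_a = task_uid("app.send_email", (1,), {"to": "a@b.c", "cc": None})
uid_b = task_uid("app.send_email", (1,), {"cc": None, "to": "a@b.c"})
assert uid_a == uid_b  # keyword order does not change the uid
```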
|
5362 | 5512 | |
|
5363 | 5513 | @classmethod |
@@ -5368,6 +5518,10 b' class ScheduleEntry(Base, BaseModel):' | |||
|
5368 | 5518 | def get_by_schedule_id(cls, schedule_id): |
|
5369 | 5519 | return cls.query().filter(cls.schedule_entry_id == schedule_id).scalar() |
|
5370 | 5520 | |
|
5521 | @classmethod | |
|
5522 | def get_by_task_uid(cls, task_uid): | |
|
5523 | return cls.query().filter(cls.task_uid == task_uid).scalar() | |
|
5524 | ||
|
5371 | 5525 | @property |
|
5372 | 5526 | def task(self): |
|
5373 | 5527 | return self.task_dot_notation |
@@ -5549,18 +5703,23 b' class UserBookmark(Base, BaseModel):' | |||
|
5549 | 5703 | |
|
5550 | 5704 | @classmethod |
|
5551 | 5705 | def get_bookmarks_for_user(cls, user_id, cache=True): |
|
5552 | bookmarks = cls.query() \ | |

5553 | .filter(UserBookmark.user_id == user_id) \ | |

5554 | .options(joinedload(UserBookmark.repository)) \ | |

5555 | .options(joinedload(UserBookmark.repository_group)) \ | |
|
5706 | bookmarks = select( | |
|
5707 | UserBookmark.title, | |
|
5708 | UserBookmark.position, | |
|
5709 | ) \ | |
|
5710 | .add_columns(Repository.repo_id, Repository.repo_type, Repository.repo_name) \ | |
|
5711 | .add_columns(RepoGroup.group_id, RepoGroup.group_name) \ | |
|
5712 | .where(UserBookmark.user_id == user_id) \ | |
|
5713 | .outerjoin(Repository, Repository.repo_id == UserBookmark.bookmark_repo_id) \ | |
|
5714 | .outerjoin(RepoGroup, RepoGroup.group_id == UserBookmark.bookmark_repo_group_id) \ | |
|
5556 | 5715 | .order_by(UserBookmark.position.asc()) |
|
5557 | 5716 | |
|
5558 | 5717 | if cache: |
|
5559 | 5718 | bookmarks = bookmarks.options( |
|
5560 | FromCache("sql_cache_short", "get_user_{}_bookmarks".format(user_id)) | |

5719 | FromCache("sql_cache_short", f"get_user_{user_id}_bookmarks") | |
|
5561 | 5720 | ) |
|
5562 | 5721 | |
|
5563 | return bookmarks.all() | |
|
5722 | return Session().execute(bookmarks).all() | |
|
5564 | 5723 | |
|
5565 | 5724 | def __repr__(self): |
|
5566 | 5725 | return f'<UserBookmark({self.position} @ {self.redirect_url!r})>' |
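The rewritten query replaces the ORM `joinedload` with an explicit `select()` that fetches each bookmark together with its (optional) repository or repository group in one pass via two outer joins. The equivalent SQL shape, demonstrated on sqlite for illustration (table and column names are simplified stand-ins):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE repos (repo_id INTEGER PRIMARY KEY, repo_name TEXT);
    CREATE TABLE repo_groups (group_id INTEGER PRIMARY KEY, group_name TEXT);
    CREATE TABLE bookmarks (user_id INTEGER, title TEXT, position INTEGER,
                            bookmark_repo_id INTEGER, bookmark_repo_group_id INTEGER);
    INSERT INTO repos VALUES (10, 'docs');
    INSERT INTO bookmarks VALUES (1, 'my docs', 0, 10, NULL);
    INSERT INTO bookmarks VALUES (1, 'plain link', 1, NULL, NULL);
""")

# Outer joins keep bookmarks that point at neither a repo nor a group
rows = conn.execute("""
    SELECT b.title, b.position, r.repo_name, g.group_name
    FROM bookmarks b
    LEFT JOIN repos r ON r.repo_id = b.bookmark_repo_id
    LEFT JOIN repo_groups g ON g.group_id = b.bookmark_repo_group_id
    WHERE b.user_id = ?
    ORDER BY b.position ASC
""", (1,)).fetchall()

assert rows == [('my docs', 0, 'docs', None), ('plain link', 1, None, None)]
```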
@@ -104,6 +104,31 b' def LoginForm(localizer):' | |||
|
104 | 104 | return _LoginForm |
|
105 | 105 | |
|
106 | 106 | |
|
107 | def TOTPForm(localizer, user, allow_recovery_code_use=False): | |
|
108 | _ = localizer | |
|
109 | ||
|
110 | class _TOTPForm(formencode.Schema): | |
|
111 | allow_extra_fields = True | |
|
112 | filter_extra_fields = False | |
|
113 | totp = v.Regex(r'^(?:\d{6}|[A-Z0-9]{32})$') | |
|
114 | secret_totp = v.String() | |
|
115 | ||
|
116 | def to_python(self, value, state=None): | |
|
117 | validation_checks = [user.is_totp_valid] | |
|
118 | if allow_recovery_code_use: | |
|
119 | validation_checks.append(user.is_2fa_recovery_code_valid) | |
|
120 | form_data = super().to_python(value, state) | |
|
121 | received_code = form_data['totp'] | |
|
122 | secret = form_data.get('secret_totp') | |
|
123 | ||
|
124 | if not any(map(lambda func: func(received_code, secret), validation_checks)): | |
|
125 | error_msg = _('Code is invalid. Try again!') | |
|
126 | raise formencode.Invalid(error_msg, v, state, error_dict={'totp': error_msg}) | |
|
127 | return form_data | |
|
128 | ||
|
129 | return _TOTPForm | |
|
130 | ||
|
131 | ||
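The `totp` field's validator regex accepts either a 6-digit TOTP code or a 32-character base32 string, which matches the length `pyotp.random_base32()` produces for the recovery codes above. Checking the pattern in isolation:

```python
import re

TOTP_OR_RECOVERY = re.compile(r'^(?:\d{6}|[A-Z0-9]{32})$')

assert TOTP_OR_RECOVERY.match('123456')                                # 6-digit TOTP code
assert TOTP_OR_RECOVERY.match('GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ')      # 32-char recovery code
assert not TOTP_OR_RECOVERY.match('12345')                             # too short
assert not TOTP_OR_RECOVERY.match('gezdgnbvgy3tqojqgezdgnbvgy3tqojq')  # lowercase rejected
```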
|
107 | 132 | def UserForm(localizer, edit=False, available_languages=None, old_data=None): |
|
108 | 133 | old_data = old_data or {} |
|
109 | 134 | available_languages = available_languages or [] |
@@ -421,10 +446,6 b' class _BaseVcsSettingsForm(formencode.Sc' | |||
|
421 | 446 | rhodecode_git_use_rebase_for_merging = v.StringBoolean(if_missing=False) |
|
422 | 447 | rhodecode_git_close_branch_before_merging = v.StringBoolean(if_missing=False) |
|
423 | 448 | |
|
424 | # svn | |
|
425 | vcs_svn_proxy_http_requests_enabled = v.StringBoolean(if_missing=False) | |
|
426 | vcs_svn_proxy_http_server_url = v.UnicodeString(strip=True, if_missing=None) | |
|
427 | ||
|
428 | 449 | # cache |
|
429 | 450 | rhodecode_diff_cache = v.StringBoolean(if_missing=False) |
|
430 | 451 | |
@@ -434,10 +455,6 b' def ApplicationUiSettingsForm(localizer)' | |||
|
434 | 455 | |
|
435 | 456 | class _ApplicationUiSettingsForm(_BaseVcsSettingsForm): |
|
436 | 457 | web_push_ssl = v.StringBoolean(if_missing=False) |
|
437 | paths_root_path = All( | |
|
438 | v.ValidPath(localizer), | |
|
439 | v.UnicodeString(strip=True, min=1, not_empty=True) | |
|
440 | ) | |
|
441 | 458 | largefiles_usercache = All( |
|
442 | 459 | v.ValidPath(localizer), |
|
443 | 460 | v.UnicodeString(strip=True, min=2, not_empty=True)) |
@@ -38,7 +38,7 b' from rhodecode.translation import lazy_u' | |||
|
38 | 38 | from rhodecode.lib import helpers as h, hooks_utils, diffs |
|
39 | 39 | from rhodecode.lib import audit_logger |
|
40 | 40 | from collections import OrderedDict |
|
41 | from rhodecode.lib.hooks_daemon import prepare_callback_daemon | |

41 | from rhodecode.lib.hook_daemon.base import prepare_callback_daemon | |
|
42 | 42 | from rhodecode.lib.ext_json import sjson as json |
|
43 | 43 | from rhodecode.lib.markup_renderer import ( |
|
44 | 44 | DEFAULT_COMMENTS_RENDERER, RstTemplateRenderer) |
@@ -83,14 +83,6 b' class RepoModel(BaseModel):' | |||
|
83 | 83 | |
|
84 | 84 | return repo_to_perm |
|
85 | 85 | |
|
86 | @LazyProperty | |
|
87 | def repos_path(self): | |
|
88 | """ | |
|
89 | Gets the repositories root path from database | |
|
90 | """ | |
|
91 | settings_model = VcsSettingsModel(sa=self.sa) | |
|
92 | return settings_model.get_repos_location() | |
|
93 | ||
|
94 | 86 | def get(self, repo_id): |
|
95 | 87 | repo = self.sa.query(Repository) \ |
|
96 | 88 | .filter(Repository.repo_id == repo_id) |
@@ -608,7 +600,7 b' class RepoModel(BaseModel):' | |||
|
608 | 600 | # we need to flush here, in order to check if database won't |
|
609 | 601 | # throw any exceptions, create filesystem dirs at the very end |
|
610 | 602 | self.sa.flush() |
|
611 | events.trigger(events.RepoCreateEvent(new_repo)) | |
|
603 | events.trigger(events.RepoCreateEvent(new_repo, actor=owner)) | |
|
612 | 604 | return new_repo |
|
613 | 605 | |
|
614 | 606 | except Exception: |
@@ -62,15 +62,6 b' class RepoGroupModel(BaseModel):' | |||
|
62 | 62 | def get_repo_group(self, repo_group): |
|
63 | 63 | return self._get_repo_group(repo_group) |
|
64 | 64 | |
|
65 | @LazyProperty | |
|
66 | def repos_path(self): | |
|
67 | """ | |
|
68 | Gets the repositories root path from database | |
|
69 | """ | |
|
70 | ||
|
71 | settings_model = VcsSettingsModel(sa=self.sa) | |
|
72 | return settings_model.get_repos_location() | |
|
73 | ||
|
74 | 65 | def get_by_group_name(self, repo_group_name, cache=None): |
|
75 | 66 | repo = self.sa.query(RepoGroup) \ |
|
76 | 67 | .filter(RepoGroup.group_name == repo_group_name) |
@@ -189,15 +189,6 b' class ScmModel(BaseModel):' | |||
|
189 | 189 | Generic Scm Model |
|
190 | 190 | """ |
|
191 | 191 | |
|
192 | @LazyProperty | |
|
193 | def repos_path(self): | |
|
194 | """ | |
|
195 | Gets the repositories root path from database | |
|
196 | """ | |
|
197 | ||
|
198 | settings_model = VcsSettingsModel(sa=self.sa) | |
|
199 | return settings_model.get_repos_location() | |
|
200 | ||
|
201 | 192 | def repo_scan(self, repos_path=None): |
|
202 | 193 | """ |
|
203 | 194 | Listing of repositories in given path. This path should not be a |
@@ -499,11 +499,6 b' class VcsSettingsModel(object):' | |||
|
499 | 499 | ('vcs_git_lfs', 'store_location') |
|
500 | 500 | ) |
|
501 | 501 | |
|
502 | GLOBAL_SVN_SETTINGS = ( | |
|
503 | ('vcs_svn_proxy', 'http_requests_enabled'), | |
|
504 | ('vcs_svn_proxy', 'http_server_url') | |
|
505 | ) | |
|
506 | ||
|
507 | 502 | SVN_BRANCH_SECTION = 'vcs_svn_branch' |
|
508 | 503 | SVN_TAG_SECTION = 'vcs_svn_tag' |
|
509 | 504 | SSL_SETTING = ('web', 'push_ssl') |
@@ -718,25 +713,10 b' class VcsSettingsModel(object):' | |||
|
718 | 713 | # branch/tags patterns |
|
719 | 714 | self._create_svn_settings(self.global_settings, data) |
|
720 | 715 | |
|
721 | http_requests_enabled, http_server_url = self.GLOBAL_SVN_SETTINGS | |
|
722 | http_requests_enabled_key, http_server_url_key = self._get_settings_keys( | |
|
723 | self.GLOBAL_SVN_SETTINGS, data) | |
|
724 | ||
|
725 | self._create_or_update_ui( | |
|
726 | self.global_settings, *http_requests_enabled, | |
|
727 | value=safe_str(data[http_requests_enabled_key])) | |
|
728 | self._create_or_update_ui( | |
|
729 | self.global_settings, *http_server_url, | |
|
730 | value=data[http_server_url_key]) | |
|
731 | ||
|
732 | 716 | def update_global_ssl_setting(self, value): |
|
733 | 717 | self._create_or_update_ui( |
|
734 | 718 | self.global_settings, *self.SSL_SETTING, value=value) |
|
735 | 719 | |
|
736 | def update_global_path_setting(self, value): | |
|
737 | self._create_or_update_ui( | |
|
738 | self.global_settings, *self.PATH_SETTING, value=value) | |
|
739 | ||
|
740 | 720 | @assert_repo_settings |
|
741 | 721 | def delete_repo_svn_pattern(self, id_): |
|
742 | 722 | ui = self.repo_settings.UiDbModel.get(id_) |
@@ -811,9 +791,6 b' class VcsSettingsModel(object):' | |||
|
811 | 791 | else: |
|
812 | 792 | return self.get_repo_general_settings() |
|
813 | 793 | |
|
814 | def get_repos_location(self): | |
|
815 | return self.global_settings.get_ui_by_key('/').ui_value | |
|
816 | ||
|
817 | 794 | def _filter_ui_settings(self, settings): |
|
818 | 795 | filtered_settings = [ |
|
819 | 796 | s for s in settings if self._should_keep_setting(s)] |
@@ -37,7 +37,7 b' from rhodecode.lib.str_utils import safe' | |||
|
37 | 37 | from rhodecode.lib.exceptions import ( |
|
38 | 38 | DefaultUserException, UserOwnsReposException, UserOwnsRepoGroupsException, |
|
39 | 39 | UserOwnsUserGroupsException, NotAllowedToCreateUserError, |
|
40 | UserOwnsPullRequestsException, UserOwnsArtifactsException) | |
|
40 | UserOwnsPullRequestsException, UserOwnsArtifactsException, DuplicateUpdateUserError) | |
|
41 | 41 | from rhodecode.lib.caching_query import FromCache |
|
42 | 42 | from rhodecode.model import BaseModel |
|
43 | 43 | from rhodecode.model.db import ( |
@@ -114,6 +114,7 b' class UserModel(BaseModel):' | |||
|
114 | 114 | else: |
|
115 | 115 | user = self.sa.query(User)\ |
|
116 | 116 | .filter(User.username == username) |
|
117 | ||
|
117 | 118 | if cache: |
|
118 | 119 | name_key = _hash_key(username) |
|
119 | 120 | user = user.options( |
@@ -308,6 +309,10 b' class UserModel(BaseModel):' | |||
|
308 | 309 | log.debug('Checking for existing account in RhodeCode ' |
|
309 | 310 | 'database with user_id `%s` ', updating_user_id) |
|
310 | 311 | user = User.get(updating_user_id) |
|
312 | # now also validate if USERNAME belongs to potentially other user | |
|
313 | maybe_other_user = User.get_by_username(username, case_insensitive=True) | |
|
314 | if maybe_other_user and maybe_other_user.user_id != updating_user_id: | |
|
315 | raise DuplicateUpdateUserError(f'different user exists with the {username} username') | |
|
311 | 316 | else: |
|
312 | 317 | log.debug('Checking for existing account in RhodeCode ' |
|
313 | 318 | 'database with username `%s` ', username) |
@@ -761,25 +766,29 b' class UserModel(BaseModel):' | |||
|
761 | 766 | 'AuthUser: fill data execution based on: ' |
|
762 | 767 | 'user_id:%s api_key:%s username:%s', user_id, api_key, username) |
|
763 | 768 | try: |
|
769 | found_with = '' | |
|
764 | 770 | dbuser = None |
|
765 | 771 | if user_id: |
|
766 | 772 | dbuser = self.get(user_id) |
|
773 | found_with = 'user_id' | |
|
767 | 774 | elif api_key: |
|
768 | 775 | dbuser = self.get_by_auth_token(api_key) |
|
776 | found_with = 'auth_token' | |
|
769 | 777 | elif username: |
|
770 | 778 | dbuser = self.get_by_username(username) |
|
779 | found_with = 'username' | |
|
771 | 780 | |
|
772 | 781 | if not dbuser: |
|
773 | 782 | log.warning( |
|
774 | 'Unable to lookup user by id:%s api_key:%s username:%s', | |
|
775 | user_id, token_obfuscate(api_key), username) | |
|
783 | 'Unable to lookup user by id:%s api_key:%s username:%s, found with: %s', | |
|
784 | user_id, token_obfuscate(api_key), username, found_with) | |
|
776 | 785 | return False |
|
777 | 786 | if not dbuser.active: |
|
778 | 787 | log.debug('User `%s:%s` is inactive, skipping fill data', |
|
779 | 788 | username, user_id) |
|
780 | 789 | return False |
|
781 | 790 | |
|
782 | log.debug('AuthUser: filling found user:%s data', dbuser) | |
|
791 | log.debug('AuthUser: filling found user:%s data, found with: %s', dbuser, found_with) | |
|
783 | 792 | |
|
784 | 793 | attrs = { |
|
785 | 794 | 'user_id': dbuser.user_id, |
@@ -64,6 +64,7 b' class ChangePasswordSchema(colander.Sche' | |||
|
64 | 64 | |
|
65 | 65 | @colander.deferred |
|
66 | 66 | def deferred_username_validator(node, kw): |
|
67 | old_username = kw.get('username') | |
|
67 | 68 | |
|
68 | 69 | def name_validator(node, value): |
|
69 | 70 | msg = _( |
@@ -74,6 +75,11 b' def deferred_username_validator(node, kw' | |||
|
74 | 75 | if not re.match(r'^[\w]{1}[\w\-\.]{0,254}$', value): |
|
75 | 76 | raise colander.Invalid(node, msg) |
|
76 | 77 | |
|
78 | if value != old_username: | |
|
79 | existing_user = User.get_by_username(value, case_insensitive=True) | |
|
80 | if existing_user: | |
|
81 | raise colander.Invalid(node, 'Username is already taken') | |
|
82 | ||
|
77 | 83 | return name_validator |
|
78 | 84 | |
|
79 | 85 |
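The validator now combines the existing regex (a leading word character followed by up to 254 word characters, dashes, or dots) with a new uniqueness check against other accounts. The regex part can be exercised in isolation:

```python
import re

USERNAME_RE = re.compile(r'^[\w]{1}[\w\-\.]{0,254}$')

assert USERNAME_RE.match('john.doe-1')
assert USERNAME_RE.match('j')                  # single word character is enough
assert not USERNAME_RE.match('-leading-dash')  # first character must be a word character
assert not USERNAME_RE.match('')               # empty names are rejected
```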
@@ -432,7 +432,7 b' def ValidAuth(localizer):' | |||
|
432 | 432 | |
|
433 | 433 | if not authenticate(username, password, '', HTTP_TYPE, |
|
434 | 434 | skip_missing=True): |
|
435 | user = User.get_by_username(username) | |
|
435 | user = User.get_by_username_or_primary_email(username) | |
|
436 | 436 | if user and not user.active: |
|
437 | 437 | log.warning('user %s is disabled', username) |
|
438 | 438 | msg = M(self, 'disabled_account', state) |
@@ -91,10 +91,12 b' function registerRCRoutes() {' | |||
|
91 | 91 | pyroutes.register('atom_feed_home_old', '/%(repo_name)s/feed/atom', ['repo_name']); |
|
92 | 92 | pyroutes.register('auth_home', '/_admin/auth*traverse', []); |
|
93 | 93 | pyroutes.register('bookmarks_home', '/%(repo_name)s/bookmarks', ['repo_name']); |
|
94 | pyroutes.register('branch_remove', '/%(repo_name)s/branches/%(branch_name)s/remove', ['repo_name', 'branch_name']); | |
|
94 | 95 | pyroutes.register('branches_home', '/%(repo_name)s/branches', ['repo_name']); |
|
95 | 96 | pyroutes.register('channelstream_connect', '/_admin/channelstream/connect', []); |
|
96 | 97 | pyroutes.register('channelstream_proxy', '/_channelstream', []); |
|
97 | 98 | pyroutes.register('channelstream_subscribe', '/_admin/channelstream/subscribe', []); |
|
99 | pyroutes.register('check_2fa', '/_admin/check_2fa', []); | |
|
98 | 100 | pyroutes.register('commit_draft_comments_submit', '/%(repo_name)s/changeset/%(commit_id)s/draft_comments_submit', ['repo_name', 'commit_id']); |
|
99 | 101 | pyroutes.register('debug_style_email', '/_admin/debug_style/email/%(email_id)s', ['email_id']); |
|
100 | 102 | pyroutes.register('debug_style_email_plain_rendered', '/_admin/debug_style/email-rendered/%(email_id)s', ['email_id']); |
@@ -214,6 +216,8 b' function registerRCRoutes() {' | |||
|
214 | 216 | pyroutes.register('my_account_auth_tokens_view', '/_admin/my_account/auth_tokens/view', []); |
|
215 | 217 | pyroutes.register('my_account_bookmarks', '/_admin/my_account/bookmarks', []); |
|
216 | 218 | pyroutes.register('my_account_bookmarks_update', '/_admin/my_account/bookmarks/update', []); |
|
219 | pyroutes.register('my_account_configure_2fa', '/_admin/my_account/configure_2fa', []); | |
|
220 | pyroutes.register('my_account_configure_2fa_update', '/_admin/my_account/configure_2fa_update', []); | |
|
217 | 221 | pyroutes.register('my_account_edit', '/_admin/my_account/edit', []); |
|
218 | 222 | pyroutes.register('my_account_emails', '/_admin/my_account/emails', []); |
|
219 | 223 | pyroutes.register('my_account_emails_add', '/_admin/my_account/emails/new', []); |
@@ -230,7 +234,9 b' function registerRCRoutes() {' | |||
|
230 | 234 | pyroutes.register('my_account_profile', '/_admin/my_account/profile', []); |
|
231 | 235 | pyroutes.register('my_account_pullrequests', '/_admin/my_account/pull_requests', []); |
|
232 | 236 | pyroutes.register('my_account_pullrequests_data', '/_admin/my_account/pull_requests/data', []); |
|
237 | pyroutes.register('my_account_regenerate_2fa_recovery_codes', '/_admin/my_account/regenerate_recovery_codes', []); | |
|
233 | 238 | pyroutes.register('my_account_repos', '/_admin/my_account/repos', []); |
|
239 | pyroutes.register('my_account_show_2fa_recovery_codes', '/_admin/my_account/recovery_codes', []); | |
|
234 | 240 | pyroutes.register('my_account_ssh_keys', '/_admin/my_account/ssh_keys', []); |
|
235 | 241 | pyroutes.register('my_account_ssh_keys_add', '/_admin/my_account/ssh_keys/new', []); |
|
236 | 242 | pyroutes.register('my_account_ssh_keys_delete', '/_admin/my_account/ssh_keys/delete', []); |
@@ -243,6 +249,7 b' function registerRCRoutes() {' | |||
|
243 | 249 | pyroutes.register('notifications_show', '/_admin/notifications/%(notification_id)s', ['notification_id']); |
|
244 | 250 | pyroutes.register('notifications_show_all', '/_admin/notifications', []); |
|
245 | 251 | pyroutes.register('notifications_update', '/_admin/notifications/%(notification_id)s/update', ['notification_id']); |
|
252 | pyroutes.register('ops_celery_error_test', '/_admin/ops/error-celery', []); | |
|
246 | 253 | pyroutes.register('ops_error_test', '/_admin/ops/error', []); |
|
247 | 254 | pyroutes.register('ops_healthcheck', '/_admin/ops/status', []); |
|
248 | 255 | pyroutes.register('ops_ping', '/_admin/ops/ping', []); |
@@ -379,6 +386,7 b' function registerRCRoutes() {' | |||
|
379 | 386 | pyroutes.register('search_repo', '/%(repo_name)s/_search', ['repo_name']); |
|
380 | 387 | pyroutes.register('search_repo_alt', '/%(repo_name)s/search', ['repo_name']); |
|
381 | 388 | pyroutes.register('search_repo_group', '/%(repo_group_name)s/_search', ['repo_group_name']); |
|
389 | pyroutes.register('setup_2fa', '/_admin/setup_2fa', []); | |
|
382 | 390 | pyroutes.register('store_user_session_value', '/_store_session_attr', []); |
|
383 | 391 | pyroutes.register('strip_check', '/%(repo_name)s/settings/strip_check', ['repo_name']); |
|
384 | 392 | pyroutes.register('strip_execute', '/%(repo_name)s/settings/strip_execute', ['repo_name']); |
@@ -116,8 +116,9 b' def scan_repositories_if_enabled(event):' | |||
|
116 | 116 | import_on_startup = settings['startup.import_repos'] |
|
117 | 117 | if vcs_server_enabled and import_on_startup: |
|
118 | 118 | from rhodecode.model.scm import ScmModel |
|
119 | from rhodecode.lib.utils import repo2db_mapper | |
|
120 | repositories = ScmModel().repo_scan(get_rhodecode_base_path()) | |
|
119 | from rhodecode.lib.utils import repo2db_mapper | |
|
120 | scm = ScmModel() | |
|
121 | repositories = scm.repo_scan(scm.repos_path) | |
|
121 | 122 | repo2db_mapper(repositories, remove_obsolete=False) |
|
122 | 123 | |
|
123 | 124 |
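The hunk above replaces a scan rooted at `get_rhodecode_base_path()` with `ScmModel().repo_scan(scm.repos_path)` followed by `repo2db_mapper(repositories, remove_obsolete=False)`. As a rough illustration of that scan-then-map flow, here is a minimal self-contained sketch; `repo_scan` and `repo2db_mapper` below are hypothetical stand-ins written for this example, not the RhodeCode implementations:

```python
# Hypothetical stand-ins mirroring the flow in the hunk above:
# scan a storage path for repositories, then sync them into a "db".
import os
import tempfile

def repo_scan(repos_path):
    """Return a dict of repo name -> path for every dir under repos_path."""
    return {
        name: os.path.join(repos_path, name)
        for name in sorted(os.listdir(repos_path))
        if os.path.isdir(os.path.join(repos_path, name))
    }

def repo2db_mapper(repositories, db, remove_obsolete=False):
    """Add newly found repos to `db`; optionally drop vanished ones."""
    for name in repositories:
        db.setdefault(name, {'name': name})
    if remove_obsolete:
        for name in list(db):
            if name not in repositories:
                del db[name]
    return db

base = tempfile.mkdtemp()
os.mkdir(os.path.join(base, 'repo-a'))
db = {'stale-repo': {'name': 'stale-repo'}}
repo2db_mapper(repo_scan(base), db, remove_obsolete=False)
# 'repo-a' is added; 'stale-repo' survives because remove_obsolete=False
```

With `remove_obsolete=False`, as in the startup path above, entries no longer present on disk are deliberately kept in the database.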
@@ -63,7 +63,12 b'' | |||
|
63 | 63 | %elif node.widget == "password": |
|
64 | 64 | ${h.password(node.name, defaults.get(node.name), class_="large")} |
|
65 | 65 | %elif node.widget == "bool": |
|
66 | %if node.name == "global_2fa" and c.rhodecode_edition_id != "EE": | |
|
67 | <input type="checkbox" disabled/> | |
|
68 | <%node.description = _('This feature is available in RhodeCode EE edition only. Contact {sales_email} to obtain a trial license.').format(sales_email='<a href="mailto:sales@rhodecode.com">sales@rhodecode.com</a>')%> | |
|
69 | %else: | |
|
66 | 70 | <div class="checkbox">${h.checkbox(node.name, True, checked=defaults.get(node.name))}</div> |
|
71 | %endif | |
|
67 | 72 | %elif node.widget == "select": |
|
68 | 73 | ${h.select(node.name, defaults.get(node.name), node.validator.choices, class_="select2AuthSetting")} |
|
69 | 74 | %elif node.widget == "select_with_labels": |
@@ -80,7 +85,7 b'' | |||
|
80 | 85 | <span class="error-message">${errors.get(node.name)}</span> |
|
81 | 86 | <br /> |
|
82 | 87 | %endif |
|
83 | <p class="help-block pre-formatting">${node.description}</p> | |
|
88 | <p class="help-block pre-formatting">${node.description | n}</p> | |
|
84 | 89 | </div> |
|
85 | 90 | </div> |
|
86 | 91 | %endfor |
@@ -28,6 +28,7 b'' | |||
|
28 | 28 | <li class="${h.is_active(['profile', 'profile_edit'], c.active)}"><a href="${h.route_path('my_account_profile')}">${_('Profile')}</a></li> |
|
29 | 29 | <li class="${h.is_active('emails', c.active)}"><a href="${h.route_path('my_account_emails')}">${_('Emails')}</a></li> |
|
30 | 30 | <li class="${h.is_active('password', c.active)}"><a href="${h.route_path('my_account_password')}">${_('Password')}</a></li> |
|
31 | <li class="${h.is_active('2fa', c.active)}"><a href="${h.route_path('my_account_configure_2fa')}">${_('2FA')}</a></li> | |
|
31 | 32 | <li class="${h.is_active('bookmarks', c.active)}"><a href="${h.route_path('my_account_bookmarks')}">${_('Bookmarks')}</a></li> |
|
32 | 33 | <li class="${h.is_active('auth_tokens', c.active)}"><a href="${h.route_path('my_account_auth_tokens')}">${_('Auth Tokens')}</a></li> |
|
33 | 34 | <li class="${h.is_active(['ssh_keys', 'ssh_keys_generate'], c.active)}"><a href="${h.route_path('my_account_ssh_keys')}">${_('SSH Keys')}</a></li> |
@@ -83,14 +83,14 b'' | |||
|
83 | 83 | <table class="rctable"> |
|
84 | 84 | ## generate always 10 entries |
|
85 | 85 | <input type="hidden" name="__start__" value="bookmarks:sequence"/> |
|
86 | % for item in (c.bookmark_items + [None for i in range(10)])[:10]: | |
|
86 | % for item in (c.user_bookmark_items + [None for i in range(10)])[:10]: | |
|
87 | 87 | <input type="hidden" name="__start__" value="bookmark:mapping"/> |
|
88 | 88 | % if item is None: |
|
89 | 89 | ## empty placehodlder |
|
90 | 90 | ${form_item()} |
|
91 | 91 | % else: |
|
92 | 92 | ## actual entry |
|
93 | ${form_item(position=item.position, title=item.title, redirect_url=item.redirect_url, repo=item | |
|
93 | ${form_item(position=item[0].position, title=item[0].title, redirect_url=item[0].redirect_url, repo=item[1], repo_group=item[2])} | |
|
94 | 94 | % endif |
|
95 | 95 | <input type="hidden" name="__end__" value="bookmark:mapping"/> |
|
96 | 96 | % endfor |
@@ -84,6 +84,6 b'' | |||
|
84 | 84 | <script> |
|
85 | 85 | $('#check_for_update').click(function(e){ |
|
86 | 86 | $('#update_notice').show(); |
|
87 | $('#update_notice').load("${h.route_path('admin_settings_system_update')}"); | |
|
87 | $('#update_notice').load("${h.route_path('admin_settings_system_update', _query={'ver': request.GET.get('ver')})}"); | |
|
88 | 88 | }) |
|
89 | 89 | </script> |
@@ -1,25 +1,30 b'' | |||
|
1 | 1 | ## upgrade block rendered afte on-click check |
|
2 | 2 | |
|
3 | 3 | <div class="alert ${'alert-warning' if c.should_upgrade else 'alert-success'}"> |
|
4 | <p> | |
|
4 | ||
|
5 | 5 | %if c.should_upgrade: |
|
6 | A <b>new version</b> is available | |
|
6 | <span style="font-size: 130%">A <b>new version</b> is available !</span> | |
|
7 | <br/> | |
|
8 | <br/> | |
|
9 | ||
|
7 | 10 | |
|
8 | <b>${h.literal(c.latest_data['title'])} | |
|
11 | RhodeCode <b>${c.latest_ver}</b> - ${h.literal(c.latest_data['title'])} | |
|
9 | 12 | |
|
10 | <b>${c.latest_ver}</b> | |
|
13 | RhodeCode <b>${c.latest_ver}</b> | |
|
11 | 14 | %endif |
|
12 | 15 | %else: |
|
13 | This instance is already running the <b>latest</b> stable version ${c.latest_ver}. | |
|
16 | Your current version, ${c.cur_ver}, is up-to-date as it is equal to or newer than the latest available version, ${c.latest_ver}. | |
|
14 | 17 | %endif |
|
15 | </p> | |
|
18 | ||
|
16 | 19 |
|
|
17 | 20 | % if c.should_upgrade and c.important_notices: |
|
18 | <div>Important notes for this release:</div> | |
|
19 | <ul> | |
|
21 | <br/> | |
|
22 | <br/> | |
|
23 | <div>Summary:</div> | |
|
24 | <br/> | |
|
20 | 25 | % for notice in c.important_notices: |
|
21 |
|
26 | - ${notice}<br/> | |
|
22 | 27 | % endfor |
|
23 | </ul> | |
|
24 | 28 | % endif |
|
29 | ||
|
25 | 30 | </div> |
@@ -6,8 +6,7 b'' | |||
|
6 | 6 | suffix='', |
|
7 | 7 | svn_tag_patterns=c.svn_tag_patterns, |
|
8 | 8 | svn_branch_patterns=c.svn_branch_patterns, |
|
9 | display_globals=True | |
|
10 | allow_repo_location_change=c.visual.allow_repo_location_change | |
|
9 | display_globals=True | |
|
11 | 10 | )} |
|
12 | 11 | <div class="buttons"> |
|
13 | 12 | ${h.submit('save',_('Save settings'),class_="btn")} |
@@ -50,16 +49,5 b'' | |||
|
50 | 49 | unlockpath(); |
|
51 | 50 | } |
|
52 | 51 | |
|
53 | /* On click handler for the `Generate Apache Config` button. It sends a | |
|
54 | POST request to trigger the (re)generation of the mod_dav_svn config. */ | |
|
55 | $('#vcs_svn_generate_cfg').on('click', function(event) { | |
|
56 | event.preventDefault(); | |
|
57 | var url = "${h.route_path('admin_settings_vcs_svn_generate_cfg')}"; | |
|
58 | var jqxhr = $.post(url, {'csrf_token': CSRF_TOKEN}); | |
|
59 | jqxhr.done(function(data) { | |
|
60 | $.Topic('/notifications').publish(data); | |
|
61 | }); | |
|
62 | }); | |
|
63 | ||
|
64 | 52 | }); |
|
65 | 53 | </script> |
@@ -651,26 +651,26 b'' | |||
|
651 | 651 | % endif |
|
652 | 652 | % for item in c.bookmark_items: |
|
653 | 653 | <li> |
|
654 | % if item.repo | |
|
654 | % if item.repo_id: | |
|
655 | 655 | <div> |
|
656 | 656 | <a class="bookmark-item" href="${h.route_path('my_account_goto_bookmark', bookmark_id=item.position)}"> |
|
657 | 657 | <code>${item.position}</code> |
|
658 | % if item | |
|
658 | % if item.repo_type == 'hg': | |
|
659 | 659 | <i class="icon-hg" title="${_('Repository')}" style="font-size: 16px"></i> |
|
660 | % elif item | |
|
660 | % elif item.repo_type == 'git': | |
|
661 | 661 | <i class="icon-git" title="${_('Repository')}" style="font-size: 16px"></i> |
|
662 | % elif item | |
|
662 | % elif item.repo_type == 'svn': | |
|
663 | 663 | <i class="icon-svn" title="${_('Repository')}" style="font-size: 16px"></i> |
|
664 | 664 | % endif |
|
665 | ${(item.title or h.shorter(item. | |
|
665 | ${(item.title or h.shorter(item.repo_name, 30))} | |
|
666 | 666 | </a> |
|
667 | 667 | </div> |
|
668 | % elif item. | |
|
668 | % elif item.group_id: | |
|
669 | 669 | <div> |
|
670 | 670 | <a class="bookmark-item" href="${h.route_path('my_account_goto_bookmark', bookmark_id=item.position)}"> |
|
671 | 671 | <code>${item.position}</code> |
|
672 | 672 | <i class="icon-repo-group" title="${_('Repository group')}" style="font-size: 14px"></i> |
|
673 | ${(item.title or h.shorter(item. | |
|
673 | ${(item.title or h.shorter(item.group_name, 30))} | |
|
674 | 674 | </a> |
|
675 | 675 | </div> |
|
676 | 676 | % else: |
@@ -17,7 +17,7 b' examples = [' | |||
|
17 | 17 | |
|
18 | 18 | ( |
|
19 | 19 | 'Tickets with #123 (Redmine etc)', |
|
20 | '(?<![a-zA-Z0-9_/]{1,10}-?)(#)(?P<issue_id> | |
|
20 | '(?<![a-zA-Z0-9_/]{1,10}-?)(#)(?P<issue_id>[0-9]+)', | |
|
21 | 21 | 'https://myissueserver.com/${repo}/issue/${issue_id}', |
|
22 | 22 | '' |
|
23 | 23 | ), |
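The tightened pattern in the hunk above captures the issue number into a named `issue_id` group which is then spliced into the URL template. A small illustrative sketch follows; note the original pattern's variable-length lookbehind is not supported by Python's stdlib `re` module, so this example keeps only the named `[0-9]+` group, and the `my-repo` value is an invented placeholder:

```python
# Simplified form of the issue-id capture from the hunk above; the
# variable-length lookbehind is dropped because stdlib `re` rejects it.
import re
from string import Template

ISSUE_PAT = re.compile(r'(#)(?P<issue_id>[0-9]+)')

m = ISSUE_PAT.search('Fixes #123 in the parser')
issue_id = m.group('issue_id')

# The URL template from the example list, filled via the named groups.
url = Template('https://myissueserver.com/${repo}/issue/${issue_id}').safe_substitute(
    repo='my-repo', issue_id=issue_id)
```

The named group is what lets the `${issue_id}` placeholder in the configured URL be resolved per match.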
@@ -60,7 +60,7 b' examples = [' | |||
|
60 | 60 | |
|
61 | 61 | ( |
|
62 | 62 | 'Pivotal Tracker', |
|
63 | '(?:pivot-)(?P<project_id>\d+)-(?P<story> | |
|
63 | '(?:pivot-)(?P<project_id>\d+)-(?P<story>[0-9]+)', | |
|
64 | 64 | 'https://www.pivotaltracker.com/s/projects/${project_id}/stories/${story}', |
|
65 | 65 | 'PIV-', |
|
66 | 66 | ), |
@@ -3,7 +3,7 b'' | |||
|
3 | 3 | ## <%namespace name="vcss" file="/base/vcssettings.mako"/> |
|
4 | 4 | ## ${vcss.vcs_settings_fields()} |
|
5 | 5 | |
|
6 | <%def name="vcs_settings_fields(suffix='', svn_branch_patterns=None, svn_tag_patterns=None, repo_type=None, display_globals=False, | |
|
6 | <%def name="vcs_settings_fields(suffix='', svn_branch_patterns=None, svn_tag_patterns=None, repo_type=None, display_globals=False, **kwargs)"> | |
|
7 | 7 | % if display_globals: |
|
8 | 8 | <div class="panel panel-default"> |
|
9 | 9 | <div class="panel-heading" id="general"> |
@@ -23,34 +23,6 b'' | |||
|
23 | 23 | </div> |
|
24 | 24 | % endif |
|
25 | 25 | |
|
26 | % if display_globals: | |
|
27 | <div class="panel panel-default"> | |
|
28 | <div class="panel-heading" id="vcs-storage-options"> | |
|
29 | <h3 class="panel-title">${_('Main Storage Location')}<a class="permalink" href="#vcs-storage-options"> ¶</a></h3> | |
|
30 | </div> | |
|
31 | <div class="panel-body"> | |
|
32 | <div class="field"> | |
|
33 | <div class="inputx locked_input"> | |
|
34 | %if allow_repo_location_change: | |
|
35 | ${h.text('paths_root_path',size=59,readonly="readonly", class_="disabled")} | |
|
36 | <span id="path_unlock" class="tooltip" | |
|
37 | title="${h.tooltip(_('Click to unlock. You must restart RhodeCode in order to make this setting take effect.'))}"> | |
|
38 | <div class="btn btn-default lock_input_button"><i id="path_unlock_icon" class="icon-lock"></i></div> | |
|
39 | </span> | |
|
40 | %else: | |
|
41 | ${_('Repository location change is disabled. You can enable this by changing the `allow_repo_location_change` inside .ini file.')} | |
|
42 | ## form still requires this but we cannot internally change it anyway | |
|
43 | ${h.hidden('paths_root_path',size=30,readonly="readonly", class_="disabled")} | |
|
44 | %endif | |
|
45 | </div> | |
|
46 | </div> | |
|
47 | <div class="label"> | |
|
48 | <span class="help-block">${_('Filesystem location where repositories should be stored. After changing this value a restart and rescan of the repository folder are required.')}</span> | |
|
49 | </div> | |
|
50 | </div> | |
|
51 | </div> | |
|
52 | % endif | |
|
53 | ||
|
54 | 26 | % if display_globals or repo_type in ['git', 'hg']: |
|
55 | 27 | <div class="panel panel-default"> |
|
56 | 28 | <div class="panel-heading" id="vcs-hooks-options"> |
@@ -170,48 +142,31 b'' | |||
|
170 | 142 | </div> |
|
171 | 143 | % endif |
|
172 | 144 | |
|
173 | ||
|
174 | % if display_globals: | |
|
175 | <div class="panel panel-default"> | |
|
176 | <div class="panel-heading" id="vcs-global-svn-options"> | |
|
177 | <h3 class="panel-title">${_('Global Subversion Settings')}<a class="permalink" href="#vcs-global-svn-options"> ¶</a></h3> | |
|
178 | </div> | |
|
179 | <div class="panel-body"> | |
|
180 | <div class="field"> | |
|
181 | <div class="checkbox"> | |
|
182 | ${h.checkbox('vcs_svn_proxy_http_requests_enabled' + suffix, 'True', **kwargs)} | |
|
183 | <label for="vcs_svn_proxy_http_requests_enabled${suffix}">${_('Proxy subversion HTTP requests')}</label> | |
|
184 | </div> | |
|
185 | <div class="label"> | |
|
186 | <span class="help-block"> | |
|
187 | ${_('Subversion HTTP Support. Enables communication with SVN over HTTP protocol.')} | |
|
188 | <a href="${h.route_url('enterprise_svn_setup')}" target="_blank">${_('SVN Protocol setup Documentation')}</a>. | |
|
189 | </span> | |
|
190 | </div> | |
|
191 | </div> | |
|
192 | <div class="field"> | |
|
193 | <div class="label"> | |
|
194 | <label for="vcs_svn_proxy_http_server_url">${_('Subversion HTTP Server URL')}</label><br/> | |
|
195 | </div> | |
|
196 | <div class="input"> | |
|
197 | ${h.text('vcs_svn_proxy_http_server_url',size=59)} | |
|
198 | % if c.svn_proxy_generate_config: | |
|
199 | <span class="buttons"> | |
|
200 | <button class="btn btn-primary" id="vcs_svn_generate_cfg">${_('Generate Apache Config')}</button> | |
|
201 | </span> | |
|
202 | % endif | |
|
203 | </div> | |
|
204 | </div> | |
|
205 | </div> | |
|
206 | </div> | |
|
207 | % endif | |
|
208 | ||
|
209 | 145 | % if display_globals or repo_type in ['svn']: |
|
210 | 146 | <div class="panel panel-default"> |
|
211 | 147 | <div class="panel-heading" id="vcs-svn-options"> |
|
212 | 148 | <h3 class="panel-title">${_('Subversion Settings')}<a class="permalink" href="#vcs-svn-options"> ¶</a></h3> |
|
213 | 149 | </div> |
|
214 | 150 | <div class="panel-body"> |
|
151 | % if display_globals: | |
|
152 | <div class="field"> | |
|
153 | <div class="content" > | |
|
154 | <label>${_('mod_dav config')}</label><br/> | |
|
155 | <code>path: ${c.svn_config_path}</code> | |
|
156 | </div> | |
|
157 | <br/> | |
|
158 | ||
|
159 | <div> | |
|
160 | ||
|
161 | % if c.svn_generate_config: | |
|
162 | <span class="buttons"> | |
|
163 | <button class="btn btn-primary" id="vcs_svn_generate_cfg">${_('Re-generate Apache Config')}</button> | |
|
164 | </span> | |
|
165 | % endif | |
|
166 | </div> | |
|
167 | </div> | |
|
168 | % endif | |
|
169 | ||
|
215 | 170 | <div class="field"> |
|
216 | 171 | <div class="content" > |
|
217 | 172 | <label>${_('Repository patterns')}</label><br/> |
@@ -345,13 +300,12 b'' | |||
|
345 | 300 | </div> |
|
346 | 301 | % endif |
|
347 | 302 | |
|
348 | ## DISABLED FOR GIT FOR NOW as the rebase/close is not supported yet | |
|
349 | ## % if display_globals or repo_type in ['git']: | |
|
350 |
|
351 | ## <div class="panel-heading" id="vcs-pull-requests-options"> | |
|
352 | ## <h3 class="panel-title">${_('Git Pull Request Settings')}<a class="permalink" href="#vcs-git-pull-requests-options"> ¶</a></h3> | |
|
353 | ## </div> | |
|
354 | ## <div class="panel-body"> | |
|
303 | % if display_globals or repo_type in ['git']: | |
|
304 | <div class="panel panel-default"> | |
|
305 | <div class="panel-heading" id="vcs-pull-requests-options"> | |
|
306 | <h3 class="panel-title">${_('Git Pull Request Settings')}<a class="permalink" href="#vcs-git-pull-requests-options"> ¶</a></h3> | |
|
307 | </div> | |
|
308 | <div class="panel-body"> | |
|
355 | 309 | ## |
|
356 | 310 | ## |
|
357 | 311 | ## |
@@ -359,17 +313,33 b'' | |||
|
359 | 313 | ## |
|
360 | 314 | ## |
|
361 | 315 | ## |
|
362 | ## | |
|
363 |
|
364 |
|
365 |
|
366 |
|
367 |
|
368 |
|
369 |
|
370 |
|
371 |
|
372 |
|
316 | ||
|
317 | <div class="checkbox"> | |
|
318 | ${h.checkbox('rhodecode_git_close_branch_before_merging' + suffix, 'True', **kwargs)} | |
|
319 | <label for="rhodecode_git_close_branch_before_merging{suffix}">${_('Delete branch after merging it')}</label> | |
|
320 | </div> | |
|
321 | <div class="label"> | |
|
322 | <span class="help-block">${_('Delete branch after merging it into destination branch.')}</span> | |
|
323 | </div> | |
|
324 | </div> | |
|
325 | </div> | |
|
326 | % endif | |
|
327 | ||
|
328 | <script type="text/javascript"> | |
|
373 | 329 |
|
|
330 | $(document).ready(function() { | |
|
331 | /* On click handler for the `Generate Apache Config` button. It sends a | |
|
332 | POST request to trigger the (re)generation of the mod_dav_svn config. */ | |
|
333 | $('#vcs_svn_generate_cfg').on('click', function(event) { | |
|
334 | event.preventDefault(); | |
|
335 | var url = "${h.route_path('admin_settings_vcs_svn_generate_cfg')}"; | |
|
336 | var jqxhr = $.post(url, {'csrf_token': CSRF_TOKEN}); | |
|
337 | jqxhr.done(function(data) { | |
|
338 | $.Topic('/notifications').publish(data); | |
|
339 | }); | |
|
340 | }); | |
|
341 | }); | |
|
374 | 342 | |
|
343 | </script> | |
|
375 | 344 | </%def> |
|
345 |
@@ -62,13 +62,8 b'' | |||
|
62 | 62 | }; |
|
63 | 63 | |
|
64 | 64 | var branches_data = ${c.data|n}; |
|
65 | // object list | |
|
66 | $('#obj_list_table').DataTable({ | |
|
67 | data: branches_data, | |
|
68 | dom: 'rtp', | |
|
69 | pageLength: ${c.visual.dashboard_items}, | |
|
70 | order: [[ 0, "asc" ]], | |
|
71 | columns: [ | |
|
65 | var repo_type = "${c.rhodecode_db_repo.repo_type}"; | |
|
66 | var columns = [ | |
|
72 | 67 | { data: {"_": "name", |
|
73 | 68 | "sort": "name_raw"}, title: "${_('Name')}", className: "td-tags" }, |
|
74 | 69 | { data: {"_": "date", |
@@ -80,7 +75,22 b'' | |||
|
80 | 75 | "type": Number}, title: "${_('Commit')}", className: "td-hash" }, |
|
81 | 76 | { data: {"_": "compare", |
|
82 | 77 | "sort": "compare"}, title: "${_('Compare')}", className: "td-compare" } |
|
83 | ] | |
|
78 | ]; | |
|
79 | if (repo_type !== 'svn') { | |
|
80 | columns.push({ | |
|
81 | data: { "_": "action", "sort": "action" }, | |
|
82 | title: `${_('Action')}`, | |
|
83 | className: "td-action", | |
|
84 | orderable: false | |
|
85 | }); | |
|
86 | } | |
|
87 | ||
|
88 | $('#obj_list_table').DataTable({ | |
|
89 | data: branches_data, | |
|
90 | dom: 'rtp', | |
|
91 | pageLength: ${c.visual.dashboard_items}, | |
|
92 | order: [[ 0, "asc" ]], | |
|
93 | columns: columns, | |
|
84 | 94 | language: { |
|
85 | 95 | paginate: DEFAULT_GRID_PAGINATION, |
|
86 | 96 | emptyTable: _gettext("No branches available yet.") |
@@ -277,6 +277,30 b'' | |||
|
277 | 277 | </div> |
|
278 | 278 | </%def> |
|
279 | 279 | |
|
280 | <%def name="branch_actions_git(branch_name, repo_name, **kwargs)"> | |
|
281 | <div class="grid_delete"> | |
|
282 | ${h.secure_form(h.route_path('branch_remove', repo_name=repo_name, branch_name=branch_name), request=request)} | |
|
283 | <input class="btn btn-link btn-danger" id="remove_branch_${branch_name}" name="remove_branch_${branch_name}" | |
|
284 | onclick="submitConfirm(event, this, _gettext('Confirm to delete this branch'), _gettext('Delete'), '${branch_name}')" | |
|
285 | type="submit" value="Delete" | |
|
286 | > | |
|
287 | ${h.end_form()} | |
|
288 | </div> | |
|
289 | </%def> | |
|
290 | ||
|
291 | <%def name="branch_actions_hg(branch_name, repo_name, **kwargs)"> | |
|
292 | <div class="grid_delete"> | |
|
293 | %if not kwargs['closed']: | |
|
294 | ${h.secure_form(h.route_path('branch_remove', repo_name=repo_name, branch_name=branch_name), request=request)} | |
|
295 | <input class="btn btn-link btn-danger" id="remove_branch_${branch_name}" name="remove_branch_${branch_name}" | |
|
296 | onclick="submitConfirm(event, this, _gettext('Confirm to close this branch'), _gettext('Close'), '${branch_name}')" | |
|
297 | type="submit" value="Close" | |
|
298 | > | |
|
299 | ${h.end_form()} | |
|
300 | %endif | |
|
301 | </div> | |
|
302 | </%def> | |
|
303 | ||
|
280 | 304 | <%def name="user_group_actions(user_group_id, user_group_name)"> |
|
281 | 305 | <div class="grid_edit"> |
|
282 | 306 | <a href="${h.route_path('edit_user_group', user_group_id=user_group_id)}" title="${_('Edit')}">Edit</a> |
@@ -269,6 +269,18 b' They are permanent until deleted, or con' | |||
|
269 | 269 | <pre><%= submodule_url %></pre> |
|
270 | 270 | </script> |
|
271 | 271 | |
|
272 | <script id="ejs_recoveryCodes" type="text/template" class="ejsTemplate"> | |
|
273 | <code> | |
|
274 | <ol> | |
|
275 | <% for (var i = 0; i < recoveryCodes.length; i++) { %> | |
|
276 | <% var code = recoveryCodes[i] %> | |
|
277 | <li><%= code %></li> | |
|
278 | <% } %> | |
|
279 | </ol> | |
|
280 | </code> | |
|
281 | <i class="icon-clipboard clipboard-action" data-clipboard-text="<%= recoveryCodes %>" >Copy All</i> | |
|
282 | </script> | |
|
283 | ||
|
272 | 284 | ##// END OF EJS Templates |
|
273 | 285 | </div> |
|
274 | 286 |
@@ -35,12 +35,12 b'' | |||
|
35 | 35 | <%block name="above_login_button" /> |
|
36 | 36 | <!-- login --> |
|
37 | 37 | <div class="sign-in-title"> |
|
38 | <h1>${_('Sign In using | |
|
38 | <h1>${_('Sign In using credentials')}</h1> | |
|
39 | 39 | </div> |
|
40 | 40 | <div class="inner form"> |
|
41 | 41 | ${h.form(request.route_path('login', _query={'came_from': c.came_from}), needs_csrf_token=False)} |
|
42 | 42 | |
|
43 | <label for="username">${_('Username')}:</label> | |
|
43 | <label for="username">${_('Username or email address')}:</label> | |
|
44 | 44 | ${h.text('username', class_='focus', value=defaults.get('username'))} |
|
45 | 45 | %if 'username' in errors: |
|
46 | 46 | <span class="error-message">${errors.get('username')}</span> |
@@ -27,6 +27,7 b' import urllib.parse' | |||
|
27 | 27 | |
|
28 | 28 | import pytest |
|
29 | 29 | |
|
30 | import rhodecode | |
|
30 | 31 | from rhodecode.model.db import User |
|
31 | 32 | from rhodecode.lib import auth |
|
32 | 33 | from rhodecode.lib import helpers as h |
@@ -54,7 +55,6 b' log = logging.getLogger(__name__)' | |||
|
54 | 55 | # SOME GLOBALS FOR TESTS |
|
55 | 56 | TEST_DIR = tempfile.gettempdir() |
|
56 | 57 | |
|
57 | TESTS_TMP_PATH = jn(TEST_DIR, 'rc_test_{}'.format(next(tempfile._RandomNameSequence()))) | |
|
58 | 58 | TEST_USER_ADMIN_LOGIN = 'test_admin' |
|
59 | 59 | TEST_USER_ADMIN_PASS = 'test12' |
|
60 | 60 | TEST_USER_ADMIN_EMAIL = 'test_admin@mail.com' |
@@ -81,6 +81,8 b" GIT_FORK = 'vcs_test_git_fork'" | |||
|
81 | 81 | SCM_TESTS = ['hg', 'git'] |
|
82 | 82 | uniq_suffix = str(int(time.mktime(datetime.datetime.now().timetuple()))) |
|
83 | 83 | |
|
84 | TESTS_TMP_PATH = tempfile.mkdtemp(prefix='rc_test_', dir=TEST_DIR) | |
|
85 | ||
|
84 | 86 | TEST_GIT_REPO = jn(TESTS_TMP_PATH, GIT_REPO) |
|
85 | 87 | TEST_GIT_REPO_CLONE = jn(TESTS_TMP_PATH, f'vcsgitclone{uniq_suffix}') |
|
86 | 88 | TEST_GIT_REPO_PULL = jn(TESTS_TMP_PATH, f'vcsgitpull{uniq_suffix}') |
@@ -111,7 +113,7 b' def get_new_dir(title):' | |||
|
111 | 113 | hex_str = sha1_safe(f'{os.getpid()} {time.time()}') |
|
112 | 114 | name_parts.append(hex_str) |
|
113 | 115 | name = '-'.join(name_parts) |
|
114 | path = | |
|
116 | path = jn(TEST_DIR, name) | |
|
115 | 117 | return get_normalized_path(path) |
|
116 | 118 | |
|
117 | 119 |
@@ -44,7 +44,7 b' class RhodeCodeAuthPlugin(RhodeCodeExter' | |||
|
44 | 44 | |
|
45 | 45 | @hybrid_property |
|
46 | 46 | def name(self): |
|
47 | return | |
|
47 | return "external_test" | |
|
48 | 48 | |
|
49 | 49 | def settings(self): |
|
50 | 50 | settings = [ |
@@ -82,10 +82,11 b' def pytest_addoption(parser):' | |||
|
82 | 82 | parser.addoption( |
|
83 | 83 | '--test-loglevel', dest='test_loglevel', |
|
84 | 84 | help="Set default Logging level for tests, critical(default), error, warn , info, debug") |
|
85 | group = parser.getgroup('pylons') | |
|
85 | ||
|
86 | group = parser.getgroup('pyramid') | |
|
86 | 87 | group.addoption( |
|
87 | '-- | |
|
88 | help="Set up a | |
|
88 | '--pyramid-config', dest='pyramid_config', | |
|
89 | help="Set up a pyramid with the specified ini config file.") | |
|
89 | 90 | group.addoption( |
|
90 | 91 | '--ini-config-override', action='store', type=_parse_json, |
|
91 | 92 | default=None, dest='pyramid_config_override', help=( |
@@ -86,10 +86,10 b' class DBBackend(object):' | |||
|
86 | 86 | _store = os.path.dirname(os.path.abspath(__file__)) |
|
87 | 87 | _type = None |
|
88 | 88 | _base_ini_config = [{'app:main': {'vcs.start_server': 'false', |
|
89 | 'startup.import_repos': 'false' | |
|
90 | 'is_test': 'False'}}] | |
|
89 | 'startup.import_repos': 'false'}}] | |
|
91 | 90 | _db_url = [{'app:main': {'sqlalchemy.db1.url': ''}}] |
|
92 | 91 | _base_db_name = 'rhodecode_test_db_backend' |
|
92 | std_env = {'RC_TEST': '0'} | |
|
93 | 93 | |
|
94 | 94 | def __init__( |
|
95 | 95 | self, config_file, db_name=None, basetemp=None, |
@@ -135,13 +135,15 b' class DBBackend(object):' | |||
|
135 | 135 | """ |
|
136 | 136 | |
|
137 | 137 | command = cmd + ' ' + ' '.join(args) |
|
138 | sys.stdout.write(command) | |
|
138 | sys.stdout.write(f'CMD: {command}') | |
|
139 | 139 | |
|
140 | 140 | # Tell Python to use UTF-8 encoding out stdout |
|
141 | 141 | _env = os.environ.copy() |
|
142 | 142 | _env['PYTHONIOENCODING'] = 'UTF-8' |
|
143 | _env.update(self.std_env) | |
|
143 | 144 | if env: |
|
144 | 145 | _env.update(env) |
|
146 | ||
|
145 | 147 | self.p = Popen(command, shell=True, stdout=PIPE, stderr=PIPE, env=_env) |
|
146 | 148 | self.stdout, self.stderr = self.p.communicate() |
|
147 | 149 | stdout_str = safe_str(self.stdout) |
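The `execute` changes above layer environments in a fixed order before spawning the child: copy of the parent env, then `PYTHONIOENCODING`, then the class-level `std_env`, then any per-call `env`, with later layers winning. A self-contained sketch of that merge order (the helper name `run_with_env` is invented for this example; only the merge sequence mirrors the hunk):

```python
import os
import subprocess

def run_with_env(command, std_env, env=None):
    """Run `command` with std_env and an optional per-call env layered
    over a copy of the parent environment (later layers win)."""
    _env = os.environ.copy()
    _env['PYTHONIOENCODING'] = 'UTF-8'   # force UTF-8 on child stdout
    _env.update(std_env)                 # suite-wide defaults, e.g. RC_TEST
    if env:
        _env.update(env)                 # per-call overrides win last
    proc = subprocess.run(command, shell=True, capture_output=True, env=_env)
    return proc.stdout.decode('utf-8')

# The per-call env overrides the suite default, so the child sees '1'.
out = run_with_env('echo "$RC_TEST"', {'RC_TEST': '0'}, env={'RC_TEST': '1'})
```

Passing `env=` to `subprocess` replaces the child's entire environment, which is why the copy of `os.environ` must come first.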
@@ -19,8 +19,9 b'' | |||
|
19 | 19 | |
|
20 | 20 | import pytest |
|
21 | 21 | |
|
22 | from rhodecode.lib. | |
|
22 | from rhodecode.lib.config_utils import get_app_config | |
|
23 | 23 | from rhodecode.tests.fixture import TestINI |
|
24 | from rhodecode.tests import TESTS_TMP_PATH | |
|
24 | 25 | from rhodecode.tests.server_utils import RcVCSServer |
|
25 | 26 | |
|
26 | 27 | |
@@ -57,7 +58,7 b' def vcsserver_factory(tmpdir_factory):' | |||
|
57 | 58 | """ |
|
58 | 59 | |
|
59 | 60 | def factory(request, overrides=(), vcsserver_port=None, |
|
60 | log_file=None, workers=' | |
|
61 | log_file=None, workers='3'): | |
|
61 | 62 | |
|
62 | 63 | if vcsserver_port is None: |
|
63 | 64 | vcsserver_port = get_available_port() |
@@ -99,7 +100,7 b' def ini_config(request, tmpdir_factory, ' | |||
|
99 | 100 | overrides = [ |
|
100 | 101 | {'server:main': {'port': rcserver_port}}, |
|
101 | 102 | {'app:main': { |
|
102 | 'cache_dir': '%(here)s/rc_data', | |
|
103 | 'cache_dir': '%(here)s/rc-tests/rc_data', | |
|
103 | 104 | 'vcs.server': f'localhost:{vcsserver_port}', |
|
104 | 105 | # johbo: We will always start the VCSServer on our own based on the |
|
105 | 106 | # fixtures of the test cases. For the test run it must always be |
@@ -108,8 +109,11 b' def ini_config(request, tmpdir_factory, ' | |||
|
108 | 109 | |
|
109 | 110 | 'vcs.server.protocol': 'http', |
|
110 | 111 | 'vcs.scm_app_implementation': 'http', |
|
112 | 'vcs.svn.proxy.enabled': 'true', | |
|
111 | 113 | 'vcs.hooks.protocol': 'http', |
|
112 | 114 | 'vcs.hooks.host': '*', |
|
115 | 'repo_store.path': TESTS_TMP_PATH, | |
|
116 | 'app.service_api.token': 'service_secret_token', | |
|
113 | 117 | }}, |
|
114 | 118 | |
|
115 | 119 | {'handler_console': { |
@@ -157,7 +161,7 b' def vcsserver_port(request):' | |||
|
157 | 161 | |
|
158 | 162 | |
|
159 | 163 | @pytest.fixture(scope='session') |
|
160 | def available_port_factory(): | |
|
164 | def available_port_factory() -> get_available_port: | |
|
161 | 165 | """ |
|
162 | 166 | Returns a callable which returns free port numbers. |
|
163 | 167 | """ |
@@ -174,7 +174,7 b' def http_environ():' | |||
|
174 | 174 | |
|
175 | 175 | @pytest.fixture(scope='session') |
|
176 | 176 | def baseapp(ini_config, vcsserver, http_environ_session): |
|
177 | from rhodecode.lib. | |
|
177 | from rhodecode.lib.config_utils import get_app_config | |
|
178 | 178 | from rhodecode.config.middleware import make_pyramid_app |
|
179 | 179 | |
|
180 | 180 | log.info("Using the RhodeCode configuration:{}".format(ini_config)) |
@@ -29,7 +29,7 b' from rhodecode.model import db' | |||
|
29 | 29 | class RcTestAuthPlugin(RhodeCodeAuthPluginBase): |
|
30 | 30 | |
|
31 | 31 | def name(self): |
|
32 | return | |
|
32 | return 'stub_auth' | |
|
33 | 33 | |
|
34 | 34 | |
|
35 | 35 | def test_authenticate_returns_from_auth(stub_auth_data): |
@@ -23,16 +23,16 b' import pytest' | |||
|
23 | 23 | from unittest.mock import patch, Mock, MagicMock |
|
24 | 24 | |
|
25 | 25 | from rhodecode.lib.middleware.simplesvn import SimpleSvn, SimpleSvnApp |
|
26 | from rhodecode.lib.utils import get_rhodecode_ | |
|
26 | from rhodecode.lib.utils import get_rhodecode_repo_store_path | |
|
27 | 27 | from rhodecode.tests import SVN_REPO, TEST_USER_ADMIN_LOGIN, TEST_USER_ADMIN_PASS |
|
28 | 28 | |
|
29 | 29 | |
|
30 | 30 | class TestSimpleSvn(object): |
|
31 | 31 | @pytest.fixture(autouse=True) |
|
32 | 32 | def simple_svn(self, baseapp, request_stub): |
|
33 | base_path = get_rhodecode_ | |
|
33 | base_path = get_rhodecode_repo_store_path() | |
|
34 | 34 | self.app = SimpleSvn( |
|
35 | config={'auth_ret_code': '', ' | |
|
35 | config={'auth_ret_code': '', 'repo_store.path': base_path}, | |
|
36 | 36 | registry=request_stub.registry) |
|
37 | 37 | |
|
38 | 38 | def test_get_config(self): |
@@ -126,7 +126,7 b' class TestSimpleSvnApp(object):' | |||
|
126 | 126 | def setup_method(self, method): |
|
127 | 127 | # note(marcink): this is hostname from docker compose used for testing... |
|
128 | 128 | self.host = 'http://svn:8090' |
|
129 | base_path = get_rhodecode_ | |
|
129 | base_path = get_rhodecode_repo_store_path() | |
|
130 | 130 | self.app = SimpleSvnApp( |
|
131 | 131 | config={'subversion_http_server_url': self.host, |
|
132 | 132 | 'base_path': base_path}) |
@@ -25,17 +25,20 b' import msgpack' | |||
|
25 | 25 | import pytest |
|
26 | 26 | import tempfile |
|
27 | 27 | |
|
28 | from rhodecode.lib import hooks_d | |
|
28 | from rhodecode.lib.hook_daemon import http_hooks_deamon | |
|
29 | from rhodecode.lib.hook_daemon import celery_hooks_deamon | |
|
30 | from rhodecode.lib.hook_daemon import hook_module | |
|
31 | from rhodecode.lib.hook_daemon import base as hook_base | |
|
29 | 32 | from rhodecode.lib.str_utils import safe_bytes |
|
30 | 33 | from rhodecode.tests.utils import assert_message_in_log |
|
31 | 34 | from rhodecode.lib.ext_json import json |
|
32 | 35 | |
|
33 | test_proto = hooks_d | |
|
36 | test_proto = http_hooks_deamon.HooksHttpHandler.MSGPACK_HOOKS_PROTO | |
|
34 | 37 | |
|
35 | 38 | |
|
36 | 39 | class TestHooks(object): |
|
37 | 40 | def test_hooks_can_be_used_as_a_context_processor(self): |
|
38 | hooks = hook | |
|
41 | hooks = hook_module.Hooks() | |
|
39 | 42 | with hooks as return_value: |
|
40 | 43 | pass |
|
41 | 44 | assert hooks == return_value |
@@ -52,10 +55,10 b' class TestHooksHttpHandler(object):' | |||
|
52 | 55 | } |
|
53 | 56 | request = self._generate_post_request(data) |
|
54 | 57 | hooks_patcher = mock.patch.object( |
|
55 | hook | |
|
58 | hook_module.Hooks, data['method'], create=True, return_value=1) | |
|
56 | 59 | |
|
57 | 60 | with hooks_patcher as hooks_mock: |
|
58 | handler = hooks_d | |
|
61 | handler = http_hooks_deamon.HooksHttpHandler | |
|
59 | 62 | handler.DEFAULT_HOOKS_PROTO = test_proto |
|
60 | 63 | handler.wbufsize = 10240 |
|
61 | 64 | MockServer(handler, request) |
@@ -73,21 +76,21 b' class TestHooksHttpHandler(object):' | |||
|
73 | 76 | |
|
74 | 77 | # patching our _read to return test method and proto used |
|
75 | 78 | read_patcher = mock.patch.object( |
|
76 | hooks_daemon.HooksHttpHandler, '_read_request', | |
|
79 | http_hooks_deamon.HooksHttpHandler, '_read_request', | |
|
77 | 80 | return_value=(test_proto, rpc_method, extras)) |
|
78 | 81 | |
|
79 | 82 | # patch Hooks instance to return hook_result data on 'test' call |
|
80 | 83 | hooks_patcher = mock.patch.object( |
|
81 | hooks_daemon.Hooks, rpc_method, create=True, | |
|
84 | hook_module.Hooks, rpc_method, create=True, | |
|
82 | 85 | return_value=hook_result) |
|
83 | 86 | |
|
84 | 87 | with read_patcher, hooks_patcher: |
|
85 | handler = hooks_daemon.HooksHttpHandler | |
|
88 | handler = http_hooks_deamon.HooksHttpHandler | |
|
86 | 89 | handler.DEFAULT_HOOKS_PROTO = test_proto |
|
87 | 90 | handler.wbufsize = 10240 |
|
88 | 91 | server = MockServer(handler, request) |
|
89 | 92 | |
|
90 | expected_result = hooks_daemon.HooksHttpHandler.serialize_data(hook_result) | |
|
93 | expected_result = http_hooks_deamon.HooksHttpHandler.serialize_data(hook_result) | |
|
91 | 94 | |
|
92 | 95 | server.request.output_stream.seek(0) |
|
93 | 96 | assert server.request.output_stream.readlines()[-1] == expected_result |
@@ -97,15 +100,15 b' class TestHooksHttpHandler(object):' | |||
|
97 | 100 | rpc_method = 'test' |
|
98 | 101 | |
|
99 | 102 | read_patcher = mock.patch.object( |
|
100 | hooks_daemon.HooksHttpHandler, '_read_request', | |
|
103 | http_hooks_deamon.HooksHttpHandler, '_read_request', | |
|
101 | 104 | return_value=(test_proto, rpc_method, {})) |
|
102 | 105 | |
|
103 | 106 | hooks_patcher = mock.patch.object( |
|
104 | hooks_daemon.Hooks, rpc_method, create=True, | |
|
107 | hook_module.Hooks, rpc_method, create=True, | |
|
105 | 108 | side_effect=Exception('Test exception')) |
|
106 | 109 | |
|
107 | 110 | with read_patcher, hooks_patcher: |
|
108 | handler = hooks_daemon.HooksHttpHandler | |
|
111 | handler = http_hooks_deamon.HooksHttpHandler | |
|
109 | 112 | handler.DEFAULT_HOOKS_PROTO = test_proto |
|
110 | 113 | handler.wbufsize = 10240 |
|
111 | 114 | server = MockServer(handler, request) |
@@ -113,7 +116,7 b' class TestHooksHttpHandler(object):' | |||
|
113 | 116 | server.request.output_stream.seek(0) |
|
114 | 117 | data = server.request.output_stream.readlines() |
|
115 | 118 | msgpack_data = b''.join(data[5:]) |
|
116 | org_exc = hooks_daemon.HooksHttpHandler.deserialize_data(msgpack_data) | |
|
119 | org_exc = http_hooks_deamon.HooksHttpHandler.deserialize_data(msgpack_data) | |
|
117 | 120 | expected_result = { |
|
118 | 121 | 'exception': 'Exception', |
|
119 | 122 | 'exception_traceback': org_exc['exception_traceback'], |
@@ -123,8 +126,7 b' class TestHooksHttpHandler(object):' | |||
|
123 | 126 | |
|
124 | 127 | def test_log_message_writes_to_debug_log(self, caplog): |
|
125 | 128 | ip_port = ('0.0.0.0', 8888) |
|
126 | handler = hooks_daemon.HooksHttpHandler( | |
|
127 | MockRequest('POST /'), ip_port, mock.Mock()) | |
|
129 | handler = http_hooks_deamon.HooksHttpHandler(MockRequest('POST /'), ip_port, mock.Mock()) | |
|
128 | 130 | fake_date = '1/Nov/2015 00:00:00' |
|
129 | 131 | date_patcher = mock.patch.object( |
|
130 | 132 | handler, 'log_date_time_string', return_value=fake_date) |
@@ -136,10 +138,10 b' class TestHooksHttpHandler(object):' | |||
|
136 | 138 | |
|
137 | 139 | assert_message_in_log( |
|
138 | 140 | caplog.records, expected_message, |
|
139 | levelno=logging.DEBUG, module='hooks_daemon') | |
|
141 | levelno=logging.DEBUG, module='http_hooks_deamon') | |
|
140 | 142 | |
|
141 | 143 | def _generate_post_request(self, data, proto=test_proto): |
|
142 | if proto == hooks_daemon.HooksHttpHandler.MSGPACK_HOOKS_PROTO: | |
|
144 | if proto == http_hooks_deamon.HooksHttpHandler.MSGPACK_HOOKS_PROTO: | |
|
143 | 145 | payload = msgpack.packb(data) |
|
144 | 146 | else: |
|
145 | 147 | payload = json.dumps(data) |
@@ -151,18 +153,18 b' class TestHooksHttpHandler(object):' | |||
|
151 | 153 | class ThreadedHookCallbackDaemon(object): |
|
152 | 154 | def test_constructor_calls_prepare(self): |
|
153 | 155 | prepare_daemon_patcher = mock.patch.object( |
|
154 | hooks_daemon.ThreadedHookCallbackDaemon, '_prepare') | |
|
156 | http_hooks_deamon.ThreadedHookCallbackDaemon, '_prepare') | |
|
155 | 157 | with prepare_daemon_patcher as prepare_daemon_mock: |
|
156 | hooks_daemon.ThreadedHookCallbackDaemon() | |
|
158 | http_hooks_deamon.ThreadedHookCallbackDaemon() | |
|
157 | 159 | prepare_daemon_mock.assert_called_once_with() |
|
158 | 160 | |
|
159 | 161 | def test_run_is_called_on_context_start(self): |
|
160 | 162 | patchers = mock.patch.multiple( |
|
161 | hooks_daemon.ThreadedHookCallbackDaemon, | |
|
163 | http_hooks_deamon.ThreadedHookCallbackDaemon, | |
|
162 | 164 | _run=mock.DEFAULT, _prepare=mock.DEFAULT, __exit__=mock.DEFAULT) |
|
163 | 165 | |
|
164 | 166 | with patchers as mocks: |
|
165 | daemon = hooks_daemon.ThreadedHookCallbackDaemon() | |
|
167 | daemon = http_hooks_deamon.ThreadedHookCallbackDaemon() | |
|
166 | 168 | with daemon as daemon_context: |
|
167 | 169 | pass |
|
168 | 170 | mocks['_run'].assert_called_once_with() |
@@ -170,11 +172,11 b' class ThreadedHookCallbackDaemon(object)' | |||
|
170 | 172 | |
|
171 | 173 | def test_stop_is_called_on_context_exit(self): |
|
172 | 174 | patchers = mock.patch.multiple( |
|
173 | hooks_daemon.ThreadedHookCallbackDaemon, | |
|
175 | http_hooks_deamon.ThreadedHookCallbackDaemon, | |
|
174 | 176 | _run=mock.DEFAULT, _prepare=mock.DEFAULT, _stop=mock.DEFAULT) |
|
175 | 177 | |
|
176 | 178 | with patchers as mocks: |
|
177 | daemon = hooks_daemon.ThreadedHookCallbackDaemon() | |
|
179 | daemon = http_hooks_deamon.ThreadedHookCallbackDaemon() | |
|
178 | 180 | with daemon as daemon_context: |
|
179 | 181 | assert mocks['_stop'].call_count == 0 |
|
180 | 182 | |
@@ -185,46 +187,47 b' class ThreadedHookCallbackDaemon(object)' | |||
|
185 | 187 | class TestHttpHooksCallbackDaemon(object): |
|
186 | 188 | def test_hooks_callback_generates_new_port(self, caplog): |
|
187 | 189 | with caplog.at_level(logging.DEBUG): |
|
188 | daemon = hooks_daemon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
190 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
189 | 191 | assert daemon._daemon.server_address == ('127.0.0.1', 8881) |
|
190 | 192 | |
|
191 | 193 | with caplog.at_level(logging.DEBUG): |
|
192 | daemon = hooks_daemon.HttpHooksCallbackDaemon(host=None, port=None) | |
|
194 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon(host=None, port=None) | |
|
193 | 195 | assert daemon._daemon.server_address[1] in range(0, 66000) |
|
194 | 196 | assert daemon._daemon.server_address[0] != '127.0.0.1' |
|
195 | 197 | |
|
196 | 198 | def test_prepare_inits_daemon_variable(self, tcp_server, caplog): |
|
197 | 199 | with self._tcp_patcher(tcp_server), caplog.at_level(logging.DEBUG): |
|
198 | daemon = hooks_daemon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
200 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
199 | 201 | assert daemon._daemon == tcp_server |
|
200 | 202 | |
|
201 | 203 | _, port = tcp_server.server_address |
|
202 | 204 | |
|
203 | 205 | msg = f"HOOKS: 127.0.0.1:{port} Preparing HTTP callback daemon registering " \ |
|
204 | f"hook object: <class 'rhodecode.lib.hooks_daemon.HooksHttpHandler'>" | |
|
206 | f"hook object: <class 'rhodecode.lib.hook_daemon.http_hooks_deamon.HooksHttpHandler'>" | |
|
205 | 207 | assert_message_in_log( |
|
206 | caplog.records, msg, levelno=logging.DEBUG, module='hooks_daemon') | |
|
208 | caplog.records, msg, levelno=logging.DEBUG, module='http_hooks_deamon') | |
|
207 | 209 | |
|
208 | 210 | def test_prepare_inits_hooks_uri_and_logs_it( |
|
209 | 211 | self, tcp_server, caplog): |
|
210 | 212 | with self._tcp_patcher(tcp_server), caplog.at_level(logging.DEBUG): |
|
211 | daemon = hooks_daemon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
213 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon(host='127.0.0.1', port=8881) | |
|
212 | 214 | |
|
213 | 215 | _, port = tcp_server.server_address |
|
214 | 216 | expected_uri = '{}:{}'.format('127.0.0.1', port) |
|
215 | 217 | assert daemon.hooks_uri == expected_uri |
|
216 | 218 | |
|
217 | 219 | msg = f"HOOKS: 127.0.0.1:{port} Preparing HTTP callback daemon registering " \ |
|
218 | f"hook object: <class 'rhodecode.lib.hooks_daemon.HooksHttpHandler'>" | |
|
220 | f"hook object: <class 'rhodecode.lib.hook_daemon.http_hooks_deamon.HooksHttpHandler'>" | |
|
221 | ||
|
219 | 222 | assert_message_in_log( |
|
220 | 223 | caplog.records, msg, |
|
221 | levelno=logging.DEBUG, module='hooks_daemon') | |
|
224 | levelno=logging.DEBUG, module='http_hooks_deamon') | |
|
222 | 225 | |
|
223 | 226 | def test_run_creates_a_thread(self, tcp_server): |
|
224 | 227 | thread = mock.Mock() |
|
225 | 228 | |
|
226 | 229 | with self._tcp_patcher(tcp_server): |
|
227 | daemon = hooks_daemon.HttpHooksCallbackDaemon() | |
|
230 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon() | |
|
228 | 231 | |
|
229 | 232 | with self._thread_patcher(thread) as thread_mock: |
|
230 | 233 | daemon._run() |
@@ -238,7 +241,7 b' class TestHttpHooksCallbackDaemon(object' | |||
|
238 | 241 | def test_run_logs(self, tcp_server, caplog): |
|
239 | 242 | |
|
240 | 243 | with self._tcp_patcher(tcp_server): |
|
241 | daemon = hooks_daemon.HttpHooksCallbackDaemon() | |
|
244 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon() | |
|
242 | 245 | |
|
243 | 246 | with self._thread_patcher(mock.Mock()), caplog.at_level(logging.DEBUG): |
|
244 | 247 | daemon._run() |
@@ -246,13 +249,13 b' class TestHttpHooksCallbackDaemon(object' | |||
|
246 | 249 | assert_message_in_log( |
|
247 | 250 | caplog.records, |
|
248 | 251 | 'Running thread-based loop of callback daemon in background', |
|
249 | levelno=logging.DEBUG, module='hooks_daemon') | |
|
252 | levelno=logging.DEBUG, module='http_hooks_deamon') | |
|
250 | 253 | |
|
251 | 254 | def test_stop_cleans_up_the_connection(self, tcp_server, caplog): |
|
252 | 255 | thread = mock.Mock() |
|
253 | 256 | |
|
254 | 257 | with self._tcp_patcher(tcp_server): |
|
255 | daemon = hooks_daemon.HttpHooksCallbackDaemon() | |
|
258 | daemon = http_hooks_deamon.HttpHooksCallbackDaemon() | |
|
256 | 259 | |
|
257 | 260 | with self._thread_patcher(thread), caplog.at_level(logging.DEBUG): |
|
258 | 261 | with daemon: |
@@ -266,18 +269,19 b' class TestHttpHooksCallbackDaemon(object' | |||
|
266 | 269 | |
|
267 | 270 | assert_message_in_log( |
|
268 | 271 | caplog.records, 'Waiting for background thread to finish.', |
|
269 | levelno=logging.DEBUG, module='hooks_daemon') | |
|
272 | levelno=logging.DEBUG, module='http_hooks_deamon') | |
|
270 | 273 | |
|
271 | 274 | def _tcp_patcher(self, tcp_server): |
|
272 | 275 | return mock.patch.object( |
|
273 | hooks_daemon, 'TCPServer', return_value=tcp_server) | |
|
276 | http_hooks_deamon, 'TCPServer', return_value=tcp_server) | |
|
274 | 277 | |
|
275 | 278 | def _thread_patcher(self, thread): |
|
276 | 279 | return mock.patch.object( |
|
277 | hooks_daemon.threading, 'Thread', return_value=thread) | |
|
280 | http_hooks_deamon.threading, 'Thread', return_value=thread) | |
|
278 | 281 | |
|
279 | 282 | |
|
280 | 283 | class TestPrepareHooksDaemon(object): |
|
284 | ||
|
281 | 285 | @pytest.mark.parametrize('protocol', ('celery',)) |
|
282 | 286 | def test_returns_celery_hooks_callback_daemon_when_celery_protocol_specified( |
|
283 | 287 | self, protocol): |
@@ -286,12 +290,12 b' class TestPrepareHooksDaemon(object):' | |||
|
286 | 290 | "celery.result_backend = redis://redis/0") |
|
287 | 291 | temp_file.flush() |
|
288 | 292 | expected_extras = {'config': temp_file.name} |
|
289 | callback, extras = hooks_daemon.prepare_callback_daemon( | |
|
293 | callback, extras = hook_base.prepare_callback_daemon( | |
|
290 | 294 | expected_extras, protocol=protocol, host='') |
|
291 | assert isinstance(callback, hooks_daemon.CeleryHooksCallbackDaemon) | |
|
295 | assert isinstance(callback, celery_hooks_deamon.CeleryHooksCallbackDaemon) | |
|
292 | 296 | |
|
293 | 297 | @pytest.mark.parametrize('protocol, expected_class', ( |
|
294 | ('http', hooks_daemon.HttpHooksCallbackDaemon), | |
|
298 | ('http', http_hooks_deamon.HttpHooksCallbackDaemon), | |
|
295 | 299 | )) |
|
296 | 300 | def test_returns_real_hooks_callback_daemon_when_protocol_is_specified( |
|
297 | 301 | self, protocol, expected_class): |
@@ -300,9 +304,13 b' class TestPrepareHooksDaemon(object):' | |||
|
300 | 304 | 'txn_id': 'txnid2', |
|
301 | 305 | 'hooks_protocol': protocol.lower(), |
|
302 | 306 | 'task_backend': '', |
|
303 | 'task_queue': '' | |
|
307 | 'task_queue': '', | |
|
308 | 'repo_store': '/var/opt/rhodecode_repo_store', | |
|
309 | 'repository': 'rhodecode', | |
|
304 | 310 | } |
|
305 | callback, extras = hooks_daemon.prepare_callback_daemon( | |
|
311 | from rhodecode import CONFIG | |
|
312 | CONFIG['vcs.svn.redis_conn'] = 'redis://redis:6379/0' | |
|
313 | callback, extras = hook_base.prepare_callback_daemon( | |
|
306 | 314 | expected_extras.copy(), protocol=protocol, host='127.0.0.1', |
|
307 | 315 | txn_id='txnid2') |
|
308 | 316 | assert isinstance(callback, expected_class) |
@@ -321,7 +329,7 b' class TestPrepareHooksDaemon(object):' | |||
|
321 | 329 | 'hooks_protocol': protocol.lower() |
|
322 | 330 | } |
|
323 | 331 | with pytest.raises(Exception): |
|
324 | callback, extras = hooks_daemon.prepare_callback_daemon( | |
|
332 | callback, extras = hook_base.prepare_callback_daemon( | |
|
325 | 333 | expected_extras.copy(), |
|
326 | 334 | protocol=protocol, host='127.0.0.1') |
|
327 | 335 |
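Many assertions in the hunks above go through an `assert_message_in_log` helper against pytest's `caplog` records. A simplified, self-contained sketch of that pattern using a plain list-collecting `logging.Handler` (the real helper also matches on the emitting module name; all names below are illustrative):

```python
import logging


class ListHandler(logging.Handler):
    """Collect emitted LogRecords into a list for later assertions."""

    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record)


def assert_message_in_log(records, message, levelno):
    # simplified: match on message substring and level only
    matches = [r for r in records
               if message in r.getMessage() and r.levelno == levelno]
    assert matches, f"no record containing {message!r} at level {levelno}"


log = logging.getLogger("callback_daemon_demo")
log.setLevel(logging.DEBUG)
handler = ListHandler()
log.addHandler(handler)
log.debug("Running thread-based loop of callback daemon in background")
assert_message_in_log(handler.records, "thread-based loop", levelno=logging.DEBUG)
```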
@@ -589,17 +589,6 b' class TestUpdateGlobalSslSetting(object)' | |||
|
589 | 589 | model.global_settings, 'web', 'push_ssl', value='False') |
|
590 | 590 | |
|
591 | 591 | |
|
592 | class TestUpdateGlobalPathSetting(object): | |
|
593 | def test_updates_global_path_settings(self): | |
|
594 | model = VcsSettingsModel() | |
|
595 | with mock.patch.object(model, '_create_or_update_ui') as create_mock: | |
|
596 | model.update_global_path_setting('False') | |
|
597 | Session().commit() | |
|
598 | ||
|
599 | create_mock.assert_called_once_with( | |
|
600 | model.global_settings, 'paths', '/', value='False') | |
|
601 | ||
|
602 | ||
|
603 | 592 | class TestCreateOrUpdateGlobalHgSettings(object): |
|
604 | 593 | FORM_DATA = { |
|
605 | 594 | 'extensions_largefiles': False, |
@@ -1004,21 +993,6 b' class TestGetSvnPatterns(object):' | |||
|
1004 | 993 | settings_mock.assert_called_once_with(*args) |
|
1005 | 994 | |
|
1006 | 995 | |
|
1007 | class TestGetReposLocation(object): | |
|
1008 | def test_returns_repos_location(self, repo_stub): | |
|
1009 | model = VcsSettingsModel() | |
|
1010 | ||
|
1011 | result_mock = mock.Mock() | |
|
1012 | result_mock.ui_value = '/tmp' | |
|
1013 | ||
|
1014 | with mock.patch.object(model, 'global_settings') as settings_mock: | |
|
1015 | settings_mock.get_ui_by_key.return_value = result_mock | |
|
1016 | result = model.get_repos_location() | |
|
1017 | ||
|
1018 | settings_mock.get_ui_by_key.assert_called_once_with('/') | |
|
1019 | assert result == '/tmp' | |
|
1020 | ||
|
1021 | ||
|
1022 | 996 | class TestCreateOrUpdateRepoSettings(object): |
|
1023 | 997 | FORM_DATA = { |
|
1024 | 998 | 'inherit_global_settings': False, |
@@ -121,10 +121,10 b' class TestRepoModel(object):' | |||
|
121 | 121 | def test_create_filesystem_repo_installs_hooks(self, tmpdir, backend): |
|
122 | 122 | repo = backend.create_repo() |
|
123 | 123 | repo_name = repo.repo_name |
|
124 | model = RepoModel() | |
|
125 | repo_location = tempfile.mkdtemp() | |
|
126 | model.repos_path = repo_location | |
|
127 | repo = model._create_filesystem_repo( | |
|
124 | with mock.patch('rhodecode.model.repo.RepoModel.repos_path', | |
|
125 | new_callable=mock.PropertyMock) as mocked_models_property: | |
|
126 | mocked_models_property.return_value = tempfile.mkdtemp() | |
|
127 | repo = RepoModel()._create_filesystem_repo( | |
|
128 | 128 | repo_name, backend.alias, repo_group='', clone_uri=None) |
|
129 | 129 | |
|
130 | 130 | hooks = { |
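The rewritten test above stops assigning `model.repos_path` directly and instead patches it with `new_callable=mock.PropertyMock`, the standard way to stub an attribute that is a property on the class. A self-contained sketch of that pattern (the class and paths below are stand-ins, not the real `RepoModel`):

```python
from unittest import mock


class RepoModel:
    @property
    def repos_path(self):
        # in the real model this would be resolved from configuration
        return "/var/opt/rhodecode_repo_store"


with mock.patch.object(
        RepoModel, "repos_path", new_callable=mock.PropertyMock) as repos_path_mock:
    repos_path_mock.return_value = "/tmp/test-repos"
    # attribute access now goes through the PropertyMock
    assert RepoModel().repos_path == "/tmp/test-repos"

# leaving the context restores the original property
assert RepoModel().repos_path == "/var/opt/rhodecode_repo_store"
```

Patching on the class (not the instance) is required here because properties live in the class dictionary; assigning to the instance attribute, as the old test did, fails once `repos_path` becomes a read-only property.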
@@ -66,7 +66,6 b' prefix = /' | |||
|
66 | 66 | ;can be overridden by |
|
67 | 67 | ;export RC_CACHE_REPO_OBJECT_BACKEND=foo |
|
68 | 68 | |
|
69 | is_test = True | |
|
70 | 69 | use = egg:rhodecode-enterprise-ce |
|
71 | 70 | |
|
72 | 71 | ; enable proxy prefix middleware, defined above |
@@ -105,6 +104,12 b' startup.import_repos = true' | |||
|
105 | 104 | ; SSH calls. Set this for events to receive proper url for SSH calls. |
|
106 | 105 | app.base_url = http://rhodecode.local |
|
107 | 106 | |
|
107 | ; Host at which the Service API is running. | |
|
108 | app.service_api.host = http://rhodecode.local:10020 | |
|
109 | ||
|
110 | ; Secret for Service API authentication. | |
|
111 | app.service_api.token = | |
|
112 | ||
|
108 | 113 | ; Unique application ID. Should be a random unique string for security. |
|
109 | 114 | app_instance_uuid = rc-production |
|
110 | 115 | |
@@ -207,8 +212,8 b' auth_ret_code_detection = false' | |||
|
207 | 212 | ; codes don't break the transactions while 4XX codes do |
|
208 | 213 | lock_ret_code = 423 |
|
209 | 214 | |
|
210 | ; allows to change the repository location in settings page | |
|
211 | allow_repo_location_change = true | |
|
215 | ; Filesystem location where repositories should be stored | |
|
216 | repo_store.path = /var/opt/rhodecode_repo_store | |
|
212 | 217 | |
|
213 | 218 | ; allows to setup custom hooks in settings page |
|
214 | 219 | allow_custom_hooks_settings = true |
@@ -250,23 +255,72 b' file_store.enabled = true' | |||
|
250 | 255 | ; Storage backend, available options are: local |
|
251 | 256 | file_store.backend = local |
|
252 | 257 | |
|
253 | ; path to store the uploaded binaries | |
|
254 | file_store.storage_path = | |
|
258 | ; path to store the uploaded binaries and artifacts | |
|
259 | file_store.storage_path = /var/opt/rhodecode_data/file_store | |
|
260 | ||
|
261 | ||
|
262 | ; Redis url to acquire/check generation of archives locks | |
|
263 | archive_cache.locking.url = redis://redis:6379/1 | |
|
264 | ||
|
265 | ; Storage backend, only 'filesystem' and 'objectstore' are available now | |
|
266 | archive_cache.backend.type = filesystem | |
|
267 | ||
|
268 | ; url for s3 compatible storage that allows to upload artifacts | |
|
269 | ; e.g http://minio:9000 | |
|
270 | archive_cache.objectstore.url = http://s3-minio:9000 | |
|
271 | ||
|
272 | ; key for s3 auth | |
|
273 | archive_cache.objectstore.key = key | |
|
274 | ||
|
275 | ; secret for s3 auth | |
|
276 | archive_cache.objectstore.secret = secret | |
|
255 | 277 | |
|
256 | ; Uncomment and set this path to control settings for archive download cache. | |
|
278 | ;region for s3 storage | |
|
279 | archive_cache.objectstore.region = eu-central-1 | |
|
280 | ||
|
281 | ; number of sharded buckets to create to distribute archives across | |
|
282 | ; default is 8 shards | |
|
283 | archive_cache.objectstore.bucket_shards = 8 | |
|
284 | ||
|
285 | ; a top-level bucket to put all other shards in | |
|
286 | ; objects will be stored in rhodecode-archive-cache/shard-N based on the bucket_shards number | |
|
287 | archive_cache.objectstore.bucket = rhodecode-archive-cache | |
|
288 | ||
|
289 | ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time | |
|
290 | archive_cache.objectstore.retry = false | |
|
291 | ||
|
292 | ; number of seconds to wait for next try using retry | |
|
293 | archive_cache.objectstore.retry_backoff = 1 | |
|
294 | ||
|
295 | ; how many tries to do a retry fetch from this backend | |
|
296 | archive_cache.objectstore.retry_attempts = 10 | |
|
297 | ||
|
298 | ; Default is $cache_dir/archive_cache if not set | |
|
257 | 299 | ; Generated repo archives will be cached at this location |
|
258 | 300 | ; and served from the cache during subsequent requests for the same archive of |
|
259 | 301 | ; the repository. This path is important to be shared across filesystems and with |
|
260 | 302 | ; RhodeCode and vcsserver |
|
261 | ||
|
262 | ; Default is $cache_dir/archive_cache if not set | |
|
263 | archive_cache.store_dir = /tmp/rc-test-data/archive_cache | |
|
303 | archive_cache.filesystem.store_dir = %(here)s/rc-tests/archive_cache | |
|
264 | 304 | |
|
265 | 305 | ; The limit in GB sets how much data we cache before recycling last used, defaults to 10 gb |
|
266 | archive_cache.cache_size_gb = | |
|
306 | archive_cache.filesystem.cache_size_gb = 2 | |
|
307 | ||
|
308 | ; Eviction policy used to clear out after cache_size_gb limit is reached | |
|
309 | archive_cache.filesystem.eviction_policy = least-recently-stored | |
|
267 | 310 | |
|
268 | 311 | ; By default cache uses sharding technique, this specifies how many shards are there |
|
269 | archive_cache.cache_shards = 10 | |
|
312 | ; default is 8 shards | |
|
313 | archive_cache.filesystem.cache_shards = 8 | |
|
314 | ||
|
315 | ; if true, this cache will try to retry with retry_attempts=N times waiting retry_backoff time | |
|
316 | archive_cache.filesystem.retry = false | |
|
317 | ||
|
318 | ; number of seconds to wait for next try using retry | |
|
319 | archive_cache.filesystem.retry_backoff = 1 | |
|
320 | ||
|
321 | ; how many tries do do a retry fetch from this backend | |
|
322 | archive_cache.filesystem.retry_attempts = 10 | |
|
323 | ||
|
270 | 324 | |
|
271 | 325 | ; ############# |
|
272 | 326 | ; CELERY CONFIG |
@@ -280,7 +334,10 b' use_celery = false' | |||
|
280 | 334 | #celerybeat-schedule.path = |
|
281 | 335 | |
|
282 | 336 | ; connection url to the message broker (default redis) |
|
283 |
celery.broker_url = redis:// |
|
|
337 | celery.broker_url = redis://redis:6379/8 | |
|
338 | ||
|
339 | ; results backend to get results for (default redis) | |
|
340 | celery.result_backend = redis://redis:6379/8 | |
|
284 | 341 | |
|
285 | 342 | ; rabbitmq example |
|
286 | 343 | #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost |
@@ -289,7 +346,8 b' celery.broker_url = redis://localhost:63' | |||
|
289 | 346 | celery.max_tasks_per_child = 20 |
|
290 | 347 | |
|
291 | 348 | ; tasks will never be sent to the queue, but executed locally instead. |
|
292 | celery.task_always_eager = | |
|
349 | celery.task_always_eager = true | |
|
350 | celery.task_store_eager_result = true | |
|
293 | 351 | |
|
294 | 352 | ; ############# |
|
295 | 353 | ; DOGPILE CACHE |
@@ -326,7 +384,7 b' rc_cache.cache_repo_longterm.max_size = ' | |||
|
326 | 384 | rc_cache.cache_general.backend = dogpile.cache.rc.file_namespace |
|
327 | 385 | rc_cache.cache_general.expiration_time = 43200 |
|
328 | 386 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set |
|
329 | rc_cache.cache_general.arguments.filename = %(here)s/cache-backend/cache_general_db | |
|
387 | rc_cache.cache_general.arguments.filename = %(here)s/rc-tests/cache-backend/cache_general_db | |
|
330 | 388 | |
|
331 | 389 | ; alternative `cache_general` redis backend with distributed lock |
|
332 | 390 | #rc_cache.cache_general.backend = dogpile.cache.rc.redis |
@@ -353,7 +411,7 b' rc_cache.cache_general.arguments.filenam' | |||
|
353 | 411 | rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace |
|
354 | 412 | rc_cache.cache_perms.expiration_time = 0 |
|
355 | 413 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set |
|
356 | rc_cache.cache_perms.arguments.filename = %(here)s/cache-backend/cache_perms_db | |
|
414 | rc_cache.cache_perms.arguments.filename = %(here)s/rc-tests/cache-backend/cache_perms_db | |
|
357 | 415 | |
|
358 | 416 | ; alternative `cache_perms` redis backend with distributed lock |
|
359 | 417 | #rc_cache.cache_perms.backend = dogpile.cache.rc.redis |
@@ -380,7 +438,7 b' rc_cache.cache_perms.arguments.filename ' | |||
|
380 | 438 | rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace |
|
381 | 439 | rc_cache.cache_repo.expiration_time = 2592000 |
|
382 | 440 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set |
|
383 | rc_cache.cache_repo.arguments.filename = %(here)s/cache-backend/cache_repo_db | |
|
441 | rc_cache.cache_repo.arguments.filename = %(here)s/rc-tests/cache-backend/cache_repo_db | |
|
384 | 442 | |
|
385 | 443 | ; alternative `cache_repo` redis backend with distributed lock |
|
386 | 444 | #rc_cache.cache_repo.backend = dogpile.cache.rc.redis |
@@ -404,14 +462,14 b' rc_cache.cache_repo.arguments.filename =' | |||
|
404 | 462 | ; ############## |
|
405 | 463 | |
|
406 | 464 | ; beaker.session.type is type of storage options for the logged users sessions. Current allowed |
|
407 | ; types are file, ext:redis, ext:database, ext:memcached | |
|
408 | ; Fastest ones are | |
|
465 | ; types are file, ext:redis, ext:database, ext:memcached | |
|
466 | ; Fastest ones are ext:redis and ext:database, DO NOT use memory type for session | |
|
409 | 467 | beaker.session.type = file |
|
410 | 468 | beaker.session.data_dir = %(here)s/rc-tests/data/sessions |
|
411 | 469 | |
|
412 | 470 | ; Redis based sessions |
|
413 | 471 | #beaker.session.type = ext:redis |
|
414 | #beaker.session.url = redis:// | |
|
472 | #beaker.session.url = redis://redis:6379/2 | |
|
415 | 473 | |
|
416 | 474 | ; DB based session, fast, and allows easy management over logged in users |
|
417 | 475 | #beaker.session.type = ext:database |
@@ -423,7 +481,7 b' beaker.session.data_dir = %(here)s/rc-te' | |||
|
423 | 481 | |
|
424 | 482 | beaker.session.key = rhodecode |
|
425 | 483 | beaker.session.secret = test-rc-uytcxaz |
|
426 | beaker.session.lock_dir = %(here)s/data/sessions/lock | |
|
484 | beaker.session.lock_dir = %(here)s/rc-tests/data/sessions/lock | |
|
427 | 485 | |
|
428 | 486 | ; Secure encrypted cookie. Requires AES and AES python libraries |
|
429 | 487 | ; you must disable beaker.session.secret to use this |
@@ -441,9 +499,6 b' beaker.session.httponly = true' | |||
|
441 | 499 | ; Set https secure cookie |
|
442 | 500 | beaker.session.secure = false |
|
443 | 501 | |
|
444 | ## auto save the session to not to use .save() | |
|
445 | beaker.session.auto = false | |
|
446 | ||
|
447 | 502 | ; default cookie expiration time in seconds, set to `true` to set expire |
|
448 | 503 | ; at browser close |
|
449 | 504 | #beaker.session.cookie_expires = 3600 |
@@ -458,7 +513,7 b' beaker.session.auto = false' | |||
|
458 | 513 | ; WHOOSH Backend, doesn't require additional services to run |
|
459 | 514 | ; it works good with few dozen repos |
|
460 | 515 | search.module = rhodecode.lib.index.whoosh |
|
461 | search.location = %(here)s/data/index | |
|
516 | search.location = %(here)s/rc-tests/data/index | |
|
462 | 517 | |
|
463 | 518 | ; #################### |
|
464 | 519 | ; CHANNELSTREAM CONFIG |
@@ -470,15 +525,15 b' search.location = %(here)s/data/index' | |||
|
470 | 525 | channelstream.enabled = false |
|
471 | 526 | |
|
472 | 527 | ; server address for channelstream server on the backend |
|
473 | channelstream.server = | |
|
528 | channelstream.server = channelstream:9800 | |
|
474 | 529 | |
|
475 | 530 | ; location of the channelstream server from outside world |
|
476 | 531 | ; use ws:// for http or wss:// for https. This address needs to be handled |
|
477 | 532 | ; by external HTTP server such as Nginx or Apache |
|
478 | 533 | ; see Nginx/Apache configuration examples in our docs |
|
479 | 534 | channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream |
|
480 | channelstream.secret = | |
|
481 | channelstream.history.location = %(here)s/channelstream_history | |
|
535 | channelstream.secret = ENV_GENERATED | |
|
536 | channelstream.history.location = %(here)s/rc-tests/channelstream_history | |
|
482 | 537 | |
|
483 | 538 | ; Internal application path that Javascript uses to connect into. |
|
484 | 539 | ; If you use proxy-prefix the prefix should be added before /_channelstream |
@@ -495,7 +550,7 b' channelstream.proxy_path = /_channelstre' | |||
|
495 | 550 | ; pymysql is an alternative driver for MySQL, use in case of problems with default one |
|
496 | 551 | #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode |
|
497 | 552 | |
|
498 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode_test.db?timeout=30 | |
|
553 | sqlalchemy.db1.url = sqlite:///%(here)s/rc-tests/rhodecode_test.db?timeout=30 | |
|
499 | 554 | |
|
500 | 555 | ; see sqlalchemy docs for other advanced settings |
|
501 | 556 | ; print the sql statements to output |
@@ -537,6 +592,7 b' vcs.scm_app_implementation = http' | |||
|
537 | 592 | |
|
538 | 593 | ; Push/Pull operations hooks protocol, available options are: |
|
539 | 594 | ; `http` - use http-rpc backend (default) |
|
595 | ; `celery` - use celery based hooks | |
|
540 | 596 | vcs.hooks.protocol = http |
|
541 | 597 | |
|
542 | 598 | ; Host on which this instance is listening for hooks. vcsserver will call this host to pull/push hooks so it should be |
@@ -556,11 +612,6 b' vcs.backends = hg, git, svn' | |||
|
556 | 612 | ; Wait this number of seconds before killing connection to the vcsserver |
|
557 | 613 | vcs.connection_timeout = 3600 |
|
558 | 614 | |
|
559 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
560 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
561 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
562 | #vcs.svn.compatible_version = 1.8 | |
|
563 | ||
|
564 | 615 | ; Cache flag to cache vcsserver remote calls locally |
|
565 | 616 | ; It uses cache_region `cache_repo` |
|
566 | 617 | vcs.methods.cache = false |
@@ -570,6 +621,17 b' vcs.methods.cache = false' | |||
|
570 | 621 | ; Maps RhodeCode repo groups into SVN paths for Apache |
|
571 | 622 | ; #################################################### |
|
572 | 623 | |
|
624 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
625 | ; Set a numeric version for your current SVN e.g 1.8, or 1.12 | |
|
626 | ; Legacy available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
627 | #vcs.svn.compatible_version = 1.8 | |
|
628 | ||
|
629 | ; Enable SVN proxy of requests over HTTP | |
|
630 | vcs.svn.proxy.enabled = true | |
|
631 | ||
|
632 | ; host to connect to running SVN subsystem | |
|
633 | vcs.svn.proxy.host = http://svn:8090 | |
|
634 | ||
|
573 | 635 | ; Enable or disable the config file generation. |
|
574 | 636 | svn.proxy.generate_config = false |
|
575 | 637 | |
@@ -577,7 +639,7 b' svn.proxy.generate_config = false' | |||
|
577 | 639 | svn.proxy.list_parent_path = true |
|
578 | 640 | |
|
579 | 641 | ; Set location and file name of generated config file. |
|
580 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
642 | svn.proxy.config_file_path = %(here)s/rc-tests/mod_dav_svn.conf | |
|
581 | 643 | |
|
582 | 644 | ; alternative mod_dav config template. This needs to be a valid mako template |
|
583 | 645 | ; Example template can be found in the source code: |
@@ -613,25 +675,26 b' ssh.generate_authorized_keyfile = true' | |||
|
613 | 675 | ; Path to the authorized_keys file where the generate entries are placed. |
|
614 | 676 | ; It is possible to have multiple key files specified in `sshd_config` e.g. |
|
615 | 677 | ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode |
|
616 | ssh.authorized_keys_file_path = %(here)s/rc/authorized_keys_rhodecode | |
|
678 | ssh.authorized_keys_file_path = %(here)s/rc-tests/authorized_keys_rhodecode | |
|
617 | 679 | |
|
618 | 680 | ; Command to execute the SSH wrapper. The binary is available in the |
|
619 | 681 | ; RhodeCode installation directory. |
|
620 | ; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
621 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
682 | ; legacy: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
683 | ; new rewrite: /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper-v2 | |
|
684 | ssh.wrapper_cmd = /usr/local/bin/rhodecode_bin/bin/rc-ssh-wrapper | |
|
622 | 685 | |
|
623 | 686 | ; Allow shell when executing the ssh-wrapper command |
|
624 | 687 | ssh.wrapper_cmd_allow_shell = false |
|
625 | 688 | |
|
626 | 689 | ; Enables logging, and detailed output send back to the client during SSH |
|
627 | 690 | ; operations. Useful for debugging, shouldn't be used in production. |
|
628 | ssh.enable_debug_logging = | |
|
691 | ssh.enable_debug_logging = true | |
|
629 | 692 | |
|
630 | 693 | ; Paths to binary executable, by default they are the names, but we can |
|
631 | 694 | ; override them if we want to use a custom one |
|
632 | ssh.executable.hg = | |
|
633 | ssh.executable.git = | |
|
634 | ssh.executable.svn = | |
|
695 | ssh.executable.hg = /usr/local/bin/rhodecode_bin/vcs_bin/hg | |
|
696 | ssh.executable.git = /usr/local/bin/rhodecode_bin/vcs_bin/git | |
|
697 | ssh.executable.svn = /usr/local/bin/rhodecode_bin/vcs_bin/svnserve | |
|
635 | 698 | |
|
636 | 699 | ; Enables SSH key generator web interface. Disabling this still allows users |
|
637 | 700 | ; to add their own keys. |
@@ -106,6 +106,8 b' def get_url_defs():' | |||
|
106 | 106 | + "/gists/{gist_id}/rev/{revision}/{format}/{f_path}", |
|
107 | 107 | "login": ADMIN_PREFIX + "/login", |
|
108 | 108 | "logout": ADMIN_PREFIX + "/logout", |
|
109 | "setup_2fa": ADMIN_PREFIX + "/setup_2fa", | |
|
110 | "check_2fa": ADMIN_PREFIX + "/check_2fa", | |
|
109 | 111 | "register": ADMIN_PREFIX + "/register", |
|
110 | 112 | "reset_password": ADMIN_PREFIX + "/password_reset", |
|
111 | 113 | "reset_password_confirmation": ADMIN_PREFIX + "/password_reset_confirmation", |
@@ -250,6 +252,7 b' def get_url_defs():' | |||
|
250 | 252 | "pullrequest_show_all_data": "/{repo_name}/pull-request-data", |
|
251 | 253 | "bookmarks_home": "/{repo_name}/bookmarks", |
|
252 | 254 | "branches_home": "/{repo_name}/branches", |
|
255 | "branch_remove": "/{repo_name}/branches/{branch_name}/remove", | |
|
253 | 256 | "tags_home": "/{repo_name}/tags", |
|
254 | 257 | "repo_changelog": "/{repo_name}/changelog", |
|
255 | 258 | "repo_commits": "/{repo_name}/commits", |
@@ -143,13 +143,14 b' class RcVCSServer(ServerBase):' | |||
|
143 | 143 | log_file_name = 'rc-vcsserver.log' |
|
144 | 144 | status_url_tmpl = 'http://{host}:{port}/status' |
|
145 | 145 | |
|
146 | def __init__(self, config_file, log_file=None, workers=' | |
|
146 | def __init__(self, config_file, log_file=None, workers='3'): | |
|
147 | 147 | super(RcVCSServer, self).__init__(config_file, log_file) |
|
148 | 148 | self._args = [ |
|
149 | 149 | 'gunicorn', |
|
150 | 150 | '--bind', self.bind_addr, |
|
151 | '--worker-class', ' | |
|
152 | '-- | |
|
151 | '--worker-class', 'sync', | |
|
152 | '--threads', '1', | |
|
153 | '--backlog', '8', | |
|
153 | 154 | '--timeout', '300', |
|
154 | 155 | '--workers', workers, |
|
155 | 156 | '--paste', self.config_file] |
@@ -180,13 +181,14 b' class RcWebServer(ServerBase):' | |||
|
180 | 181 | log_file_name = 'rc-web.log' |
|
181 | 182 | status_url_tmpl = 'http://{host}:{port}/_admin/ops/ping' |
|
182 | 183 | |
|
183 | def __init__(self, config_file, log_file=None, workers=' | |
|
184 | def __init__(self, config_file, log_file=None, workers='2'): | |
|
184 | 185 | super(RcWebServer, self).__init__(config_file, log_file) |
|
185 | 186 | self._args = [ |
|
186 | 187 | 'gunicorn', |
|
187 | 188 | '--bind', self.bind_addr, |
|
188 | '--worker-class', 'g | |
|
189 | '-- | |
|
189 | '--worker-class', 'gthread', | |
|
190 | '--threads', '4', | |
|
191 | '--backlog', '8', | |
|
190 | 192 | '--timeout', '300', |
|
191 | 193 | '--workers', workers, |
|
192 | 194 | '--paste', self.config_file] |
@@ -219,3 +221,11 b' class RcWebServer(ServerBase):' | |||
|
219 | 221 | params.update(**kwargs) |
|
220 | 222 | _url = f"http://{params['user']}:{params['passwd']}@{params['host']}/{params['cloned_repo']}" |
|
221 | 223 | return _url |
|
224 | ||
|
225 | def repo_clone_credentials(self, **kwargs): | |
|
226 | params = { | |
|
227 | 'user': TEST_USER_ADMIN_LOGIN, | |
|
228 | 'passwd': TEST_USER_ADMIN_PASS, | |
|
229 | } | |
|
230 | params.update(**kwargs) | |
|
231 | return params['user'], params['passwd'] |
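The `repo_clone_credentials` helper added in this hunk merges the default admin test credentials with per-test keyword overrides. A minimal standalone sketch of that merge logic (the constants below are placeholders standing in for the real test fixtures, not RhodeCode's actual values):

```python
TEST_USER_ADMIN_LOGIN = 'admin'   # placeholder for the real test constant
TEST_USER_ADMIN_PASS = 'secret'   # placeholder for the real test constant

def repo_clone_credentials(**kwargs):
    # Defaults come from the admin test user; any keyword argument
    # (e.g. user='x' or passwd='y') overrides the corresponding default.
    params = {
        'user': TEST_USER_ADMIN_LOGIN,
        'passwd': TEST_USER_ADMIN_PASS,
    }
    params.update(**kwargs)
    return params['user'], params['passwd']

print(repo_clone_credentials(passwd='override'))  # ('admin', 'override')
```

Returning a `(user, passwd)` tuple lets test code unpack credentials directly when building clone URLs.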
@@ -28,7 +28,7 b' import mock' | |||
|
28 | 28 | import pytest |
|
29 | 29 | |
|
30 | 30 | import rhodecode |
|
31 | from rhodecode.lib. | |
|
31 | from rhodecode.lib.archive_cache import get_archival_config | |
|
32 | 32 | from rhodecode.lib.str_utils import ascii_bytes |
|
33 | 33 | from rhodecode.lib.vcs.backends import base |
|
34 | 34 | from rhodecode.lib.vcs.exceptions import ImproperArchiveTypeError, VCSError |
@@ -969,7 +969,7 b' class TestGitCommit(object):' | |||
|
969 | 969 | branches = self.repo.branches |
|
970 | 970 | |
|
971 | 971 | assert 'unicode' in branches |
|
972 | assert | |
|
972 | assert 'uniçö∂e' in branches | |
|
973 | 973 | |
|
974 | 974 | def test_unicode_tag_refs(self): |
|
975 | 975 | unicode_tags = { |
@@ -983,7 +983,7 b' class TestGitCommit(object):' | |||
|
983 | 983 | tags = self.repo.tags |
|
984 | 984 | |
|
985 | 985 | assert 'unicode' in tags |
|
986 | assert | |
|
986 | assert 'uniçö∂e' in tags | |
|
987 | 987 | |
|
988 | 988 | def test_commit_message_is_unicode(self): |
|
989 | 989 | for commit in self.repo: |
@@ -145,14 +145,14 b' def test_unicode_refs(vcsbackend, filena' | |||
|
145 | 145 | with mock.patch(("rhodecode.lib.vcs.backends.svn.repository" |
|
146 | 146 | ".SubversionRepository._patterns_from_section"), |
|
147 | 147 | return_value=['branches/*']): |
|
148 | assert | |
|
148 | assert f'branches/{branch}' in repo.branches | |
|
149 | 149 | |
|
150 | 150 | |
|
151 | 151 | def test_compatible_version(monkeypatch, vcsbackend): |
|
152 | 152 | monkeypatch.setattr(settings, 'SVN_COMPATIBLE_VERSION', 'pre-1.8-compatible') |
|
153 | 153 | path = vcsbackend.new_repo_path() |
|
154 | 154 | SubversionRepository(path, create=True) |
|
155 | with open('{}/db/format' | |
|
155 | with open(f'{path}/db/format') as f: | |
|
156 | 156 | first_line = f.readline().strip() |
|
157 | 157 | assert first_line == '4' |
|
158 | 158 |
@@ -26,20 +26,21 b' Base for test suite for making push/pull' | |||
|
26 | 26 | to redirect things to stderr instead of stdout. |
|
27 | 27 | """ |
|
28 | 28 | |
|
29 | from os.path import join as jn | |
|
30 | from subprocess import Popen, PIPE | |
|
29 | ||
|
31 | 30 | import logging |
|
32 | 31 | import os |
|
33 | 32 | import tempfile |
|
33 | import subprocess | |
|
34 | 34 | |
|
35 | 35 | from rhodecode.lib.str_utils import safe_str |
|
36 | from rhodecode.tests import GIT_REPO, HG_REPO | |
|
36 | from rhodecode.tests import GIT_REPO, HG_REPO, SVN_REPO | |
|
37 | 37 | |
|
38 | 38 | DEBUG = True |
|
39 | 39 | RC_LOG = os.path.join(tempfile.gettempdir(), 'rc.log') |
|
40 | 40 | REPO_GROUP = 'a_repo_group' |
|
41 |
HG_REPO_WITH_GROUP = ' |
|
|
42 |
GIT_REPO_WITH_GROUP = ' |
|
|
41 | HG_REPO_WITH_GROUP = f'{REPO_GROUP}/{HG_REPO}' | |
|
42 | GIT_REPO_WITH_GROUP = f'{REPO_GROUP}/{GIT_REPO}' | |
|
43 | SVN_REPO_WITH_GROUP = f'{REPO_GROUP}/{SVN_REPO}' | |
|
43 | 44 | |
|
44 | 45 | log = logging.getLogger(__name__) |
|
45 | 46 | |
@@ -65,7 +66,8 b' class Command(object):' | |||
|
65 | 66 | if key.startswith('COV_CORE_'): |
|
66 | 67 | del env[key] |
|
67 | 68 | |
|
68 | self.process = Popen(command, shell=True, stdout=PIPE, stderr=PIPE, | |
|
69 | self.process = subprocess.Popen( | |
|
70 | command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, | |
|
69 | 71 | |
|
70 | 72 | stdout, stderr = self.process.communicate() |
|
71 | 73 | |
@@ -85,12 +87,14 b' def _add_files(vcs, dest, clone_url=None' | |||
|
85 | 87 | full_name = 'Marcin Kuźminski' |
|
86 | 88 | email = 'me@email.com' |
|
87 | 89 | git_ident = f"git config user.name {full_name} && git config user.email {email}" |
|
88 | cwd = path = jn(dest) | |
|
90 | cwd = path = os.path.join(dest) | |
|
89 | 91 | |
|
90 | 92 | tags = tags or [] |
|
91 |
|
|
|
92 | Command(cwd).execute('touch %s' % added_file) | |
|
93 | Command(cwd).execute('%s add %s' % (vcs, added_file)) | |
|
93 | name_sequence = next(tempfile._RandomNameSequence()) | |
|
94 | added_file = os.path.join(path, f'{name_sequence}_setup.py') | |
|
95 | ||
|
96 | Command(cwd).execute(f'touch {added_file}') | |
|
97 | Command(cwd).execute(f'{vcs} add {added_file}') | |
|
94 | 98 | author_str = 'Marcin Kuźminski <me@email.com>' |
|
95 | 99 | |
|
96 | 100 | for i in range(kwargs.get('files_no', 3)): |
@@ -128,7 +132,7 b' def _add_files_and_push(vcs, dest, clone' | |||
|
128 | 132 | vcs is git or hg and defines what VCS we want to make those files for |
|
129 | 133 | """ |
|
130 | 134 | git_ident = "git config user.name Marcin Kuźminski && git config user.email me@email.com" |
|
131 | cwd = jn(dest) | |
|
135 | cwd = os.path.join(dest) | |
|
132 | 136 | |
|
133 | 137 | # commit some stuff into this repo |
|
134 | 138 | _add_files(vcs, dest, clone_url, tags, target_branch, new_branch, **kwargs) |
@@ -147,12 +151,21 b' def _add_files_and_push(vcs, dest, clone' | |||
|
147 | 151 | if new_branch: |
|
148 | 152 | maybe_new_branch = '--new-branch' |
|
149 | 153 | stdout, stderr = Command(cwd).execute( |
|
150 | 'hg push --traceback --verbose {} -r {} {}' | |
|
154 | f'hg push --traceback --verbose {maybe_new_branch} -r {target_branch} {clone_url}' | |
|
151 | 155 | ) |
|
152 | 156 | elif vcs == 'git': |
|
153 | 157 | stdout, stderr = Command(cwd).execute( |
|
154 | """{} && | |
|
155 | git push --verbose --tags {} {}""".format(git_ident, clone_url, target_branch) | |
|
158 | f'{git_ident} && git push --verbose --tags {clone_url} {target_branch}' | |
|
159 | ) | |
|
160 | elif vcs == 'svn': | |
|
161 | username = kwargs.pop('username', '') | |
|
162 | password = kwargs.pop('password', '') | |
|
163 | auth = '' | |
|
164 | if username and password: | |
|
165 | auth = f'--username {username} --password {password}' | |
|
166 | ||
|
167 | stdout, stderr = Command(cwd).execute( | |
|
168 | f'svn commit --no-auth-cache --non-interactive {auth} -m "pushing to {target_branch}"' | |
|
156 | 169 | ) |
|
157 | 170 | |
|
158 | 171 | return stdout, stderr |
@@ -179,6 +192,13 b' def _check_proper_hg_push(stdout, stderr' | |||
|
179 | 192 | assert 'abort:' not in stderr |
|
180 | 193 | |
|
181 | 194 | |
|
195 | def _check_proper_svn_push(stdout, stderr): | |
|
196 | assert 'pushing to' in stdout | |
|
197 | assert 'searching for changes' in stdout | |
|
198 | ||
|
199 | assert 'abort:' not in stderr | |
|
200 | ||
|
201 | ||
|
182 | 202 | def _check_proper_clone(stdout, stderr, vcs): |
|
183 | 203 | if vcs == 'hg': |
|
184 | 204 | assert 'requesting all changes' in stdout |
@@ -193,3 +213,8 b' def _check_proper_clone(stdout, stderr, ' | |||
|
193 | 213 | assert 'Cloning into' in stderr |
|
194 | 214 | assert 'abort:' not in stderr |
|
195 | 215 | assert 'fatal:' not in stderr |
|
216 | ||
|
217 | if vcs == 'svn': | |
|
218 | assert 'dupa' in stdout | |
|
219 | ||
|
220 | ||
@@ -42,7 +42,7 b' from rhodecode.model.db import Repositor' | |||
|
42 | 42 | from rhodecode.model.meta import Session |
|
43 | 43 | from rhodecode.integrations.types.webhook import WebhookIntegrationType |
|
44 | 44 | |
|
45 | from rhodecode.tests import GIT_REPO, HG_REPO | |
|
45 | from rhodecode.tests import GIT_REPO, HG_REPO, SVN_REPO | |
|
46 | 46 | from rhodecode.tests.conftest import HTTPBIN_DOMAIN, HTTPBIN_POST |
|
47 | 47 | from rhodecode.tests.fixture import Fixture |
|
48 | 48 | from rhodecode.tests.server_utils import RcWebServer |
@@ -51,13 +51,15 b' from rhodecode.tests.server_utils import' | |||
|
51 | 51 | REPO_GROUP = 'a_repo_group' |
|
52 | 52 | HG_REPO_WITH_GROUP = f'{REPO_GROUP}/{HG_REPO}' |
|
53 | 53 | GIT_REPO_WITH_GROUP = f'{REPO_GROUP}/{GIT_REPO}' |
|
54 | SVN_REPO_WITH_GROUP = f'{REPO_GROUP}/{SVN_REPO}' | |
|
54 | 55 | |
|
55 | 56 | log = logging.getLogger(__name__) |
|
56 | 57 | |
|
57 | 58 | |
|
58 | 59 | def check_httpbin_connection(): |
|
60 | log.debug('Checking if HTTPBIN_DOMAIN: %s is available', HTTPBIN_DOMAIN) | |
|
59 | 61 | try: |
|
60 | response = requests.get(HTTPBIN_DOMAIN) | |
|
62 | response = requests.get(HTTPBIN_DOMAIN, timeout=5) | |
|
61 | 63 | return response.status_code == 200 |
|
62 | 64 | except Exception as e: |
|
63 | 65 | print(e) |
@@ -102,11 +104,15 b' def repos(request, db_connection):' | |||
|
102 | 104 | fixture.create_fork(GIT_REPO, GIT_REPO, |
|
103 | 105 | repo_name_full=GIT_REPO_WITH_GROUP, |
|
104 | 106 | repo_group=repo_group_id) |
|
107 | fixture.create_fork(SVN_REPO, SVN_REPO, | |
|
108 | repo_name_full=SVN_REPO_WITH_GROUP, | |
|
109 | repo_group=repo_group_id) | |
|
105 | 110 | |
|
106 | 111 | @request.addfinalizer |
|
107 | 112 | def cleanup(): |
|
108 | 113 | fixture.destroy_repo(HG_REPO_WITH_GROUP) |
|
109 | 114 | fixture.destroy_repo(GIT_REPO_WITH_GROUP) |
|
115 | fixture.destroy_repo(SVN_REPO_WITH_GROUP) | |
|
110 | 116 | fixture.destroy_repo_group(repo_group_id) |
|
111 | 117 | |
|
112 | 118 | |
@@ -139,11 +145,11 b' def rc_web_server(' | |||
|
139 | 145 | """ |
|
140 | 146 | Run the web server as a subprocess. with its own instance of vcsserver |
|
141 | 147 | """ |
|
142 | rcweb_port = available_port_factory() | |
|
143 | log.info('Using rcweb ops test port | |
|
148 | rcweb_port: int = available_port_factory() | |
|
149 | log.info('Using rcweb ops test port %s', rcweb_port) | |
|
144 | 150 | |
|
145 | vcsserver_port = available_port_factory() | |
|
146 | log.info('Using vcsserver ops test port | |
|
151 | vcsserver_port: int = available_port_factory() | |
|
152 | log.info('Using vcsserver ops test port %s', vcsserver_port) | |
|
147 | 153 | |
|
148 | 154 | vcs_log = os.path.join(tempfile.gettempdir(), 'rc_op_vcs.log') |
|
149 | 155 | vcsserver_factory( |
@@ -303,5 +309,3 b' def branch_permission_setter(request):' | |||
|
303 | 309 | Session().commit() |
|
304 | 310 | |
|
305 | 311 | return _branch_permissions_setter |
|
306 | ||
|
307 | ||
@@ -32,7 +32,7 b' from rhodecode.lib.vcs.backends.git.repo' | |||
|
32 | 32 | from rhodecode.lib.vcs.nodes import FileNode |
|
33 | 33 | from rhodecode.tests import GIT_REPO |
|
34 | 34 | from rhodecode.tests.vcs_operations import Command |
|
35 | from .test_vcs_operations import _check_proper_clone, _check_proper_git_push | |
|
35 | from .test_vcs_operations_git import _check_proper_clone, _check_proper_git_push | |
|
36 | 36 | |
|
37 | 37 | |
|
38 | 38 | def test_git_clone_with_small_push_buffer(backend_git, rc_web_server, tmpdir): |
@@ -28,47 +28,23 b' Test suite for making push/pull operatio' | |||
|
28 | 28 | |
|
29 | 29 | |
|
30 | 30 | import time |
|
31 | import logging | |
|
32 | ||
|
33 | 31 | import pytest |
|
34 | 32 | |
|
35 | from rhodecode. | |
|
36 | from rhodecode.model.auth_token import AuthTokenModel | |
|
37 | from rhodecode.model.db import Repository, UserIpMap, CacheKey | |
|
33 | from rhodecode.model.db import Repository, UserIpMap | |
|
38 | 34 | from rhodecode.model.meta import Session |
|
39 | 35 | from rhodecode.model.repo import RepoModel |
|
40 | 36 | from rhodecode.model.user import UserModel |
|
41 | from rhodecode.tests import (GIT_REPO, | |
|
42 | from rhodecode.tests.utils import assert_message_in_log | |
|
37 | from rhodecode.tests import (GIT_REPO, TEST_USER_ADMIN_LOGIN) | |
|
38 | ||
|
43 | 39 | |
|
44 | 40 | from rhodecode.tests.vcs_operations import ( |
|
45 | 41 | Command, _check_proper_clone, _check_proper_git_push, |
|
46 | _add_files_and_push, | |
|
42 | _add_files_and_push, GIT_REPO_WITH_GROUP) | |
|
47 | 43 | |
|
48 | 44 | |
|
49 | 45 | @pytest.mark.usefixtures("disable_locking", "disable_anonymous_user") |
|
50 | 46 | class TestVCSOperations(object): |
|
51 | 47 | |
|
52 | def test_clone_hg_repo_by_admin(self, rc_web_server, tmpdir): | |
|
53 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
54 | stdout, stderr = Command('/tmp').execute( | |
|
55 | 'hg clone', clone_url, tmpdir.strpath) | |
|
56 | _check_proper_clone(stdout, stderr, 'hg') | |
|
57 | ||
|
58 | def test_clone_hg_repo_by_admin_pull_protocol(self, rc_web_server, tmpdir): | |
|
59 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
60 | stdout, stderr = Command('/tmp').execute( | |
|
61 | 'hg clone --pull', clone_url, tmpdir.strpath) | |
|
62 | _check_proper_clone(stdout, stderr, 'hg') | |
|
63 | ||
|
64 | def test_clone_hg_repo_by_admin_pull_stream_protocol(self, rc_web_server, tmpdir): | |
|
65 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
66 | stdout, stderr = Command('/tmp').execute( | |
|
67 | 'hg clone --pull --stream', clone_url, tmpdir.strpath) | |
|
68 | assert 'files to transfer,' in stdout | |
|
69 | assert 'transferred 1.' in stdout | |
|
70 | assert '114 files updated,' in stdout | |
|
71 | ||
|
72 | 48 | def test_clone_git_repo_by_admin(self, rc_web_server, tmpdir): |
|
73 | 49 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
|
74 | 50 | cmd = Command('/tmp') |
@@ -83,13 +59,6 b' class TestVCSOperations(object):' | |||
|
83 | 59 | _check_proper_clone(stdout, stderr, 'git') |
|
84 | 60 | cmd.assert_returncode_success() |
|
85 | 61 | |
|
86 | def test_clone_hg_repo_by_id_by_admin(self, rc_web_server, tmpdir): | |
|
87 | repo_id = Repository.get_by_repo_name(HG_REPO).repo_id | |
|
88 | clone_url = rc_web_server.repo_clone_url('_%s' % repo_id) | |
|
89 | stdout, stderr = Command('/tmp').execute( | |
|
90 | 'hg clone', clone_url, tmpdir.strpath) | |
|
91 | _check_proper_clone(stdout, stderr, 'hg') | |
|
92 | ||
|
93 | 62 | def test_clone_git_repo_by_id_by_admin(self, rc_web_server, tmpdir): |
|
94 | 63 | repo_id = Repository.get_by_repo_name(GIT_REPO).repo_id |
|
95 | 64 | clone_url = rc_web_server.repo_clone_url('_%s' % repo_id) |
@@ -98,12 +67,6 b' class TestVCSOperations(object):' | |||
|
98 | 67 | _check_proper_clone(stdout, stderr, 'git') |
|
99 | 68 | cmd.assert_returncode_success() |
|
100 | 69 | |
|
101 | def test_clone_hg_repo_with_group_by_admin(self, rc_web_server, tmpdir): | |
|
102 | clone_url = rc_web_server.repo_clone_url(HG_REPO_WITH_GROUP) | |
|
103 | stdout, stderr = Command('/tmp').execute( | |
|
104 | 'hg clone', clone_url, tmpdir.strpath) | |
|
105 | _check_proper_clone(stdout, stderr, 'hg') | |
|
106 | ||
|
107 | 70 | def test_clone_git_repo_with_group_by_admin(self, rc_web_server, tmpdir): |
|
108 | 71 | clone_url = rc_web_server.repo_clone_url(GIT_REPO_WITH_GROUP) |
|
109 | 72 | cmd = Command('/tmp') |
@@ -121,11 +84,6 b' class TestVCSOperations(object):' | |||
|
121 | 84 | assert 'Cloning into' in stderr |
|
122 | 85 | cmd.assert_returncode_success() |
|
123 | 86 | |
|
124 | def test_clone_wrong_credentials_hg(self, rc_web_server, tmpdir): | |
|
125 | clone_url = rc_web_server.repo_clone_url(HG_REPO, passwd='bad!') | |
|
126 | stdout, stderr = Command('/tmp').execute( | |
|
127 | 'hg clone', clone_url, tmpdir.strpath) | |
|
128 | assert 'abort: authorization failed' in stderr | |
|
129 | 87 | |
|
130 | 88 | def test_clone_wrong_credentials_git(self, rc_web_server, tmpdir): |
|
131 | 89 | clone_url = rc_web_server.repo_clone_url(GIT_REPO, passwd='bad!') |
@@ -139,12 +97,6 b' class TestVCSOperations(object):' | |||
|
139 | 97 | 'hg clone', clone_url, tmpdir.strpath) |
|
140 | 98 | assert 'HTTP Error 404: Not Found' in stderr |
|
141 | 99 | |
|
142 | def test_clone_hg_repo_as_git(self, rc_web_server, tmpdir): | |
|
143 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
144 | stdout, stderr = Command('/tmp').execute( | |
|
145 | 'git clone', clone_url, tmpdir.strpath) | |
|
146 | assert 'not found' in stderr | |
|
147 | ||
|
148 | 100 | def test_clone_non_existing_path_hg(self, rc_web_server, tmpdir): |
|
149 | 101 | clone_url = rc_web_server.repo_clone_url('trololo') |
|
150 | 102 | stdout, stderr = Command('/tmp').execute( |
@@ -156,25 +108,11 b' class TestVCSOperations(object):' | |||
|
156 | 108 | stdout, stderr = Command('/tmp').execute('git clone', clone_url) |
|
157 | 109 | assert 'not found' in stderr |
|
158 | 110 | |
|
159 | def test_clone_hg_with_slashes(self, rc_web_server, tmpdir): | |
|
160 | clone_url = rc_web_server.repo_clone_url('//' + HG_REPO) | |
|
161 | stdout, stderr = Command('/tmp').execute('hg clone', clone_url, tmpdir.strpath) | |
|
162 | assert 'HTTP Error 404: Not Found' in stderr | |
|
163 | ||
|
164 | 111 | def test_clone_git_with_slashes(self, rc_web_server, tmpdir): |
|
165 | 112 | clone_url = rc_web_server.repo_clone_url('//' + GIT_REPO) |
|
166 | 113 | stdout, stderr = Command('/tmp').execute('git clone', clone_url) |
|
167 | 114 | assert 'not found' in stderr |
|
168 | 115 | |
|
169 | def test_clone_existing_path_hg_not_in_database( | |
|
170 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
171 | ||
|
172 | db_name = fs_repo_only('not-in-db-hg', repo_type='hg') | |
|
173 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
174 | stdout, stderr = Command('/tmp').execute( | |
|
175 | 'hg clone', clone_url, tmpdir.strpath) | |
|
176 | assert 'HTTP Error 404: Not Found' in stderr | |
|
177 | ||
|
178 | 116 | def test_clone_existing_path_git_not_in_database( |
|
179 | 117 | self, rc_web_server, tmpdir, fs_repo_only): |
|
180 | 118 | db_name = fs_repo_only('not-in-db-git', repo_type='git') |
@@ -183,14 +121,6 b' class TestVCSOperations(object):' | |||
|
183 | 121 | 'git clone', clone_url, tmpdir.strpath) |
|
184 | 122 | assert 'not found' in stderr |
|
185 | 123 | |
|
186 | def test_clone_existing_path_hg_not_in_database_different_scm( | |
|
187 | self, rc_web_server, tmpdir, fs_repo_only): | |
|
188 | db_name = fs_repo_only('not-in-db-git', repo_type='git') | |
|
189 | clone_url = rc_web_server.repo_clone_url(db_name) | |
|
190 | stdout, stderr = Command('/tmp').execute( | |
|
191 | 'hg clone', clone_url, tmpdir.strpath) | |
|
192 | assert 'HTTP Error 404: Not Found' in stderr | |
|
193 | ||
|
194 | 124 | def test_clone_existing_path_git_not_in_database_different_scm( |
|
195 | 125 | self, rc_web_server, tmpdir, fs_repo_only): |
|
196 | 126 | db_name = fs_repo_only('not-in-db-hg', repo_type='hg') |
@@ -199,17 +129,6 b' class TestVCSOperations(object):' | |||
|
199 | 129 | 'git clone', clone_url, tmpdir.strpath) |
|
200 | 130 | assert 'not found' in stderr |
|
201 | 131 | |
|
202 | def test_clone_non_existing_store_path_hg(self, rc_web_server, tmpdir, user_util): | |
|
203 | repo = user_util.create_repo() | |
|
204 | clone_url = rc_web_server.repo_clone_url(repo.repo_name) | |
|
205 | ||
|
206 | # Damage repo by removing it's folder | |
|
207 | RepoModel()._delete_filesystem_repo(repo) | |
|
208 | ||
|
209 | stdout, stderr = Command('/tmp').execute( | |
|
210 | 'hg clone', clone_url, tmpdir.strpath) | |
|
211 | assert 'HTTP Error 404: Not Found' in stderr | |
|
212 | ||
|
213 | 132 | def test_clone_non_existing_store_path_git(self, rc_web_server, tmpdir, user_util): |
|
214 | 133 | repo = user_util.create_repo(repo_type='git') |
|
215 | 134 | clone_url = rc_web_server.repo_clone_url(repo.repo_name) |
@@ -221,17 +140,6 b' class TestVCSOperations(object):' | |||
|
221 | 140 | 'git clone', clone_url, tmpdir.strpath) |
|
222 | 141 | assert 'not found' in stderr |
|
223 | 142 | |
|
224 | def test_push_new_file_hg(self, rc_web_server, tmpdir): | |
|
225 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
226 | stdout, stderr = Command('/tmp').execute( | |
|
227 | 'hg clone', clone_url, tmpdir.strpath) | |
|
228 | ||
|
229 | stdout, stderr = _add_files_and_push( | |
|
230 | 'hg', tmpdir.strpath, clone_url=clone_url) | |
|
231 | ||
|
232 | assert 'pushing to' in stdout | |
|
233 | assert 'size summary' in stdout | |
|
234 | ||
|
235 | 143 | def test_push_new_file_git(self, rc_web_server, tmpdir): |
|
236 | 144 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
|
237 | 145 | stdout, stderr = Command('/tmp').execute( |
@@ -243,58 +151,6 b' class TestVCSOperations(object):' | |||
|
243 | 151 | |
|
244 | 152 | _check_proper_git_push(stdout, stderr) |
|
245 | 153 | |
|
246 | def test_push_invalidates_cache(self, rc_web_server, tmpdir): | |
|
247 | hg_repo = Repository.get_by_repo_name(HG_REPO) | |
|
248 | ||
|
249 | # init cache objects | |
|
250 | CacheKey.delete_all_cache() | |
|
251 | ||
|
252 | repo_namespace_key = CacheKey.REPO_INVALIDATION_NAMESPACE.format(repo_id=hg_repo.repo_id) | |
|
253 | ||
|
254 | inv_context_manager = rc_cache.InvalidationContext(key=repo_namespace_key) | |
|
255 | ||
|
256 | with inv_context_manager as invalidation_context: | |
|
257 | # __enter__ will create and register cache objects | |
|
258 | pass | |
|
259 | ||
|
260 | cache_keys = hg_repo.cache_keys | |
|
261 | assert cache_keys != [] | |
|
262 | old_ids = [x.cache_state_uid for x in cache_keys] | |
|
263 | ||
|
264 | # clone to init cache | |
|
265 | clone_url = rc_web_server.repo_clone_url(hg_repo.repo_name) | |
|
266 | stdout, stderr = Command('/tmp').execute( | |
|
267 | 'hg clone', clone_url, tmpdir.strpath) | |
|
268 | ||
|
269 | cache_keys = hg_repo.cache_keys | |
|
270 | assert cache_keys != [] | |
|
271 | for key in cache_keys: | |
|
272 | assert key.cache_active is True | |
|
273 | ||
|
274 | # PUSH that should trigger invalidation cache | |
|
275 | stdout, stderr = _add_files_and_push( | |
|
276 | 'hg', tmpdir.strpath, clone_url=clone_url, files_no=1) | |
|
277 | ||
|
278 | # flush... | |
|
279 | Session().commit() | |
|
280 | hg_repo = Repository.get_by_repo_name(HG_REPO) | |
|
281 | cache_keys = hg_repo.cache_keys | |
|
282 | assert cache_keys != [] | |
|
283 | new_ids = [x.cache_state_uid for x in cache_keys] | |
|
284 | assert new_ids != old_ids | |
|
285 | ||
|
286 | def test_push_wrong_credentials_hg(self, rc_web_server, tmpdir): | |
|
287 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
288 | stdout, stderr = Command('/tmp').execute( | |
|
289 | 'hg clone', clone_url, tmpdir.strpath) | |
|
290 | ||
|
291 | push_url = rc_web_server.repo_clone_url( | |
|
292 | HG_REPO, user='bad', passwd='name') | |
|
293 | stdout, stderr = _add_files_and_push( | |
|
294 | 'hg', tmpdir.strpath, clone_url=push_url) | |
|
295 | ||
|
296 | assert 'abort: authorization failed' in stderr | |
|
297 | ||
|
298 | 154 | def test_push_wrong_credentials_git(self, rc_web_server, tmpdir): |
|
299 | 155 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
|
300 | 156 | stdout, stderr = Command('/tmp').execute( |
@@ -307,17 +163,6 b' class TestVCSOperations(object):' | |||
|
307 | 163 | |
|
308 | 164 | assert 'fatal: Authentication failed' in stderr |
|
309 | 165 | |
|
310 | def test_push_back_to_wrong_url_hg(self, rc_web_server, tmpdir): | |
|
311 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
312 | stdout, stderr = Command('/tmp').execute( | |
|
313 | 'hg clone', clone_url, tmpdir.strpath) | |
|
314 | ||
|
315 | stdout, stderr = _add_files_and_push( | |
|
316 | 'hg', tmpdir.strpath, | |
|
317 | clone_url=rc_web_server.repo_clone_url('not-existing')) | |
|
318 | ||
|
319 | assert 'HTTP Error 404: Not Found' in stderr | |
|
320 | ||
|
321 | 166 | def test_push_back_to_wrong_url_git(self, rc_web_server, tmpdir): |
|
322 | 167 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
|
323 | 168 | stdout, stderr = Command('/tmp').execute( |
@@ -329,28 +174,6 b' class TestVCSOperations(object):' | |||
|
329 | 174 | |
|
330 | 175 | assert 'not found' in stderr |
|
331 | 176 | |
|
332 | def test_ip_restriction_hg(self, rc_web_server, tmpdir): | |
|
333 | user_model = UserModel() | |
|
334 | try: | |
|
335 | user_model.add_extra_ip(TEST_USER_ADMIN_LOGIN, '10.10.10.10/32') | |
|
336 | Session().commit() | |
|
337 | time.sleep(2) | |
|
338 | clone_url = rc_web_server.repo_clone_url(HG_REPO) | |
|
339 | stdout, stderr = Command('/tmp').execute( | |
|
340 | 'hg clone', clone_url, tmpdir.strpath) | |
|
341 | assert 'abort: HTTP Error 403: Forbidden' in stderr | |
|
342 | finally: | |
|
343 | # release IP restrictions | |
|
344 | for ip in UserIpMap.getAll(): | |
|
345 | UserIpMap.delete(ip.ip_id) | |
|
346 | Session().commit() | |
|
347 | ||
|
348 | time.sleep(2) | |
|
349 | ||
|
350 | stdout, stderr = Command('/tmp').execute( | |
|
351 | 'hg clone', clone_url, tmpdir.strpath) | |
|
352 | _check_proper_clone(stdout, stderr, 'hg') | |
|
353 | ||
|
354 | 177 | def test_ip_restriction_git(self, rc_web_server, tmpdir): |
|
355 | 178 | user_model = UserModel() |
|
356 | 179 | try: |
@@ -42,6 +42,7 b' connection_available = pytest.mark.skipi' | |||
|
42 | 42 | "enable_webhook_push_integration") |
|
43 | 43 | class TestVCSOperationsOnCustomIniConfig(object): |
|
44 | 44 | |
|
45 | @connection_available | |
|
45 | 46 | def test_push_tag_with_commit_hg(self, rc_web_server, tmpdir): |
|
46 | 47 | clone_url = rc_web_server.repo_clone_url(HG_REPO) |
|
47 | 48 | stdout, stderr = Command('/tmp').execute( |
@@ -56,6 +57,7 b' class TestVCSOperationsOnCustomIniConfig' | |||
|
56 | 57 | assert 'ERROR' not in rc_log |
|
57 | 58 | assert "{'name': 'v1.0.0'," in rc_log |
|
58 | 59 | |
|
60 | @connection_available | |
|
59 | 61 | def test_push_tag_with_commit_git( |
|
60 | 62 | self, rc_web_server, tmpdir): |
|
61 | 63 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
@@ -71,6 +73,7 b' class TestVCSOperationsOnCustomIniConfig' | |||
|
71 | 73 | assert 'ERROR' not in rc_log |
|
72 | 74 | assert "{'name': 'v1.0.0'," in rc_log |
|
73 | 75 | |
|
76 | @connection_available | |
|
74 | 77 | def test_push_tag_with_no_commit_git( |
|
75 | 78 | self, rc_web_server, tmpdir): |
|
76 | 79 | clone_url = rc_web_server.repo_clone_url(GIT_REPO) |
@@ -7,7 +7,7 b'' | |||
|
7 | 7 | [server:main] |
|
8 | 8 | ; COMMON HOST/IP CONFIG |
|
9 | 9 | host = 127.0.0.1 |
|
10 | port = | |
|
10 | port = 10010 | |
|
11 | 11 | |
|
12 | 12 | |
|
13 | 13 | ; ########################### |
@@ -22,6 +22,17 b' use = egg:gunicorn#main' | |||
|
22 | 22 | [app:main] |
|
23 | 23 | ; The %(here)s variable will be replaced with the absolute path of parent directory |
|
24 | 24 | ; of this file |
|
25 | ; Each option in the app:main can be override by an environmental variable | |
|
26 | ; | |
|
27 | ;To override an option: | |
|
28 | ; | |
|
29 | ;RC_<KeyName> | |
|
30 | ;Everything should be uppercase, . and - should be replaced by _. | |
|
31 | ;For example, if you have these configuration settings: | |
|
32 | ;rc_cache.repo_object.backend = foo | |
|
33 | ;can be overridden by | |
|
34 | ;export RC_CACHE_REPO_OBJECT_BACKEND=foo | |
|
35 | ||
|
25 | 36 | use = egg:rhodecode-vcsserver |
|
26 | 37 | |
|
27 | 38 | ; Pyramid default locales, we need this to be set |
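The comment block added in the hunk above documents the `RC_<KeyName>` environment-variable override convention: uppercase the config key and replace `.` and `-` with `_`. A rough sketch of that mapping in Python (the prefix-handling detail is my assumption, inferred from the single example in the comment, not a documented rule):

```python
def to_env_var(key: str) -> str:
    """Map an ini config key to its override environment variable:
    uppercase, with '.' and '-' replaced by '_', under an RC_ prefix."""
    name = key.upper().replace('.', '_').replace('-', '_')
    # Assumption: keys already starting with rc_ are not double-prefixed,
    # matching the rc_cache.repo_object.backend example in the comment.
    return name if name.startswith('RC_') else 'RC_' + name

print(to_env_var('rc_cache.repo_object.backend'))  # RC_CACHE_REPO_OBJECT_BACKEND
```

With that mapping, `export RC_CACHE_REPO_OBJECT_BACKEND=foo` overrides `rc_cache.repo_object.backend = foo`, as the config comment describes.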
@@ -30,10 +41,14 b' pyramid.default_locale_name = en' | |||
|
30 | 41 | ; default locale used by VCS systems |
|
31 | 42 | locale = en_US.UTF-8 |
|
32 | 43 | |
|
33 | ; path to binaries for vcsserver, it should be set by the installer | |
|
34 | ; at installation time, e.g /home/user/vcsserver-1/profile/bin | |
|
35 | ; it can also be a path to nix-build output in case of development | |
|
36 | core.binary_dir = | |
|
44 | ; path to binaries (hg,git,svn) for vcsserver, it should be set by the installer | |
|
45 | ; at installation time, e.g /home/user/.rccontrol/vcsserver-1/profile/bin | |
|
46 | ; or /usr/local/bin/rhodecode_bin/vcs_bin | |
|
47 | core.binary_dir = | |
|
48 | ||
|
49 | ; Redis connection settings for svn integrations logic | |
|
50 | ; This connection string needs to be the same on ce and vcsserver | |
|
51 | vcs.svn.redis_conn = redis://redis:6379/0 | |
|
37 | 52 | |
|
38 | 53 | ; Custom exception store path, defaults to TMPDIR |
|
39 | 54 | ; This is used to store exception from RhodeCode in shared directory |
@@ -52,14 +67,14 b' cache_dir = %(here)s/data' | |||
|
52 | 67 | ; *************************************** |
|
53 | 68 | |
|
54 | 69 | ; `repo_object` cache settings for vcs methods for repositories |
|
55 | rc_cache.repo_object.backend = dogpile.cache.rc. | |
|
70 | #rc_cache.repo_object.backend = dogpile.cache.rc.file_namespace | |
|
56 | 71 | |
|
57 | 72 | ; cache auto-expires after N seconds |
|
58 | 73 | ; Examples: 86400 (1Day), 604800 (7Days), 1209600 (14Days), 2592000 (30days), 7776000 (90Days) |
|
59 | rc_cache.repo_object.expiration_time = 2592000 | |
|
74 | #rc_cache.repo_object.expiration_time = 2592000 | |
|
60 | 75 | |
|
61 | 76 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set |
|
62 | #rc_cache.repo_object.arguments.filename = /tmp/vcsserver_cache.db | |
|
77 | #rc_cache.repo_object.arguments.filename = /tmp/vcsserver_cache_repo_object.db | |
|
63 | 78 | |
|
64 | 79 | ; *********************************************************** |
|
65 | 80 | ; `repo_object` cache with redis backend |
@@ -83,19 +98,32 b' rc_cache.repo_object.expiration_time = 2' | |||
|
83 | 98 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends |
|
84 | 99 | #rc_cache.repo_object.arguments.distributed_lock = true |
|
85 | 100 | |
|
86 | # legacy cache regions, please don't change | |
|
87 | beaker.cache.regions = repo_object | |
|
88 | beaker.cache.repo_object.type = memorylru | |
|
89 | beaker.cache.repo_object.max_items = 100 | |
|
90 | # cache auto-expires after N seconds | |
|
91 | beaker.cache.repo_object.expire = 300 | |
|
92 | beaker.cache.repo_object.enabled = true | |
|
101 | ; auto-renew lock to prevent stale locks, slower but safer. Use only if problems happen | |
|
102 | #rc_cache.repo_object.arguments.lock_auto_renewal = true | |
|
103 | ||
|
104 | ; Statsd client config, this is used to send metrics to statsd | |
|
105 | ; We recommend setting up statsd_exporter and scraping the metrics using Prometheus | |
|
106 | #statsd.enabled = false | |
|
107 | #statsd.statsd_host = 0.0.0.0 | |
|
108 | #statsd.statsd_port = 8125 | |
|
109 | #statsd.statsd_prefix = | |
|
110 | #statsd.statsd_ipv6 = false | |
|
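The statsd settings above point at a plain UDP collector, and the statsd wire format is just `name:value|type`. A minimal, dependency-free sketch of building and sending a counter increment (the host/port match the defaults above; the `my_prefix` value is an invented example):

```python
import socket

def statsd_counter(name: str, value: int = 1, prefix: str = "") -> bytes:
    """Build a statsd counter payload like b'prefix.name:1|c'."""
    metric = f"{prefix}.{name}" if prefix else name
    return f"{metric}:{value}|c".encode("ascii")

payload = statsd_counter("vcsserver.requests", 1, prefix="my_prefix")

# Sending is fire-and-forget UDP, matching statsd.statsd_host / statsd.statsd_port:
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(payload, ("0.0.0.0", 8125))  # uncomment against a live collector
print(payload)
```

Because UDP delivery is unacknowledged, a dead collector costs the application nothing, which is why statsd emission is safe to leave enabled in hot paths.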
93 | 111 | |
|
112 | ; configure logging automatically at server startup; set to false | |
|
113 | ; to use the below custom logging config. | |
|
114 | ; RC_LOGGING_FORMATTER | |
|
115 | ; RC_LOGGING_LEVEL | |
|
116 | ; env variables can control the settings for logging in case of autoconfigure | |
|
94 | 117 | |
|
118 | #logging.autoconfigure = true | |
|
119 | ||
|
120 | ; specify your own custom logging config file to configure logging | |
|
121 | #logging.logging_conf_file = /path/to/custom_logging.ini | |
|
95 | 122 | |
|
96 | 123 | ; ##################### |
|
97 | 124 | ; LOGGING CONFIGURATION |
|
98 | 125 | ; ##################### |
|
126 | ||
|
99 | 127 | [loggers] |
|
100 | 128 | keys = root, vcsserver |
|
101 | 129 | |
@@ -103,7 +131,7 b' keys = root, vcsserver' | |||
|
103 | 131 | keys = console |
|
104 | 132 | |
|
105 | 133 | [formatters] |
|
106 | keys = generic | |
|
134 | keys = generic, json | |
|
107 | 135 | |
|
108 | 136 | ; ####### |
|
109 | 137 | ; LOGGERS |
@@ -113,12 +141,11 b' level = NOTSET' | |||
|
113 | 141 | handlers = console |
|
114 | 142 | |
|
115 | 143 | [logger_vcsserver] |
|
116 | level = | |
|
144 | level = INFO | |
|
117 | 145 | handlers = |
|
118 | 146 | qualname = vcsserver |
|
119 | 147 | propagate = 1 |
|
120 | 148 | |
|
121 | ||
|
122 | 149 | ; ######## |
|
123 | 150 | ; HANDLERS |
|
124 | 151 | ; ######## |
@@ -127,6 +154,8 b' propagate = 1' | |||
|
127 | 154 | class = StreamHandler |
|
128 | 155 | args = (sys.stderr, ) |
|
129 | 156 | level = DEBUG |
|
157 | ; To enable JSON formatted logs replace 'generic' with 'json' | |
|
158 | ; This allows sending properly formatted logs to Grafana Loki or Elasticsearch | |
|
130 | 159 | formatter = generic |
|
131 | 160 | |
|
132 | 161 | ; ########## |
@@ -136,3 +165,7 b' formatter = generic' | |||
|
136 | 165 | [formatter_generic] |
|
137 | 166 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s |
|
138 | 167 | datefmt = %Y-%m-%d %H:%M:%S |
|
168 | ||
|
169 | [formatter_json] | |
|
170 | format = %(timestamp)s %(levelname)s %(name)s %(message)s %(req_id)s | |
|
171 | class = vcsserver.lib._vendor.jsonlogger.JsonFormatter |
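The `[formatter_json]` section above relies on a vendored `JsonFormatter` class. As an illustration of what such a formatter does — this sketch is *not* the vendored implementation, only a minimal stand-in — a JSON formatter simply renders each `LogRecord` as one JSON object per line:

```python
import json
import logging

class MiniJsonFormatter(logging.Formatter):
    """Render a LogRecord as a single JSON object per line."""
    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "levelname": record.levelname,
            "name": record.name,
            "message": record.getMessage(),
        }
        return json.dumps(payload)

record = logging.LogRecord("vcsserver", logging.INFO, __file__, 1,
                           "hook %s done", ("pre_push",), None)
line = MiniJsonFormatter().format(record)
print(line)
```

Structured lines like this are what lets log shippers for Loki or Elasticsearch index fields without fragile regex parsing.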
@@ -176,18 +176,11 b' setup(' | |||
|
176 | 176 | ('public/**', 'ignore', None), |
|
177 | 177 | ] |
|
178 | 178 | }, |
|
179 | paster_plugins=['PasteScript'], | |
|
179 | ||
|
180 | 180 | entry_points={ |
|
181 | 181 | 'paste.app_factory': [ |
|
182 | 182 | 'main=rhodecode.config.middleware:make_pyramid_app', |
|
183 | 183 | ], |
|
184 | 'paste.global_paster_command': [ | |
|
185 | 'ishell=rhodecode.lib.paster_commands.ishell:Command', | |
|
186 | 'upgrade-db=rhodecode.lib.paster_commands.upgrade_db:UpgradeDb', | |
|
187 | ||
|
188 | 'setup-rhodecode=rhodecode.lib.paster_commands.deprecated.setup_rhodecode:Command', | |
|
189 | 'celeryd=rhodecode.lib.paster_commands.deprecated.celeryd:Command', | |
|
190 | ], | |
|
191 | 184 | 'pyramid.pshell_runner': [ |
|
192 | 185 | 'ipython = rhodecode.lib.pyramid_shell:ipython_shell_runner', |
|
193 | 186 | ], |
@@ -196,7 +189,8 b' setup(' | |||
|
196 | 189 | 'rc-upgrade-db=rhodecode.lib.rc_commands.upgrade_db:main', |
|
197 | 190 | 'rc-ishell=rhodecode.lib.rc_commands.ishell:main', |
|
198 | 191 | 'rc-add-artifact=rhodecode.lib.rc_commands.add_artifact:main', |
|
199 | 'rc-ssh-wrapper=rhodecode.apps.ssh_support.lib.ssh_wrapper:main', | |
|
192 | 'rc-ssh-wrapper=rhodecode.apps.ssh_support.lib.ssh_wrapper_v1:main', | |
|
193 | 'rc-ssh-wrapper-v2=rhodecode.apps.ssh_support.lib.ssh_wrapper_v2:main', | |
|
200 | 194 | ], |
|
201 | 195 | 'beaker.backends': [ |
|
202 | 196 | 'memorylru_base=rhodecode.lib.memory_lru_dict:MemoryLRUNamespaceManagerBase', |
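The `console_scripts` entries changed in the hunk above all follow setuptools' `name=package.module:callable` syntax. A small sketch of how such a spec string splits into its parts (stdlib only; `parse_entry_point` is an illustrative helper, not a setuptools API):

```python
def parse_entry_point(spec: str) -> tuple[str, str, str]:
    """Split a setuptools entry-point spec 'name=module.path:attr'."""
    name, _, target = spec.partition("=")
    module, _, attr = target.partition(":")
    return name.strip(), module.strip(), attr.strip()

name, module, attr = parse_entry_point(
    "rc-ssh-wrapper=rhodecode.apps.ssh_support.lib.ssh_wrapper_v1:main")
print(name, module, attr)
```

At install time setuptools generates a launcher script per entry that imports `module` and calls `attr`, which is why the diff can repoint `rc-ssh-wrapper` at `ssh_wrapper_v1:main` without renaming the command.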
@@ -1,451 +0,0 b'' | |||
|
1 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
2 | # | |
|
3 | # This program is free software: you can redistribute it and/or modify | |
|
4 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
5 | # (only), as published by the Free Software Foundation. | |
|
6 | # | |
|
7 | # This program is distributed in the hope that it will be useful, | |
|
8 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
9 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
10 | # GNU General Public License for more details. | |
|
11 | # | |
|
12 | # You should have received a copy of the GNU Affero General Public License | |
|
13 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
14 | # | |
|
15 | # This program is dual-licensed. If you wish to learn more about the | |
|
16 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
17 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
18 | ||
|
19 | import os | |
|
20 | import time | |
|
21 | import logging | |
|
22 | import tempfile | |
|
23 | import traceback | |
|
24 | import threading | |
|
25 | import socket | |
|
26 | import msgpack | |
|
27 | import gevent | |
|
28 | ||
|
29 | from http.server import BaseHTTPRequestHandler | |
|
30 | from socketserver import TCPServer | |
|
31 | ||
|
32 | import rhodecode | |
|
33 | from rhodecode.lib.exceptions import HTTPLockedRC, HTTPBranchProtected | |
|
34 | from rhodecode.model import meta | |
|
35 | from rhodecode.lib import hooks_base | |
|
36 | from rhodecode.lib.utils2 import AttributeDict | |
|
37 | from rhodecode.lib.pyramid_utils import get_config | |
|
38 | from rhodecode.lib.ext_json import json | |
|
39 | from rhodecode.lib import rc_cache | |
|
40 | ||
|
41 | log = logging.getLogger(__name__) | |
|
42 | ||
|
43 | ||
|
44 | class HooksHttpHandler(BaseHTTPRequestHandler): | |
|
45 | ||
|
46 | JSON_HOOKS_PROTO = 'json.v1' | |
|
47 | MSGPACK_HOOKS_PROTO = 'msgpack.v1' | |
|
48 | # starting with RhodeCode 5.0.0 MsgPack is the default, prior it used json | |
|
49 | DEFAULT_HOOKS_PROTO = MSGPACK_HOOKS_PROTO | |
|
50 | ||
|
51 | @classmethod | |
|
52 | def serialize_data(cls, data, proto=DEFAULT_HOOKS_PROTO): | |
|
53 | if proto == cls.MSGPACK_HOOKS_PROTO: | |
|
54 | return msgpack.packb(data) | |
|
55 | return json.dumps(data) | |
|
56 | ||
|
57 | @classmethod | |
|
58 | def deserialize_data(cls, data, proto=DEFAULT_HOOKS_PROTO): | |
|
59 | if proto == cls.MSGPACK_HOOKS_PROTO: | |
|
60 | return msgpack.unpackb(data) | |
|
61 | return json.loads(data) | |
|
62 | ||
|
63 | def do_POST(self): | |
|
64 | hooks_proto, method, extras = self._read_request() | |
|
65 | log.debug('Handling HooksHttpHandler %s with %s proto', method, hooks_proto) | |
|
66 | ||
|
67 | txn_id = getattr(self.server, 'txn_id', None) | |
|
68 | if txn_id: | |
|
69 | log.debug('Computing TXN_ID based on `%s`:`%s`', | |
|
70 | extras['repository'], extras['txn_id']) | |
|
71 | computed_txn_id = rc_cache.utils.compute_key_from_params( | |
|
72 | extras['repository'], extras['txn_id']) | |
|
73 | if txn_id != computed_txn_id: | |
|
74 | raise Exception( | |
|
75 | 'TXN ID fail: expected {} got {} instead'.format( | |
|
76 | txn_id, computed_txn_id)) | |
|
77 | ||
|
78 | request = getattr(self.server, 'request', None) | |
|
79 | try: | |
|
80 | hooks = Hooks(request=request, log_prefix='HOOKS: {} '.format(self.server.server_address)) | |
|
81 | result = self._call_hook_method(hooks, method, extras) | |
|
82 | ||
|
83 | except Exception as e: | |
|
84 | exc_tb = traceback.format_exc() | |
|
85 | result = { | |
|
86 | 'exception': e.__class__.__name__, | |
|
87 | 'exception_traceback': exc_tb, | |
|
88 | 'exception_args': e.args | |
|
89 | } | |
|
90 | self._write_response(hooks_proto, result) | |
|
91 | ||
|
92 | def _read_request(self): | |
|
93 | length = int(self.headers['Content-Length']) | |
|
94 | # respect sent headers, fallback to OLD proto for compatability | |
|
95 | hooks_proto = self.headers.get('rc-hooks-protocol') or self.JSON_HOOKS_PROTO | |
|
96 | if hooks_proto == self.MSGPACK_HOOKS_PROTO: | |
|
97 | # support for new vcsserver msgpack based protocol hooks | |
|
98 | body = self.rfile.read(length) | |
|
99 | data = self.deserialize_data(body) | |
|
100 | else: | |
|
101 | body = self.rfile.read(length) | |
|
102 | data = self.deserialize_data(body) | |
|
103 | ||
|
104 | return hooks_proto, data['method'], data['extras'] | |
|
105 | ||
|
106 | def _write_response(self, hooks_proto, result): | |
|
107 | self.send_response(200) | |
|
108 | if hooks_proto == self.MSGPACK_HOOKS_PROTO: | |
|
109 | self.send_header("Content-type", "application/msgpack") | |
|
110 | self.end_headers() | |
|
111 | data = self.serialize_data(result) | |
|
112 | self.wfile.write(data) | |
|
113 | else: | |
|
114 | self.send_header("Content-type", "text/json") | |
|
115 | self.end_headers() | |
|
116 | data = self.serialize_data(result) | |
|
117 | self.wfile.write(data) | |
|
118 | ||
|
119 | def _call_hook_method(self, hooks, method, extras): | |
|
120 | try: | |
|
121 | result = getattr(hooks, method)(extras) | |
|
122 | finally: | |
|
123 | meta.Session.remove() | |
|
124 | return result | |
|
125 | ||
|
126 | def log_message(self, format, *args): | |
|
127 | """ | |
|
128 | This is an overridden method of BaseHTTPRequestHandler which logs using | |
|
129 | logging library instead of writing directly to stderr. | |
|
130 | """ | |
|
131 | ||
|
132 | message = format % args | |
|
133 | ||
|
134 | log.debug( | |
|
135 | "HOOKS: client=%s - - [%s] %s", self.client_address, | |
|
136 | self.log_date_time_string(), message) | |
|
137 | ||
|
138 | ||
|
139 | class BaseHooksCallbackDaemon: | |
|
140 | """ | |
|
141 | Basic context manager for actions that don't require some extra | |
|
142 | """ | |
|
143 | def __init__(self): | |
|
144 | self.hooks_module = Hooks.__module__ | |
|
145 | ||
|
146 | def __enter__(self): | |
|
147 | log.debug('Running `%s` callback daemon', self.__class__.__name__) | |
|
148 | return self | |
|
149 | ||
|
150 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
151 | log.debug('Exiting `%s` callback daemon', self.__class__.__name__) | |
|
152 | ||
|
153 | ||
|
154 | class CeleryHooksCallbackDaemon(BaseHooksCallbackDaemon): | |
|
155 | """ | |
|
156 | Context manager for compatibility with the celery backend | |
|
157 | """ | |
|
158 | ||
|
159 | def __init__(self, config): | |
|
160 | self.task_queue = config.get('app:main', 'celery.broker_url') | |
|
161 | self.task_backend = config.get('app:main', 'celery.result_backend') | |
|
162 | ||
|
163 | ||
|
164 | class ThreadedHookCallbackDaemon(object): | |
|
165 | ||
|
166 | _callback_thread = None | |
|
167 | _daemon = None | |
|
168 | _done = False | |
|
169 | use_gevent = False | |
|
170 | ||
|
171 | def __init__(self, txn_id=None, host=None, port=None): | |
|
172 | self._prepare(txn_id=txn_id, host=host, port=port) | |
|
173 | if self.use_gevent: | |
|
174 | self._run_func = self._run_gevent | |
|
175 | self._stop_func = self._stop_gevent | |
|
176 | else: | |
|
177 | self._run_func = self._run | |
|
178 | self._stop_func = self._stop | |
|
179 | ||
|
180 | def __enter__(self): | |
|
181 | log.debug('Running `%s` callback daemon', self.__class__.__name__) | |
|
182 | self._run_func() | |
|
183 | return self | |
|
184 | ||
|
185 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
186 | log.debug('Exiting `%s` callback daemon', self.__class__.__name__) | |
|
187 | self._stop_func() | |
|
188 | ||
|
189 | def _prepare(self, txn_id=None, host=None, port=None): | |
|
190 | raise NotImplementedError() | |
|
191 | ||
|
192 | def _run(self): | |
|
193 | raise NotImplementedError() | |
|
194 | ||
|
195 | def _stop(self): | |
|
196 | raise NotImplementedError() | |
|
197 | ||
|
198 | def _run_gevent(self): | |
|
199 | raise NotImplementedError() | |
|
200 | ||
|
201 | def _stop_gevent(self): | |
|
202 | raise NotImplementedError() | |
|
203 | ||
|
204 | ||
|
205 | class HttpHooksCallbackDaemon(ThreadedHookCallbackDaemon): | |
|
206 | """ | |
|
207 | Context manager which will run a callback daemon in a background thread. | |
|
208 | """ | |
|
209 | ||
|
210 | hooks_uri = None | |
|
211 | ||
|
212 | # From Python docs: Polling reduces our responsiveness to a shutdown | |
|
213 | # request and wastes cpu at all other times. | |
|
214 | POLL_INTERVAL = 0.01 | |
|
215 | ||
|
216 | use_gevent = False | |
|
217 | ||
|
218 | @property | |
|
219 | def _hook_prefix(self): | |
|
220 | return 'HOOKS: {} '.format(self.hooks_uri) | |
|
221 | ||
|
222 | def get_hostname(self): | |
|
223 | return socket.gethostname() or '127.0.0.1' | |
|
224 | ||
|
225 | def get_available_port(self, min_port=20000, max_port=65535): | |
|
226 | from rhodecode.lib.utils2 import get_available_port as _get_port | |
|
227 | return _get_port(min_port, max_port) | |
|
228 | ||
|
229 | def _prepare(self, txn_id=None, host=None, port=None): | |
|
230 | from pyramid.threadlocal import get_current_request | |
|
231 | ||
|
232 | if not host or host == "*": | |
|
233 | host = self.get_hostname() | |
|
234 | if not port: | |
|
235 | port = self.get_available_port() | |
|
236 | ||
|
237 | server_address = (host, port) | |
|
238 | self.hooks_uri = '{}:{}'.format(host, port) | |
|
239 | self.txn_id = txn_id | |
|
240 | self._done = False | |
|
241 | ||
|
242 | log.debug( | |
|
243 | "%s Preparing HTTP callback daemon registering hook object: %s", | |
|
244 | self._hook_prefix, HooksHttpHandler) | |
|
245 | ||
|
246 | self._daemon = TCPServer(server_address, HooksHttpHandler) | |
|
247 | # inject transaction_id for later verification | |
|
248 | self._daemon.txn_id = self.txn_id | |
|
249 | ||
|
250 | # pass the WEB app request into daemon | |
|
251 | self._daemon.request = get_current_request() | |
|
252 | ||
|
253 | def _run(self): | |
|
254 | log.debug("Running thread-based loop of callback daemon in background") | |
|
255 | callback_thread = threading.Thread( | |
|
256 | target=self._daemon.serve_forever, | |
|
257 | kwargs={'poll_interval': self.POLL_INTERVAL}) | |
|
258 | callback_thread.daemon = True | |
|
259 | callback_thread.start() | |
|
260 | self._callback_thread = callback_thread | |
|
261 | ||
|
262 | def _run_gevent(self): | |
|
263 | log.debug("Running gevent-based loop of callback daemon in background") | |
|
264 | # create a new greenlet for the daemon's serve_forever method | |
|
265 | callback_greenlet = gevent.spawn( | |
|
266 | self._daemon.serve_forever, | |
|
267 | poll_interval=self.POLL_INTERVAL) | |
|
268 | ||
|
269 | # store reference to greenlet | |
|
270 | self._callback_greenlet = callback_greenlet | |
|
271 | ||
|
272 | # switch to this greenlet | |
|
273 | gevent.sleep(0.01) | |
|
274 | ||
|
275 | def _stop(self): | |
|
276 | log.debug("Waiting for background thread to finish.") | |
|
277 | self._daemon.shutdown() | |
|
278 | self._callback_thread.join() | |
|
279 | self._daemon = None | |
|
280 | self._callback_thread = None | |
|
281 | if self.txn_id: | |
|
282 | txn_id_file = get_txn_id_data_path(self.txn_id) | |
|
283 | log.debug('Cleaning up TXN ID %s', txn_id_file) | |
|
284 | if os.path.isfile(txn_id_file): | |
|
285 | os.remove(txn_id_file) | |
|
286 | ||
|
287 | log.debug("Background thread done.") | |
|
288 | ||
|
289 | def _stop_gevent(self): | |
|
290 | log.debug("Waiting for background greenlet to finish.") | |
|
291 | ||
|
292 | # if greenlet exists and is running | |
|
293 | if self._callback_greenlet and not self._callback_greenlet.dead: | |
|
294 | # shutdown daemon if it exists | |
|
295 | if self._daemon: | |
|
296 | self._daemon.shutdown() | |
|
297 | ||
|
298 | # kill the greenlet | |
|
299 | self._callback_greenlet.kill() | |
|
300 | ||
|
301 | self._daemon = None | |
|
302 | self._callback_greenlet = None | |
|
303 | ||
|
304 | if self.txn_id: | |
|
305 | txn_id_file = get_txn_id_data_path(self.txn_id) | |
|
306 | log.debug('Cleaning up TXN ID %s', txn_id_file) | |
|
307 | if os.path.isfile(txn_id_file): | |
|
308 | os.remove(txn_id_file) | |
|
309 | ||
|
310 | log.debug("Background greenlet done.") | |
|
311 | ||
|
312 | ||
|
313 | def get_txn_id_data_path(txn_id): | |
|
314 | import rhodecode | |
|
315 | ||
|
316 | root = rhodecode.CONFIG.get('cache_dir') or tempfile.gettempdir() | |
|
317 | final_dir = os.path.join(root, 'svn_txn_id') | |
|
318 | ||
|
319 | if not os.path.isdir(final_dir): | |
|
320 | os.makedirs(final_dir) | |
|
321 | return os.path.join(final_dir, 'rc_txn_id_{}'.format(txn_id)) | |
|
322 | ||
|
323 | ||
|
324 | def store_txn_id_data(txn_id, data_dict): | |
|
325 | if not txn_id: | |
|
326 | log.warning('Cannot store txn_id because it is empty') | |
|
327 | return | |
|
328 | ||
|
329 | path = get_txn_id_data_path(txn_id) | |
|
330 | try: | |
|
331 | with open(path, 'wb') as f: | |
|
332 | f.write(json.dumps(data_dict)) | |
|
333 | except Exception: | |
|
334 | log.exception('Failed to write txn_id metadata') | |
|
335 | ||
|
336 | ||
|
337 | def get_txn_id_from_store(txn_id): | |
|
338 | """ | |
|
339 | Reads txn_id from store and if present returns the data for callback manager | |
|
340 | """ | |
|
341 | path = get_txn_id_data_path(txn_id) | |
|
342 | try: | |
|
343 | with open(path, 'rb') as f: | |
|
344 | return json.loads(f.read()) | |
|
345 | except Exception: | |
|
346 | return {} | |
|
347 | ||
|
348 | ||
|
349 | def prepare_callback_daemon(extras, protocol, host, txn_id=None): | |
|
350 | txn_details = get_txn_id_from_store(txn_id) | |
|
351 | port = txn_details.get('port', 0) | |
|
352 | match protocol: | |
|
353 | case 'http': | |
|
354 | callback_daemon = HttpHooksCallbackDaemon( | |
|
355 | txn_id=txn_id, host=host, port=port) | |
|
356 | case 'celery': | |
|
357 | callback_daemon = CeleryHooksCallbackDaemon(get_config(extras['config'])) | |
|
358 | case 'local': | |
|
359 | callback_daemon = BaseHooksCallbackDaemon() | |
|
360 | case _: | |
|
361 | log.error('Unsupported callback daemon protocol "%s"', protocol) | |
|
362 | raise Exception('Unsupported callback daemon protocol.') | |
|
363 | ||
|
364 | extras['hooks_uri'] = getattr(callback_daemon, 'hooks_uri', '') | |
|
365 | extras['task_queue'] = getattr(callback_daemon, 'task_queue', '') | |
|
366 | extras['task_backend'] = getattr(callback_daemon, 'task_backend', '') | |
|
367 | extras['hooks_protocol'] = protocol | |
|
368 | extras['time'] = time.time() | |
|
369 | ||
|
370 | # register txn_id | |
|
371 | extras['txn_id'] = txn_id | |
|
372 | log.debug('Prepared a callback daemon: %s', | |
|
373 | callback_daemon.__class__.__name__) | |
|
374 | return callback_daemon, extras | |
|
375 | ||
|
376 | ||
|
377 | class Hooks(object): | |
|
378 | """ | |
|
379 | Exposes the hooks for remote call backs | |
|
380 | """ | |
|
381 | def __init__(self, request=None, log_prefix=''): | |
|
382 | self.log_prefix = log_prefix | |
|
383 | self.request = request | |
|
384 | ||
|
385 | def repo_size(self, extras): | |
|
386 | log.debug("%sCalled repo_size of %s object", self.log_prefix, self) | |
|
387 | return self._call_hook(hooks_base.repo_size, extras) | |
|
388 | ||
|
389 | def pre_pull(self, extras): | |
|
390 | log.debug("%sCalled pre_pull of %s object", self.log_prefix, self) | |
|
391 | return self._call_hook(hooks_base.pre_pull, extras) | |
|
392 | ||
|
393 | def post_pull(self, extras): | |
|
394 | log.debug("%sCalled post_pull of %s object", self.log_prefix, self) | |
|
395 | return self._call_hook(hooks_base.post_pull, extras) | |
|
396 | ||
|
397 | def pre_push(self, extras): | |
|
398 | log.debug("%sCalled pre_push of %s object", self.log_prefix, self) | |
|
399 | return self._call_hook(hooks_base.pre_push, extras) | |
|
400 | ||
|
401 | def post_push(self, extras): | |
|
402 | log.debug("%sCalled post_push of %s object", self.log_prefix, self) | |
|
403 | return self._call_hook(hooks_base.post_push, extras) | |
|
404 | ||
|
405 | def _call_hook(self, hook, extras): | |
|
406 | extras = AttributeDict(extras) | |
|
407 | server_url = extras['server_url'] | |
|
408 | ||
|
409 | extras.request = self.request | |
|
410 | ||
|
411 | try: | |
|
412 | result = hook(extras) | |
|
413 | if result is None: | |
|
414 | raise Exception( | |
|
415 | 'Failed to obtain hook result from func: {}'.format(hook)) | |
|
416 | except HTTPBranchProtected as handled_error: | |
|
417 | # These special cases don't need error reporting. It's a case of a | |
|
418 | # locked repo or protected branch | |
|
419 | result = AttributeDict({ | |
|
420 | 'status': handled_error.code, | |
|
421 | 'output': handled_error.explanation | |
|
422 | }) | |
|
423 | except (HTTPLockedRC, Exception) as error: | |
|
424 | # locked needs different handling since we need to also | |
|
425 | # handle PULL operations | |
|
426 | exc_tb = '' | |
|
427 | if not isinstance(error, HTTPLockedRC): | |
|
428 | exc_tb = traceback.format_exc() | |
|
429 | log.exception('%sException when handling hook %s', self.log_prefix, hook) | |
|
430 | error_args = error.args | |
|
431 | return { | |
|
432 | 'status': 128, | |
|
433 | 'output': '', | |
|
434 | 'exception': type(error).__name__, | |
|
435 | 'exception_traceback': exc_tb, | |
|
436 | 'exception_args': error_args, | |
|
437 | } | |
|
438 | finally: | |
|
439 | meta.Session.remove() | |
|
440 | ||
|
441 | log.debug('%sGot hook call response %s', self.log_prefix, result) | |
|
442 | return { | |
|
443 | 'status': result.status, | |
|
444 | 'output': result.output, | |
|
445 | } | |
|
446 | ||
|
447 | def __enter__(self): | |
|
448 | return self | |
|
449 | ||
|
450 | def __exit__(self, exc_type, exc_val, exc_tb): | |
|
451 | pass |
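Putting the removed pieces together, the callback flow is: start a `TCPServer` with a `BaseHTTPRequestHandler` subclass in a background thread, then POST a `{method, extras}` payload and read back a result dict. A self-contained sketch of that round-trip, using JSON only and an echo handler standing in for the real `Hooks` dispatch:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler
from socketserver import TCPServer
from urllib.request import Request, urlopen

class EchoHooksHandler(BaseHTTPRequestHandler):
    """Minimal stand-in for HooksHttpHandler: echoes the requested method."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        data = json.loads(self.rfile.read(length))
        result = {"status": 0, "output": f"called {data['method']}"}
        body = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-type", "text/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # keep request logging quiet
        pass

# Bind to port 0 so the OS picks a free port, as the daemon does via get_available_port.
server = TCPServer(("127.0.0.1", 0), EchoHooksHandler)
port = server.server_address[1]
thread = threading.Thread(target=server.serve_forever,
                          kwargs={"poll_interval": 0.01})
thread.daemon = True
thread.start()

payload = json.dumps({"method": "pre_push", "extras": {}}).encode("utf-8")
resp = urlopen(Request(f"http://127.0.0.1:{port}/", data=payload))
result = json.loads(resp.read())

server.shutdown()
thread.join()
print(result)
```

The real daemon layers txn-id verification, msgpack framing, and `Hooks` method dispatch on top of this skeleton, but the thread lifecycle — spawn `serve_forever`, then `shutdown()` and `join()` on context exit — is exactly what `HttpHooksCallbackDaemon._run`/`_stop` implement.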
@@ -1,89 +0,0 b'' | |||
|
1 | ||
|
2 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
3 | # | |
|
4 | # This program is free software: you can redistribute it and/or modify | |
|
5 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
6 | # (only), as published by the Free Software Foundation. | |
|
7 | # | |
|
8 | # This program is distributed in the hope that it will be useful, | |
|
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
11 | # GNU General Public License for more details. | |
|
12 | # | |
|
13 | # You should have received a copy of the GNU Affero General Public License | |
|
14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
15 | # | |
|
16 | # This program is dual-licensed. If you wish to learn more about the | |
|
17 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
18 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
19 | import os | |
|
20 | import logging | |
|
21 | ||
|
22 | from paste.script.command import Command, BadCommand | |
|
23 | ||
|
24 | ||
|
25 | class BasePasterCommand(Command): | |
|
26 | """ | |
|
27 | Abstract Base Class for paster commands. | |
|
28 | ||
|
29 | The celery commands are somewhat aggressive about loading | |
|
30 | celery.conf, and since our module sets the `CELERY_LOADER` | |
|
31 | environment variable to our loader, we have to bootstrap a bit and | |
|
32 | make sure we've had a chance to load the pylons config off of the | |
|
33 | command line, otherwise everything fails. | |
|
34 | """ | |
|
35 | min_args = 1 | |
|
36 | min_args_error = "Please provide a paster config file as an argument." | |
|
37 | takes_config_file = 1 | |
|
38 | requires_config_file = True | |
|
39 | ||
|
40 | def notify_msg(self, msg, log=False): | |
|
41 | """Make a notification to user, additionally if logger is passed | |
|
42 | it logs this action using given logger | |
|
43 | ||
|
44 | :param msg: message that will be printed to user | |
|
45 | :param log: logging instance, to use to additionally log this message | |
|
46 | ||
|
47 | """ | |
|
48 | if log and isinstance(log, logging): | |
|
49 | log(msg) | |
|
50 | ||
|
51 | def run(self, args): | |
|
52 | """ | |
|
53 | Overrides Command.run | |
|
54 | ||
|
55 | Checks for a config file argument and loads it. | |
|
56 | """ | |
|
57 | if len(args) < self.min_args: | |
|
58 | raise BadCommand( | |
|
59 | self.min_args_error % {'min_args': self.min_args, | |
|
60 | 'actual_args': len(args)}) | |
|
61 | ||
|
62 | # Decrement because we're going to lob off the first argument. | |
|
63 | # @@ This is hacky | |
|
64 | self.min_args -= 1 | |
|
65 | self.bootstrap_config(args[0]) | |
|
66 | self.update_parser() | |
|
67 | return super(BasePasterCommand, self).run(args[1:]) | |
|
68 | ||
|
69 | def update_parser(self): | |
|
70 | """ | |
|
71 | Abstract method. Allows for the class' parser to be updated | |
|
72 | before the superclass' `run` method is called. Necessary to | |
|
73 | allow options/arguments to be passed through to the underlying | |
|
74 | celery command. | |
|
75 | """ | |
|
76 | raise NotImplementedError("Abstract Method.") | |
|
77 | ||
|
78 | def bootstrap_config(self, conf): | |
|
79 | """ | |
|
80 | Loads the pylons configuration. | |
|
81 | """ | |
|
82 | self.path_to_ini_file = os.path.realpath(conf) | |
|
83 | ||
|
84 | def _init_session(self): | |
|
85 | """ | |
|
86 | Inits SqlAlchemy Session | |
|
87 | """ | |
|
88 | logging.config.fileConfig(self.path_to_ini_file) | |
|
89 |
|
1 | NO CONTENT: file was removed |
@@ -1,44 +0,0 b'' | |||
|
1 | ||
|
2 | ||
|
3 | # Copyright (C) 2013-2023 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | from rhodecode.lib.paster_commands import BasePasterCommand | |
|
23 | ||
|
24 | ||
|
25 | class Command(BasePasterCommand): | |
|
26 | """ | |
|
27 | Start the celery worker | |
|
28 | ||
|
29 | Starts the celery worker that uses a paste.deploy configuration | |
|
30 | file. | |
|
31 | """ | |
|
32 | usage = 'CONFIG_FILE' | |
|
33 | summary = __doc__.splitlines()[0] | |
|
34 | description = "".join(__doc__.splitlines()[2:]) | |
|
35 | ||
|
36 | parser = BasePasterCommand.standard_parser(quiet=True) | |
|
37 | ||
|
38 | def update_parser(self): | |
|
39 | pass | |
|
40 | ||
|
41 | def command(self): | |
|
42 | cmd = 'celery worker --task-events --beat --app rhodecode.lib.celerylib.loader --loglevel DEBUG --ini=%s' % self.path_to_ini_file | |
|
43 | raise Exception('This Command is deprecated please run: %s' % cmd) | |
|
44 |
@@ -1,42 +0,0 b'' | |||
|
1 | ||
|
2 | # Copyright (C) 2010-2023 RhodeCode GmbH | |
|
3 | # | |
|
4 | # This program is free software: you can redistribute it and/or modify | |
|
5 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
6 | # (only), as published by the Free Software Foundation. | |
|
7 | # | |
|
8 | # This program is distributed in the hope that it will be useful, | |
|
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
11 | # GNU General Public License for more details. | |
|
12 | # | |
|
13 | # You should have received a copy of the GNU Affero General Public License | |
|
14 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
15 | # | |
|
16 | # This program is dual-licensed. If you wish to learn more about the | |
|
17 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
18 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
19 | ||
|
20 | ||
|
21 | from rhodecode.lib.paster_commands import BasePasterCommand | |
|
22 | ||
|
23 | ||
|
24 | class Command(BasePasterCommand): | |
|
25 | """ | |
|
26 | Start the celery worker | |
|
27 | ||
|
28 | Starts the celery worker that uses a paste.deploy configuration | |
|
29 | file. | |
|
30 | """ | |
|
31 | usage = 'CONFIG_FILE [celeryd options...]' | |
|
32 | summary = __doc__.splitlines()[0] | |
|
33 | description = "".join(__doc__.splitlines()[2:]) | |
|
34 | ||
|
35 | parser = BasePasterCommand.standard_parser(quiet=True) | |
|
36 | ||
|
37 | def update_parser(self): | |
|
38 | pass | |
|
39 | ||
|
40 | def command(self): | |
|
41 | cmd = 'rc-setup-app %s' % self.path_to_ini_file | |
|
42 | raise Exception('This Command is deprecated please run: %s' % cmd) |
@@ -1,80 +0,0 b'' | |||
|
1 | ||
|
2 | ||
|
3 | # Copyright (C) 2013-2023 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | """ | |
|
22 | interactive shell paster command for RhodeCode | |
|
23 | """ | |
|
24 | ||
|
25 | import os | |
|
26 | import sys | |
|
27 | import logging | |
|
28 | ||
|
29 | from rhodecode.lib.paster_commands import BasePasterCommand | |
|
30 | ||
|
31 | # fix rhodecode import | |
|
32 | from os.path import dirname as dn | |
|
33 | rc_path = dn(dn(dn(os.path.realpath(__file__)))) | |
|
34 | sys.path.append(rc_path) | |
|
35 | ||
|
36 | log = logging.getLogger(__name__) | |
|
37 | ||
|
38 | welcome_banner = """Welcome to RhodeCode iShell. | |
|
39 | Type `exit` to exit the shell. | |
|
40 | iShell is an interactive shell to interact directly with the | |
|
41 | internal RhodeCode APIs. You can rescue your lost password, | |
|
42 | or reset some user/system settings. | |
|
43 | """ | |
|
44 | ||
|
45 | ||
|
46 | class Command(BasePasterCommand): | |
|
47 | ||
|
48 | max_args = 1 | |
|
49 | min_args = 1 | |
|
50 | ||
|
51 | usage = "CONFIG_FILE" | |
|
52 | group_name = "RhodeCode" | |
|
53 | takes_config_file = -1 | |
|
54 | parser = BasePasterCommand.standard_parser(verbose=True) | |
|
55 | summary = "Interactive shell" | |
|
56 | ||
|
57 | def command(self): | |
|
58 | #get SqlAlchemy session | |
|
59 | self._init_session() | |
|
60 | ||
|
61 | # imports, used in ipython shell | |
|
62 | import os | |
|
63 | import sys | |
|
64 | import time | |
|
65 | import shutil | |
|
66 | import datetime | |
|
67 | from rhodecode.model.db import * | |
|
68 | ||
|
69 | try: | |
|
70 | from IPython import embed | |
|
71 | from traitlets.config import Config | |
|
72 | cfg = Config() | |
|
73 | cfg.InteractiveShellEmbed.confirm_exit = False | |
|
74 | embed(config=cfg, banner1=welcome_banner) | |
|
75 | except ImportError: | |
|
76 | print('ipython installation required for ishell') | |
|
77 | sys.exit(-1) | |
|
78 | ||
|
79 | def update_parser(self): | |
|
80 | pass |
@@ -1,63 +0,0 b''

# Copyright (C) 2010-2023 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging

from rhodecode.lib.paster_commands import BasePasterCommand, Command

log = logging.getLogger(__name__)


class UpgradeDb(BasePasterCommand):
    """
    Command used by paster to upgrade our database to a newer version
    """

    max_args = 1
    min_args = 1

    usage = "CONFIG_FILE"
    summary = "Upgrades current db to newer version"
    group_name = "RhodeCode"

    parser = Command.standard_parser(verbose=True)

    def command(self):
        from rhodecode.lib.rc_commands import upgrade_db
        upgrade_db.command(
            self.path_to_ini_file, self.options.__dict__.get('force_ask'), None)

    def update_parser(self):
        self.parser.add_option('--sql',
                               action='store_true',
                               dest='just_sql',
                               help="Prints upgrade sql for further investigation",
                               default=False)

        self.parser.add_option('--force-yes',
                               action='store_true',
                               dest='force_ask',
                               default=None,
                               help='Force yes to every question')
        self.parser.add_option('--force-no',
                               action='store_false',
                               dest='force_ask',
                               default=None,
                               help='Force no to every question')
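The paired `--force-yes`/`--force-no` options in the removed command share a single `dest`, which yields a tri-state flag: `True`, `False`, or `None` when neither option is passed (so the caller can still prompt interactively). A small sketch of that `optparse` pattern, independent of the RhodeCode classes:

```python
from optparse import OptionParser

# Two options writing to the same dest; the shared default of None
# survives only when neither flag appears on the command line.
parser = OptionParser()
parser.add_option('--force-yes', action='store_true', dest='force_ask',
                  default=None, help='Force yes to every question')
parser.add_option('--force-no', action='store_false', dest='force_ask',
                  default=None, help='Force no to every question')

opts, _ = parser.parse_args(['--force-yes'])
print(opts.force_ask)  # True
opts, _ = parser.parse_args([])
print(opts.force_ask)  # None
```

`upgrade_db.command(...)` then receives `force_ask` directly, which is why the code reads it with `self.options.__dict__.get('force_ask')` rather than assuming a boolean.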