release: Merge default into stable for release preparation
milka -
r4671:d6153155 merge stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,76 b''
1 |RCE| 4.25.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2021-04-02
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - SSH: allow clone by ID via SSH operations.
14 - Artifacts: added an admin panel to manage artifacts.
15 - Redmine: added an option to add a note to a ticket without changing its status.
16
17
18 General
19 ^^^^^^^
20
21 - Git: changed lookup logic to prioritize reference names over numerical ids.
22   Numerical ids are supported as a fallback if ref matching is unsuccessful.
23 - Permissions: changed the fork permission help text to reflect how forking actually works.
24 - Permissions: flush permissions on owner changes for repositories and repository groups.
25   This fixes a problem where, after a repository's owner changed, the new owner
26   lacked permissions until the cache expired.
27 - Artifacts: added API function to remove artifacts.
28 - Archives: use a special name for non-hashed archives to fix caching issues.
29 - Packaging: fixed a few package requirements for proper builds.
30 - Packaging: fixed rhodecode-tools for Docker builds.
31 - Packaging: fixed problems caused by the latest setuptools-scm release.
32 - Packaging: added setuptools-scm to the packages required for builds.
33 - Packaging: fixed the jira package for reproducible builds.
34 - Packaging: fixed zipp package patches.
35
36
37 Security
38 ^^^^^^^^
39
40 - Comments: forbid removal of comments by anyone except their owners.
41   Previously, repository admins could remove them by constructing a special URL with data.
42 - Pull requests: fixed XSS problems that occurred when a deleted file with special characters in its name was commented on.
43
44
45 Performance
46 ^^^^^^^^^^^
47
48 - License: skip the channelstream connect during license checks to reduce call handling times.
49 - Core: optimized some calls to skip license/SCM detection, since each license check is
50   expensive and not needed on every call.
51
52
53 Fixes
54 ^^^^^
55
56 - Branch-permissions: fixed the CE view. Fixes #5656
57 - Feed: fixed errors on feed access for empty repositories.
58 - Archives: if an implicit ref name was used (e.g. master) to obtain an archive, we now
59   redirect to the explicit commit SHA so reference names can be cached properly.
60 - rcextensions: fixed pre-files extractor return-code support.
61 - Svn: fixed subprocess problems in some of the file-checking calls.
62 - Pull requests: fixed repeated listing of referenced tickets in the pull request summary sidebar.
63 - Maintenance: fixed a bad routes definition.
64 - clone-uri: fixed key-mismatch problems that caused errors on the summary page.
65 - Largefiles: fixed downloading of largefiles that have no extension in the file name.
66 - Compare: fixed a referenced-commits bug.
67 - Git: fixed handling of Unicode branch names.
68
69
70 Upgrade notes
71 ^^^^^^^^^^^^^
72
73 - Scheduled release 4.25.0.
74
75
76
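The reworked Git lookup order described in the General section above (reference names first, numerical ids only as a fallback) can be sketched as follows. This is a hypothetical helper for illustration only, not RhodeCode's actual implementation; the function name and data shapes are assumptions.

```python
def resolve_ref(ref, named_refs, commit_ids):
    """Resolve a user-supplied ref, preferring reference names.

    named_refs: mapping of branch/tag names to commit SHAs.
    commit_ids: ordered list of commit SHAs, indexable by numerical id.
    Numerical ids are only tried when name matching fails.
    """
    if ref in named_refs:
        return named_refs[ref]
    if ref.isdigit() and int(ref) < len(commit_ids):
        return commit_ids[int(ref)]
    raise LookupError("unknown ref: %s" % ref)

# A branch literally named "42" wins over commit index 42.
refs = {"master": "abc123", "42": "def456"}
commits = ["c%d" % i for i in range(100)]
```

With these inputs, `resolve_ref("42", refs, commits)` returns the branch SHA rather than `commits[42]`, while a ref such as `"7"` that matches no name falls back to the numerical id.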
@@ -0,0 +1,13 b''
1 diff -rup channelstream-0.6.14-orig/setup.py channelstream-0.6.14/setup.py
2
3 --- channelstream-0.6.14/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
4 +++ channelstream-0.6.14/setup.py 2021-03-11 12:34:56.000000000 +0100
5 @@ -52,7 +52,7 @@ setup(
6 include_package_data=True,
7 install_requires=requires,
8 python_requires=">=2.7",
9 - setup_requires=["pytest-runner"],
10 + setup_requires=["pytest-runner==5.1.0"],
11 extras_require={
12 "dev": ["coverage", "pytest", "pyramid", "tox", "mock", "webtest"],
13 "lint": ["black"],
@@ -0,0 +1,10 b''
1 diff -rup configparser-4.0.2-orig/pyproject.toml configparser-4.0.2/pyproject.toml
2 --- configparser-4.0.2-orig/pyproject.toml 2021-03-22 21:28:11.000000000 +0100
3 +++ configparser-4.0.2/pyproject.toml 2021-03-22 21:28:11.000000000 +0100
4 @@ -1,5 +1,5 @@
5 [build-system]
6 -requires = ["setuptools>=40.7", "wheel", "setuptools_scm>=1.15"]
7 +requires = ["setuptools<=42.0", "wheel", "setuptools_scm<6.0.0"]
8 build-backend = "setuptools.build_meta"
9
10 [tool.black]
@@ -0,0 +1,7 b''
1 diff -rup importlib-metadata-1.6.0-orig/yproject.toml importlib-metadata-1.6.0/pyproject.toml
2 --- importlib-metadata-1.6.0-orig/yproject.toml 2021-03-22 22:10:33.000000000 +0100
3 +++ importlib-metadata-1.6.0/pyproject.toml 2021-03-22 22:11:09.000000000 +0100
4 @@ -1,3 +1,3 @@
5 [build-system]
6 -requires = ["setuptools>=30.3", "wheel", "setuptools_scm"]
7 +requires = ["setuptools<42.0", "wheel", "setuptools_scm<6.0.0"]
@@ -0,0 +1,12 b''
1 diff -rup pyramid-apispec-0.3.2-orig/setup.py pyramid-apispec-0.3.2/setup.py
2 --- pyramid-apispec-0.3.2-orig/setup.py 2021-03-11 11:19:26.000000000 +0100
3 +++ pyramid-apispec-0.3.2/setup.py 2021-03-11 11:19:51.000000000 +0100
4 @@ -44,7 +44,7 @@ setup(
5 packages=find_packages(exclude=["contrib", "docs", "tests"]),
6 package_data={"pyramid_apispec": ["static/*.*"], "": ["LICENSE"]},
7 install_requires=["apispec[yaml]==1.0.0"],
8 - setup_requires=["pytest-runner"],
9 + setup_requires=["pytest-runner==5.1"],
10 extras_require={
11 "dev": ["coverage", "pytest", "pyramid", "tox", "webtest"],
12 "demo": ["marshmallow==2.15.3", "pyramid", "apispec", "webtest"], No newline at end of file
@@ -0,0 +1,12 b''
1 diff -rup rhodecode-tools-1.4.0-orig/setup.py rhodecode-tools-1.4.0/setup.py
2 --- rhodecode-tools-1.4.0/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
3 +++ rhodecode-tools-1.4.0/setup.py 2021-03-11 12:34:56.000000000 +0100
4 @@ -69,7 +69,7 @@ def _get_requirements(req_filename, excl
5
6
7 # requirements extract
8 -setup_requirements = ['pytest-runner']
9 +setup_requirements = ['pytest-runner==5.1.0']
10 install_requirements = _get_requirements(
11 'requirements.txt', exclude=['setuptools'])
12 test_requirements = _get_requirements('requirements_test.txt') No newline at end of file
@@ -0,0 +1,10 b''
1 diff -rup zip-1.2.0-orig/pyproject.toml zip-1.2.0/pyproject.toml
2 --- zip-1.2.0-orig/pyproject.toml 2021-03-23 10:55:37.000000000 +0100
3 +++ zip-1.2.0/pyproject.toml 2021-03-23 10:56:05.000000000 +0100
4 @@ -1,5 +1,5 @@
5 [build-system]
6 -requires = ["setuptools>=34.4", "wheel", "setuptools_scm>=1.15"]
7 +requires = ["setuptools<42.0", "wheel", "setuptools_scm<6.0.0"]
8 build-backend = "setuptools.build_meta"
9
10 [tool.black]
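A recurring pattern in the patches above is replacing open-ended build requirements with upper-bounded ones (e.g. `setuptools<42.0`, `setuptools_scm<6.0.0`). A minimal, hypothetical check for that pattern, using only the standard library (the helper name and regex are assumptions, not part of any build tooling used here):

```python
import re

def has_upper_bound(requirement):
    """Return True if a requirement string caps the version from above,
    e.g. 'setuptools<42.0', 'setuptools_scm<6.0.0', or an exact '==' pin.
    Lower bounds like 'setuptools>=40.7' are not caps and return False."""
    return bool(re.search(r"(<=?|==)\s*[0-9][0-9.]*", requirement))

# Requirements as patched in the pyproject.toml files shown above.
requires = ["setuptools<=42.0", "wheel", "setuptools_scm<6.0.0"]
unbounded = [r for r in requires if not has_upper_bound(r)]
```

Here `unbounded` is `['wheel']`; a pre-patch requirement such as `setuptools_scm>=1.15` would also be flagged, which is exactly the form these patches remove to keep builds reproducible.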
@@ -0,0 +1,74 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import mock
22 import pytest
23
24 from rhodecode.lib.utils2 import str2bool
25 from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
26 from rhodecode.model.db import Repository, UserRepoToPerm, Permission, User
27 from rhodecode.model.meta import Session
28 from rhodecode.tests import (
29 TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN, assert_session_flash)
30 from rhodecode.tests.fixture import Fixture
31
32 fixture = Fixture()
33
34
35 def route_path(name, params=None, **kwargs):
36 import urllib
37
38 base_url = {
39 'edit_repo_maintenance': '/{repo_name}/settings/maintenance',
40 'edit_repo_maintenance_execute': '/{repo_name}/settings/maintenance/execute',
41
42 }[name].format(**kwargs)
43
44 if params:
45 base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
46 return base_url
47
48
49 def _get_permission_for_user(user, repo):
50 perm = UserRepoToPerm.query()\
51 .filter(UserRepoToPerm.repository ==
52 Repository.get_by_repo_name(repo))\
53 .filter(UserRepoToPerm.user == User.get_by_username(user))\
54 .all()
55 return perm
56
57
58 @pytest.mark.usefixtures('autologin_user', 'app')
59 class TestAdminRepoMaintenance(object):
60 @pytest.mark.parametrize('urlname', [
61 'edit_repo_maintenance',
62 ])
63 def test_show_page(self, urlname, app, backend):
64 app.get(route_path(urlname, repo_name=backend.repo_name), status=200)
65
66 def test_execute_maintenance_for_repo_hg(self, app, backend_hg, autologin_user, xhr_header):
67 repo_name = backend_hg.repo_name
68
69 response = app.get(
70 route_path('edit_repo_maintenance_execute',
71 repo_name=repo_name,),
72 extra_environ=xhr_header)
73
74 assert "HG Verify repo" in ''.join(response.json)
@@ -1,6 +1,5 b''
1 1 [bumpversion]
2 current_version = 4.24.1
2 current_version = 4.25.0
3 3 message = release: Bump version {current_version} to {new_version}
4 4
5 5 [bumpversion:file:rhodecode/VERSION]
6
@@ -1,33 +1,28 b''
1 1 [DEFAULT]
2 2 done = false
3 3
4 4 [task:bump_version]
5 5 done = true
6 6
7 7 [task:rc_tools_pinned]
8 done = true
9 8
10 9 [task:fixes_on_stable]
11 done = true
12 10
13 11 [task:pip2nix_generated]
14 done = true
15 12
16 13 [task:changelog_updated]
17 done = true
18 14
19 15 [task:generate_api_docs]
20 done = true
16
17 [task:updated_translation]
21 18
22 19 [release]
23 state = prepared
24 version = 4.24.1
25
26 [task:updated_translation]
20 state = in_progress
21 version = 4.25.0
27 22
28 23 [task:generate_js_routes]
29 24
30 25 [task:updated_trial_license]
31 26
32 27 [task:generate_oss_licenses]
33 28
@@ -1,184 +1,202 b''
1 1 .. _store-methods-ref:
2 2
3 3 store methods
4 4 =============
5 5
6 6 file_store_add (EE only)
7 7 ------------------------
8 8
9 9 .. py:function:: file_store_add(apiuser, filename, content, description=<Optional:''>)
10 10
11 11 Upload API for the file_store
12 12
13 13 Example usage from CLI::
14 14 rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg"}"
15 15
16 16 This command takes the following options:
17 17
18 18 :param apiuser: This is filled automatically from the |authtoken|.
19 19 :type apiuser: AuthUser
20 20 :param filename: name of the file uploaded
21 21 :type filename: str
22 22 :param description: Optional description for added file
23 23 :type description: str
24 24 :param content: base64 encoded content of the uploaded file
25 25 :type content: str
26 26
27 27 Example output:
28 28
29 29 .. code-block:: bash
30 30
31 31 id : <id_given_in_input>
32 32 result: {
33 33 "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
34 34 "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
35 35 "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg"
36 36 }
37 37 error : null
38 38
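The CLI example above pipes the file through base64 before upload; the same arguments can be prepared from Python. This is a hedged sketch of building the JSON-RPC arguments only — the helper name and the idea of serializing with `json.dumps` are assumptions for illustration, not RhodeCode's client library:

```python
import base64
import json

def build_upload_args(filename, data, description=""):
    """Assemble the argument dict for file_store_add. The API expects
    the file content base64-encoded, mirroring `cat image.jpg | base64`."""
    return {
        "filename": filename,
        "content": base64.b64encode(data).decode("ascii"),
        "description": description,
    }

args = build_upload_args("hello.txt", b"hello", "my upload")
payload = json.dumps(args)  # ready to pass as the API call's JSON arguments
```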
39 39
40 40 file_store_add_with_acl (EE only)
41 41 ---------------------------------
42 42
43 43 .. py:function:: file_store_add_with_acl(apiuser, filename, content, description=<Optional:''>, scope_user_id=<Optional:None>, scope_repo_id=<Optional:None>, scope_repo_group_id=<Optional:None>)
44 44
45 45 Upload API for the file_store
46 46
47 47 Example usage from CLI::
48 48 rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg", "scope_repo_id":101}"
49 49
50 50 This command takes the following options:
51 51
52 52 :param apiuser: This is filled automatically from the |authtoken|.
53 53 :type apiuser: AuthUser
54 54 :param filename: name of the file uploaded
55 55 :type filename: str
56 56 :param description: Optional description for added file
57 57 :type description: str
58 58 :param content: base64 encoded content of the uploaded file
59 59 :type content: str
60 60
61 61 :param scope_user_id: Optionally bind this file to user.
62 62 This will check ACL in such way only this user can access the file.
63 63 :type scope_user_id: int
64 64 :param scope_repo_id: Optionally bind this file to repository.
65 65 This will check ACL in such way only user with proper access to such
66 66 repository can access the file.
67 67 :type scope_repo_id: int
68 68 :param scope_repo_group_id: Optionally bind this file to repository group.
69 69 This will check ACL in such way only user with proper access to such
70 70 repository group can access the file.
71 71 :type scope_repo_group_id: int
72 72
73 73 Example output:
74 74
75 75 .. code-block:: bash
76 76
77 77 id : <id_given_in_input>
78 78 result: {
79 79 "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
80 80 "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
81 81 "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg"
82 82 }
83 83 error : null
84 84
85 85
86 86 file_store_get_info (EE only)
87 87 -----------------------------
88 88
89 89 .. py:function:: file_store_get_info(apiuser, store_fid)
90 90
91 91 Get artifact data.
92 92
93 93 Example output:
94 94
95 95 .. code-block:: bash
96 96
97 97 id : <id_given_in_input>
98 98 result: {
99 99 "artifact": {
100 100 "access_path_fqn": "https://rhodecode.example.com/_file_store/download/0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
101 101 "created_on": "2019-10-15T16:25:35.491",
102 102 "description": "my upload",
103 103 "downloaded_times": 1,
104 104 "file_uid": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
105 105 "filename": "example.jpg",
106 106 "filename_org": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
107 107 "hidden": false,
108 108 "metadata": [
109 109 {
110 110 "artifact": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
111 111 "key": "yellow",
112 112 "section": "tags",
113 113 "value": "bar"
114 114 }
115 115 ],
116 116 "sha256": "818dff0f44574dfb6814d38e6bf3c60c5943d1d13653398ecddaedf2f6a5b04d",
117 117 "size": 18599,
118 118 "uploaded_by": {
119 119 "email": "admin@rhodecode.com",
120 120 "emails": [
121 121 "admin@rhodecode.com"
122 122 ],
123 123 "firstname": "Admin",
124 124 "lastname": "LastName",
125 125 "user_id": 2,
126 126 "username": "admin"
127 127 }
128 128 }
129 129 }
130 130 error : null
131 131
132 132
133 file_store_delete (EE only)
134 ---------------------------
135
136 .. py:function:: file_store_delete(apiuser, store_fid)
137
138 Delete an artifact based on the secret uuid.
139
140 Example output:
141
142 .. code-block:: bash
143
144 id : <id_given_in_input>
145 result: {
146 "artifact" : {"uid": "some uid", "removed": true}
147 }
148 error : null
149
150
133 151 file_store_add_metadata (EE only)
134 152 ---------------------------------
135 153
136 154 .. py:function:: file_store_add_metadata(apiuser, store_fid, section, key, value, value_type=<Optional:'unicode'>)
137 155
138 156     Add metadata to an artifact. The metadata consists of section, key, value, e.g.
139 157     section='tags', key='tag_name', value='1'
140 158
141 159 :param apiuser: This is filled automatically from the |authtoken|.
142 160 :type apiuser: AuthUser
143 161
144 162 :param store_fid: file uid, e.g 0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4
145 163 :type store_fid: str
146 164
147 165 :param section: Section name to add metadata
148 166 :type section: str
149 167
150 168 :param key: Key to add as metadata
151 169 :type key: str
152 170
153 171 :param value: Value to add as metadata
154 172 :type value: str
155 173
156 174 :param value_type: Optional type, default is 'unicode' other types are:
157 175 int, list, bool, unicode, str
158 176
159 177 :type value_type: str
160 178
161 179 Example output:
162 180
163 181 .. code-block:: bash
164 182
165 183 id : <id_given_in_input>
166 184 result: {
167 185 "metadata": [
168 186 {
169 187 "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4",
170 188 "key": "secret",
171 189 "section": "tags",
172 190 "value": "1"
173 191 },
174 192 {
175 193 "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4",
176 194 "key": "video",
177 195 "section": "tags",
178 196 "value": "1"
179 197 }
180 198 ]
181 199 }
182 200 error : null
183 201
184 202
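Since file_store_add_metadata takes a flat section/key/value triple with an optional value_type, the call arguments can be validated before issuing the API call. A hypothetical sketch — the allowed type list comes from the parameter description above, but the helper itself is an assumption, not part of RhodeCode:

```python
# Value types documented for file_store_add_metadata's value_type parameter.
ALLOWED_VALUE_TYPES = ("unicode", "str", "int", "list", "bool")

def build_metadata_args(store_fid, section, key, value, value_type="unicode"):
    """Assemble arguments for file_store_add_metadata, rejecting
    value types the API does not document."""
    if value_type not in ALLOWED_VALUE_TYPES:
        raise ValueError("unsupported value_type: %r" % value_type)
    return {
        "store_fid": store_fid,
        "section": section,
        "key": key,
        "value": value,
        "value_type": value_type,
    }

args = build_metadata_args(
    "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4", "tags", "video", "1")
```

This would reproduce the `section='tags', key='video', value='1'` entry shown in the example output, with `value_type` defaulting to `'unicode'`.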
@@ -1,153 +1,154 b''
1 1 .. _rhodecode-release-notes-ref:
2 2
3 3 Release Notes
4 4 =============
5 5
6 6 |RCE| 4.x Versions
7 7 ------------------
8 8
9 9 .. toctree::
10 10 :maxdepth: 1
11 11
12 release-notes-4.25.0.rst
12 13 release-notes-4.24.1.rst
13 14 release-notes-4.24.0.rst
14 15 release-notes-4.23.2.rst
15 16 release-notes-4.23.1.rst
16 17 release-notes-4.23.0.rst
17 18 release-notes-4.22.0.rst
18 19 release-notes-4.21.0.rst
19 20 release-notes-4.20.1.rst
20 21 release-notes-4.20.0.rst
21 22 release-notes-4.19.3.rst
22 23 release-notes-4.19.2.rst
23 24 release-notes-4.19.1.rst
24 25 release-notes-4.19.0.rst
25 26 release-notes-4.18.3.rst
26 27 release-notes-4.18.2.rst
27 28 release-notes-4.18.1.rst
28 29 release-notes-4.18.0.rst
29 30 release-notes-4.17.4.rst
30 31 release-notes-4.17.3.rst
31 32 release-notes-4.17.2.rst
32 33 release-notes-4.17.1.rst
33 34 release-notes-4.17.0.rst
34 35 release-notes-4.16.2.rst
35 36 release-notes-4.16.1.rst
36 37 release-notes-4.16.0.rst
37 38 release-notes-4.15.2.rst
38 39 release-notes-4.15.1.rst
39 40 release-notes-4.15.0.rst
40 41 release-notes-4.14.1.rst
41 42 release-notes-4.14.0.rst
42 43 release-notes-4.13.3.rst
43 44 release-notes-4.13.2.rst
44 45 release-notes-4.13.1.rst
45 46 release-notes-4.13.0.rst
46 47 release-notes-4.12.4.rst
47 48 release-notes-4.12.3.rst
48 49 release-notes-4.12.2.rst
49 50 release-notes-4.12.1.rst
50 51 release-notes-4.12.0.rst
51 52 release-notes-4.11.6.rst
52 53 release-notes-4.11.5.rst
53 54 release-notes-4.11.4.rst
54 55 release-notes-4.11.3.rst
55 56 release-notes-4.11.2.rst
56 57 release-notes-4.11.1.rst
57 58 release-notes-4.11.0.rst
58 59 release-notes-4.10.6.rst
59 60 release-notes-4.10.5.rst
60 61 release-notes-4.10.4.rst
61 62 release-notes-4.10.3.rst
62 63 release-notes-4.10.2.rst
63 64 release-notes-4.10.1.rst
64 65 release-notes-4.10.0.rst
65 66 release-notes-4.9.1.rst
66 67 release-notes-4.9.0.rst
67 68 release-notes-4.8.0.rst
68 69 release-notes-4.7.2.rst
69 70 release-notes-4.7.1.rst
70 71 release-notes-4.7.0.rst
71 72 release-notes-4.6.1.rst
72 73 release-notes-4.6.0.rst
73 74 release-notes-4.5.2.rst
74 75 release-notes-4.5.1.rst
75 76 release-notes-4.5.0.rst
76 77 release-notes-4.4.2.rst
77 78 release-notes-4.4.1.rst
78 79 release-notes-4.4.0.rst
79 80 release-notes-4.3.1.rst
80 81 release-notes-4.3.0.rst
81 82 release-notes-4.2.1.rst
82 83 release-notes-4.2.0.rst
83 84 release-notes-4.1.2.rst
84 85 release-notes-4.1.1.rst
85 86 release-notes-4.1.0.rst
86 87 release-notes-4.0.1.rst
87 88 release-notes-4.0.0.rst
88 89
89 90 |RCE| 3.x Versions
90 91 ------------------
91 92
92 93 .. toctree::
93 94 :maxdepth: 1
94 95
95 96 release-notes-3.8.4.rst
96 97 release-notes-3.8.3.rst
97 98 release-notes-3.8.2.rst
98 99 release-notes-3.8.1.rst
99 100 release-notes-3.8.0.rst
100 101 release-notes-3.7.1.rst
101 102 release-notes-3.7.0.rst
102 103 release-notes-3.6.1.rst
103 104 release-notes-3.6.0.rst
104 105 release-notes-3.5.2.rst
105 106 release-notes-3.5.1.rst
106 107 release-notes-3.5.0.rst
107 108 release-notes-3.4.1.rst
108 109 release-notes-3.4.0.rst
109 110 release-notes-3.3.4.rst
110 111 release-notes-3.3.3.rst
111 112 release-notes-3.3.2.rst
112 113 release-notes-3.3.1.rst
113 114 release-notes-3.3.0.rst
114 115 release-notes-3.2.3.rst
115 116 release-notes-3.2.2.rst
116 117 release-notes-3.2.1.rst
117 118 release-notes-3.2.0.rst
118 119 release-notes-3.1.1.rst
119 120 release-notes-3.1.0.rst
120 121 release-notes-3.0.2.rst
121 122 release-notes-3.0.1.rst
122 123 release-notes-3.0.0.rst
123 124
124 125 |RCE| 2.x Versions
125 126 ------------------
126 127
127 128 .. toctree::
128 129 :maxdepth: 1
129 130
130 131 release-notes-2.2.8.rst
131 132 release-notes-2.2.7.rst
132 133 release-notes-2.2.6.rst
133 134 release-notes-2.2.5.rst
134 135 release-notes-2.2.4.rst
135 136 release-notes-2.2.3.rst
136 137 release-notes-2.2.2.rst
137 138 release-notes-2.2.1.rst
138 139 release-notes-2.2.0.rst
139 140 release-notes-2.1.0.rst
140 141 release-notes-2.0.2.rst
141 142 release-notes-2.0.1.rst
142 143 release-notes-2.0.0.rst
143 144
144 145 |RCE| 1.x Versions
145 146 ------------------
146 147
147 148 .. toctree::
148 149 :maxdepth: 1
149 150
150 151 release-notes-1.7.2.rst
151 152 release-notes-1.7.1.rst
152 153 release-notes-1.7.0.rst
153 154 release-notes-1.6.0.rst
@@ -1,12 +1,12 b''
1 1 diff -rup pytest-4.6.5-orig/setup.py pytest-4.6.5/setup.py
2 2 --- pytest-4.6.5-orig/setup.py 2018-04-10 10:23:04.000000000 +0200
3 3 +++ pytest-4.6.5/setup.py 2018-04-10 10:23:34.000000000 +0200
4 4 @@ -24,7 +24,7 @@ INSTALL_REQUIRES = [
5 5 def main():
6 6 setup(
7 7 use_scm_version={"write_to": "src/_pytest/_version.py"},
8 8 - setup_requires=["setuptools-scm", "setuptools>=40.0"],
9 + setup_requires=["setuptools-scm", "setuptools<=42.0"],
9 + setup_requires=["setuptools-scm<6.0.0", "setuptools<=42.0"],
10 10 package_dir={"": "src"},
11 11 # fmt: off
12 12 extras_require={ No newline at end of file
@@ -1,287 +1,353 b''
1 1 # Overrides for the generated python-packages.nix
2 2 #
3 3 # This function is intended to be used as an extension to the generated file
4 4 # python-packages.nix. The main objective is to add needed dependencies of C
5 5 # libraries and tweak the build instructions where needed.
6 6
7 7 { pkgs
8 8 , basePythonPackages
9 9 }:
10 10
11 11 let
12 12 sed = "sed -i";
13 13
14 14 localLicenses = {
15 15 repoze = {
16 16 fullName = "Repoze License";
17 17 url = http://www.repoze.org/LICENSE.txt;
18 18 };
19 19 };
20 20
21 21 in
22 22
23 23 self: super: {
24 24
25 25 "appenlight-client" = super."appenlight-client".override (attrs: {
26 26 meta = {
27 27 license = [ pkgs.lib.licenses.bsdOriginal ];
28 28 };
29 29 });
30 30
31 31 "beaker" = super."beaker".override (attrs: {
32 32 patches = [
33 33 ./patches/beaker/patch-beaker-lock-func-debug.diff
34 34 ./patches/beaker/patch-beaker-metadata-reuse.diff
35 35 ./patches/beaker/patch-beaker-improved-redis.diff
36 36 ./patches/beaker/patch-beaker-improved-redis-2.diff
37 37 ];
38 38 });
39 39
40 40 "cffi" = super."cffi".override (attrs: {
41 41 buildInputs = [
42 42 pkgs.libffi
43 43 ];
44 44 });
45 45
46 46 "cryptography" = super."cryptography".override (attrs: {
47 47 buildInputs = [
48 48 pkgs.openssl
49 49 ];
50 50 });
51 51
52 52 "gevent" = super."gevent".override (attrs: {
53 53 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
54 54 # NOTE: (marcink) odd requirements from gevent aren't set properly,
55 55 # thus we need to inject psutil manually
56 56 self."psutil"
57 57 ];
58 58 });
59 59
60 60 "future" = super."future".override (attrs: {
61 61 meta = {
62 62 license = [ pkgs.lib.licenses.mit ];
63 63 };
64 64 });
65 65
66 66 "testpath" = super."testpath".override (attrs: {
67 67 meta = {
68 68 license = [ pkgs.lib.licenses.mit ];
69 69 };
70 70 });
71 71
72 72 "gnureadline" = super."gnureadline".override (attrs: {
73 73 buildInputs = [
74 74 pkgs.ncurses
75 75 ];
76 76 patchPhase = ''
77 77 substituteInPlace setup.py --replace "/bin/bash" "${pkgs.bash}/bin/bash"
78 78 '';
79 79 });
80 80
81 81 "gunicorn" = super."gunicorn".override (attrs: {
82 82 propagatedBuildInputs = [
83 83 # johbo: futures is needed as long as we are on Python 2, otherwise
84 84 # gunicorn explodes if used with multiple threads per worker.
85 85 self."futures"
86 86 ];
87 87 });
88 88
89 89 "nbconvert" = super."nbconvert".override (attrs: {
90 90 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
91 91 # marcink: plug in jupyter-client for notebook rendering
92 92 self."jupyter-client"
93 93 ];
94 94 });
95 95
96 96 "ipython" = super."ipython".override (attrs: {
97 97 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
98 98 self."gnureadline"
99 99 ];
100 100 });
101 101
102 102 "lxml" = super."lxml".override (attrs: {
103 103 buildInputs = [
104 104 pkgs.libxml2
105 105 pkgs.libxslt
106 106 ];
107 107 propagatedBuildInputs = [
108 108 # Needed, so that "setup.py bdist_wheel" does work
109 109 self."wheel"
110 110 ];
111 111 });
112 112
113 113 "mysql-python" = super."mysql-python".override (attrs: {
114 114 buildInputs = [
115 115 pkgs.openssl
116 116 ];
117 117 propagatedBuildInputs = [
118 118 pkgs.libmysql
119 119 pkgs.zlib
120 120 ];
121 121 });
122 122
123 123 "psycopg2" = super."psycopg2".override (attrs: {
124 124 propagatedBuildInputs = [
125 125 pkgs.postgresql
126 126 ];
127 127 meta = {
128 128 license = pkgs.lib.licenses.lgpl3Plus;
129 129 };
130 130 });
131 131
132 132 "pycurl" = super."pycurl".override (attrs: {
133 133 propagatedBuildInputs = [
134 134 pkgs.curl
135 135 pkgs.openssl
136 136 ];
137 137
138 138 preConfigure = ''
139 139 substituteInPlace setup.py --replace '--static-libs' '--libs'
140 140 export PYCURL_SSL_LIBRARY=openssl
141 141 '';
142 142
143 143 meta = {
144 144 license = pkgs.lib.licenses.mit;
145 145 };
146 146 });
147 147
148 148 "pyramid" = super."pyramid".override (attrs: {
149 149 meta = {
150 150 license = localLicenses.repoze;
151 151 };
152 152 });
153 153
154 154 "pyramid-debugtoolbar" = super."pyramid-debugtoolbar".override (attrs: {
155 155 meta = {
156 156 license = [ pkgs.lib.licenses.bsdOriginal localLicenses.repoze ];
157 157 };
158 158 });
159 159
160 160 "pysqlite" = super."pysqlite".override (attrs: {
161 161 propagatedBuildInputs = [
162 162 pkgs.sqlite
163 163 ];
164 164 meta = {
165 165 license = [ pkgs.lib.licenses.zlib pkgs.lib.licenses.libpng ];
166 166 };
167 167 });
168 168
169 169 "python-ldap" = super."python-ldap".override (attrs: {
170 170 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
171 171 pkgs.openldap
172 172 pkgs.cyrus_sasl
173 173 pkgs.openssl
174 174 ];
175 175 });
176 176
177 177 "python-pam" = super."python-pam".override (attrs: {
178 178 propagatedBuildInputs = [
179 179 pkgs.pam
180 180 ];
181 181
182 182 # TODO: johbo: Check if this can be avoided, or transform into
183 183 # a real patch
184 184 patchPhase = ''
185 185 substituteInPlace pam.py \
186 186 --replace 'find_library("pam")' '"${pkgs.pam}/lib/libpam.so.0"'
187 187 '';
188 188
189 189 });
190 190
191 191 "python-saml" = super."python-saml".override (attrs: {
192 192 buildInputs = [
193 193 pkgs.libxml2
194 194 pkgs.libxslt
195 195 ];
196 196 });
197 197
198 198 "dm.xmlsec.binding" = super."dm.xmlsec.binding".override (attrs: {
199 199 buildInputs = [
200 200 pkgs.libxml2
201 201 pkgs.libxslt
202 202 pkgs.xmlsec
203 203 pkgs.libtool
204 204 ];
205 205 });
206 206
207 207 "pyzmq" = super."pyzmq".override (attrs: {
208 208 buildInputs = [
209 209 pkgs.czmq
210 210 ];
211 211 });
212 212
213 213 "urlobject" = super."urlobject".override (attrs: {
214 214 meta = {
215 215 license = {
216 216 spdxId = "Unlicense";
217 217 fullName = "The Unlicense";
218 218 url = http://unlicense.org/;
219 219 };
220 220 };
221 221 });
222 222
223 223 "docutils" = super."docutils".override (attrs: {
224 224 meta = {
225 225 license = pkgs.lib.licenses.bsd2;
226 226 };
227 227 });
228 228
229 229 "colander" = super."colander".override (attrs: {
230 230 meta = {
231 231 license = localLicenses.repoze;
232 232 };
233 233 });
234 234
235 235 "pyramid-beaker" = super."pyramid-beaker".override (attrs: {
236 236 meta = {
237 237 license = localLicenses.repoze;
238 238 };
239 239 });
240 240
241 241 "pyramid-mako" = super."pyramid-mako".override (attrs: {
242 242 meta = {
243 243 license = localLicenses.repoze;
244 244 };
245 245 });
246 246
247 247 "repoze.lru" = super."repoze.lru".override (attrs: {
248 248 meta = {
249 249 license = localLicenses.repoze;
250 250 };
251 251 });
252 252
253 253 "python-editor" = super."python-editor".override (attrs: {
254 254 meta = {
255 255 license = pkgs.lib.licenses.asl20;
256 256 };
257 257 });
258 258
259 259 "translationstring" = super."translationstring".override (attrs: {
260 260 meta = {
261 261 license = localLicenses.repoze;
262 262 };
263 263 });
264 264
265 265 "venusian" = super."venusian".override (attrs: {
266 266 meta = {
267 267 license = localLicenses.repoze;
268 268 };
269 269 });
270 270
271 271 "supervisor" = super."supervisor".override (attrs: {
272 272 patches = [
273 273 ./patches/supervisor/patch-rlimits-old-kernel.diff
274 274 ];
275 275 });
276 276
277 277 "pytest" = super."pytest".override (attrs: {
278 278 patches = [
279 279 ./patches/pytest/setuptools.patch
280 280 ];
281 281 });
282 282
283 "pytest-runner" = super."pytest-runner".override (attrs: {
284 propagatedBuildInputs = [
285 self."setuptools-scm"
286 ];
287 });
288
289 "py" = super."py".override (attrs: {
290 propagatedBuildInputs = [
291 self."setuptools-scm"
292 ];
293 });
294
295 "python-dateutil" = super."python-dateutil".override (attrs: {
296 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
297 self."setuptools-scm"
298 ];
299 });
300
301 "configparser" = super."configparser".override (attrs: {
302 patches = [
303 ./patches/configparser/pyproject.patch
304 ];
305 propagatedBuildInputs = [
306 self."setuptools-scm"
307 ];
308 });
309
310 "importlib-metadata" = super."importlib-metadata".override (attrs: {
311
312 patches = [
313 ./patches/importlib_metadata/pyproject.patch
314 ];
315
316 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
317 self."setuptools-scm"
318 ];
319
320 });
321
322 "zipp" = super."zipp".override (attrs: {
323 patches = [
324 ./patches/zipp/pyproject.patch
325 ];
326 propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
327 self."setuptools-scm"
328 ];
329 });
330
331 "pyramid-apispec" = super."pyramid-apispec".override (attrs: {
332 patches = [
333 ./patches/pyramid_apispec/setuptools.patch
334 ];
335 });
336
337 "channelstream" = super."channelstream".override (attrs: {
338 patches = [
339 ./patches/channelstream/setuptools.patch
340 ];
341 });
342
343 "rhodecode-tools" = super."rhodecode-tools".override (attrs: {
344 patches = [
345 ./patches/rhodecode_tools/setuptools.patch
346 ];
347 });
348
283 349 # Avoid that base packages screw up the build process
284 350 inherit (basePythonPackages)
285 351 setuptools;
286 352
287 353 }
@@ -1,2509 +1,2520 b''
1 1 # Generated by pip2nix 0.8.0.dev1
2 2 # See https://github.com/johbo/pip2nix
3 3
4 4 { pkgs, fetchurl, fetchgit, fetchhg }:
5 5
6 6 self: super: {
7 7 "alembic" = super.buildPythonPackage {
8 8 name = "alembic-1.4.2";
9 9 doCheck = false;
10 10 propagatedBuildInputs = [
11 11 self."sqlalchemy"
12 12 self."mako"
13 13 self."python-editor"
14 14 self."python-dateutil"
15 15 ];
16 16 src = fetchurl {
17 17 url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
18 18 sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
19 19 };
20 20 meta = {
21 21 license = [ pkgs.lib.licenses.mit ];
22 22 };
23 23 };
24 24 "amqp" = super.buildPythonPackage {
25 25 name = "amqp-2.5.2";
26 26 doCheck = false;
27 27 propagatedBuildInputs = [
28 28 self."vine"
29 29 ];
30 30 src = fetchurl {
31 31 url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
32 32 sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
33 33 };
34 34 meta = {
35 35 license = [ pkgs.lib.licenses.bsdOriginal ];
36 36 };
37 37 };
38 38 "apispec" = super.buildPythonPackage {
39 39 name = "apispec-1.0.0";
40 40 doCheck = false;
41 41 propagatedBuildInputs = [
42 42 self."PyYAML"
43 43 ];
44 44 src = fetchurl {
45 45 url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
46 46 sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
47 47 };
48 48 meta = {
49 49 license = [ pkgs.lib.licenses.mit ];
50 50 };
51 51 };
52 52 "appenlight-client" = super.buildPythonPackage {
53 53 name = "appenlight-client-0.6.26";
54 54 doCheck = false;
55 55 propagatedBuildInputs = [
56 56 self."webob"
57 57 self."requests"
58 58 self."six"
59 59 ];
60 60 src = fetchurl {
61 61 url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
62 62 sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
63 63 };
64 64 meta = {
65 65 license = [ pkgs.lib.licenses.bsdOriginal ];
66 66 };
67 67 };
68 68 "asn1crypto" = super.buildPythonPackage {
69 69 name = "asn1crypto-0.24.0";
70 70 doCheck = false;
71 71 src = fetchurl {
72 72 url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
73 73 sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
74 74 };
75 75 meta = {
76 76 license = [ pkgs.lib.licenses.mit ];
77 77 };
78 78 };
79 79 "atomicwrites" = super.buildPythonPackage {
80 80 name = "atomicwrites-1.3.0";
81 81 doCheck = false;
82 82 src = fetchurl {
83 83 url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
84 84 sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
85 85 };
86 86 meta = {
87 87 license = [ pkgs.lib.licenses.mit ];
88 88 };
89 89 };
90 90 "attrs" = super.buildPythonPackage {
91 91 name = "attrs-19.3.0";
92 92 doCheck = false;
93 93 src = fetchurl {
94 94 url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
95 95 sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
96 96 };
97 97 meta = {
98 98 license = [ pkgs.lib.licenses.mit ];
99 99 };
100 100 };
101 101 "babel" = super.buildPythonPackage {
102 102 name = "babel-1.3";
103 103 doCheck = false;
104 104 propagatedBuildInputs = [
105 105 self."pytz"
106 106 ];
107 107 src = fetchurl {
108 108 url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
109 109 sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
110 110 };
111 111 meta = {
112 112 license = [ pkgs.lib.licenses.bsdOriginal ];
113 113 };
114 114 };
115 115 "backports.shutil-get-terminal-size" = super.buildPythonPackage {
116 116 name = "backports.shutil-get-terminal-size-1.0.0";
117 117 doCheck = false;
118 118 src = fetchurl {
119 119 url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
120 120 sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
121 121 };
122 122 meta = {
123 123 license = [ pkgs.lib.licenses.mit ];
124 124 };
125 125 };
126 126 "beaker" = super.buildPythonPackage {
127 127 name = "beaker-1.9.1";
128 128 doCheck = false;
129 129 propagatedBuildInputs = [
130 130 self."funcsigs"
131 131 ];
132 132 src = fetchurl {
133 133 url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
134 134 sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
135 135 };
136 136 meta = {
137 137 license = [ pkgs.lib.licenses.bsdOriginal ];
138 138 };
139 139 };
140 140 "beautifulsoup4" = super.buildPythonPackage {
141 141 name = "beautifulsoup4-4.6.3";
142 142 doCheck = false;
143 143 src = fetchurl {
144 144 url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
145 145 sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
146 146 };
147 147 meta = {
148 148 license = [ pkgs.lib.licenses.mit ];
149 149 };
150 150 };
151 151 "billiard" = super.buildPythonPackage {
152 152 name = "billiard-3.6.1.0";
153 153 doCheck = false;
154 154 src = fetchurl {
155 155 url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
156 156 sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
157 157 };
158 158 meta = {
159 159 license = [ pkgs.lib.licenses.bsdOriginal ];
160 160 };
161 161 };
162 162 "bleach" = super.buildPythonPackage {
163 163 name = "bleach-3.1.3";
164 164 doCheck = false;
165 165 propagatedBuildInputs = [
166 166 self."six"
167 167 self."webencodings"
168 168 ];
169 169 src = fetchurl {
170 170 url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
171 171 sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
172 172 };
173 173 meta = {
174 174 license = [ pkgs.lib.licenses.asl20 ];
175 175 };
176 176 };
177 177 "bumpversion" = super.buildPythonPackage {
178 178 name = "bumpversion-0.5.3";
179 179 doCheck = false;
180 180 src = fetchurl {
181 181 url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
182 182 sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
183 183 };
184 184 meta = {
185 185 license = [ pkgs.lib.licenses.mit ];
186 186 };
187 187 };
188 188 "cachetools" = super.buildPythonPackage {
189 189 name = "cachetools-3.1.1";
190 190 doCheck = false;
191 191 src = fetchurl {
192 192 url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
193 193 sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
194 194 };
195 195 meta = {
196 196 license = [ pkgs.lib.licenses.mit ];
197 197 };
198 198 };
199 199 "celery" = super.buildPythonPackage {
200 200 name = "celery-4.3.0";
201 201 doCheck = false;
202 202 propagatedBuildInputs = [
203 203 self."pytz"
204 204 self."billiard"
205 205 self."kombu"
206 206 self."vine"
207 207 ];
208 208 src = fetchurl {
209 209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
210 210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
211 211 };
212 212 meta = {
213 213 license = [ pkgs.lib.licenses.bsdOriginal ];
214 214 };
215 215 };
216 216 "certifi" = super.buildPythonPackage {
217 217 name = "certifi-2020.4.5.1";
218 218 doCheck = false;
219 219 src = fetchurl {
220 220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
221 221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
222 222 };
223 223 meta = {
224 224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
225 225 };
226 226 };
227 227 "cffi" = super.buildPythonPackage {
228 228 name = "cffi-1.12.3";
229 229 doCheck = false;
230 230 propagatedBuildInputs = [
231 231 self."pycparser"
232 232 ];
233 233 src = fetchurl {
234 234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
235 235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
236 236 };
237 237 meta = {
238 238 license = [ pkgs.lib.licenses.mit ];
239 239 };
240 240 };
241 241 "chameleon" = super.buildPythonPackage {
242 242 name = "chameleon-2.24";
243 243 doCheck = false;
244 244 src = fetchurl {
245 245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
246 246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
247 247 };
248 248 meta = {
249 249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
250 250 };
251 251 };
252 252 "channelstream" = super.buildPythonPackage {
253 253 name = "channelstream-0.6.14";
254 254 doCheck = false;
255 255 propagatedBuildInputs = [
256 256 self."gevent"
257 257 self."ws4py"
258 258 self."marshmallow"
259 259 self."python-dateutil"
260 260 self."pyramid"
261 261 self."pyramid-jinja2"
262 262 self."pyramid-apispec"
263 263 self."itsdangerous"
264 264 self."requests"
265 265 self."six"
266 266 ];
267 267 src = fetchurl {
268 268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
269 269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
270 270 };
271 271 meta = {
272 272 license = [ pkgs.lib.licenses.bsdOriginal ];
273 273 };
274 274 };
275 275 "chardet" = super.buildPythonPackage {
276 276 name = "chardet-3.0.4";
277 277 doCheck = false;
278 278 src = fetchurl {
279 279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
280 280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
281 281 };
282 282 meta = {
283 283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
284 284 };
285 285 };
286 286 "click" = super.buildPythonPackage {
287 287 name = "click-7.0";
288 288 doCheck = false;
289 289 src = fetchurl {
290 290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
291 291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
292 292 };
293 293 meta = {
294 294 license = [ pkgs.lib.licenses.bsdOriginal ];
295 295 };
296 296 };
297 297 "colander" = super.buildPythonPackage {
298 298 name = "colander-1.7.0";
299 299 doCheck = false;
300 300 propagatedBuildInputs = [
301 301 self."translationstring"
302 302 self."iso8601"
303 303 self."enum34"
304 304 ];
305 305 src = fetchurl {
306 306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
307 307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
308 308 };
309 309 meta = {
310 310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 311 };
312 312 };
313 313 "configobj" = super.buildPythonPackage {
314 314 name = "configobj-5.0.6";
315 315 doCheck = false;
316 316 propagatedBuildInputs = [
317 317 self."six"
318 318 ];
319 319 src = fetchurl {
320 320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 322 };
323 323 meta = {
324 324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 325 };
326 326 };
327 327 "configparser" = super.buildPythonPackage {
328 328 name = "configparser-4.0.2";
329 329 doCheck = false;
330 330 src = fetchurl {
331 331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 333 };
334 334 meta = {
335 335 license = [ pkgs.lib.licenses.mit ];
336 336 };
337 337 };
338 338 "contextlib2" = super.buildPythonPackage {
339 339 name = "contextlib2-0.6.0.post1";
340 340 doCheck = false;
341 341 src = fetchurl {
342 342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 344 };
345 345 meta = {
346 346 license = [ pkgs.lib.licenses.psfl ];
347 347 };
348 348 };
349 349 "cov-core" = super.buildPythonPackage {
350 350 name = "cov-core-1.15.0";
351 351 doCheck = false;
352 352 propagatedBuildInputs = [
353 353 self."coverage"
354 354 ];
355 355 src = fetchurl {
356 356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 358 };
359 359 meta = {
360 360 license = [ pkgs.lib.licenses.mit ];
361 361 };
362 362 };
363 363 "coverage" = super.buildPythonPackage {
364 364 name = "coverage-4.5.4";
365 365 doCheck = false;
366 366 src = fetchurl {
367 367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 369 };
370 370 meta = {
371 371 license = [ pkgs.lib.licenses.asl20 ];
372 372 };
373 373 };
374 374 "cryptography" = super.buildPythonPackage {
375 375 name = "cryptography-2.6.1";
376 376 doCheck = false;
377 377 propagatedBuildInputs = [
378 378 self."asn1crypto"
379 379 self."six"
380 380 self."cffi"
381 381 self."enum34"
382 382 self."ipaddress"
383 383 ];
384 384 src = fetchurl {
385 385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 387 };
388 388 meta = {
389 389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 390 };
391 391 };
392 392 "cssselect" = super.buildPythonPackage {
393 393 name = "cssselect-1.0.3";
394 394 doCheck = false;
395 395 src = fetchurl {
396 396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 398 };
399 399 meta = {
400 400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 401 };
402 402 };
403 403 "cssutils" = super.buildPythonPackage {
404 404 name = "cssutils-1.0.2";
405 405 doCheck = false;
406 406 src = fetchurl {
407 407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 409 };
410 410 meta = {
411 411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 412 };
413 413 };
414 414 "decorator" = super.buildPythonPackage {
415 415 name = "decorator-4.1.2";
416 416 doCheck = false;
417 417 src = fetchurl {
418 418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 420 };
421 421 meta = {
422 422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 423 };
424 424 };
425 425 "deform" = super.buildPythonPackage {
426 426 name = "deform-2.0.8";
427 427 doCheck = false;
428 428 propagatedBuildInputs = [
429 429 self."chameleon"
430 430 self."colander"
431 431 self."iso8601"
432 432 self."peppercorn"
433 433 self."translationstring"
434 434 self."zope.deprecation"
435 435 ];
436 436 src = fetchurl {
437 437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 439 };
440 440 meta = {
441 441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 442 };
443 443 };
444 444 "defusedxml" = super.buildPythonPackage {
445 445 name = "defusedxml-0.6.0";
446 446 doCheck = false;
447 447 src = fetchurl {
448 448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 450 };
451 451 meta = {
452 452 license = [ pkgs.lib.licenses.psfl ];
453 453 };
454 454 };
455 455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 456 name = "dm.xmlsec.binding-1.3.7";
457 457 doCheck = false;
458 458 propagatedBuildInputs = [
459 459 self."setuptools"
460 460 self."lxml"
461 461 ];
462 462 src = fetchurl {
463 463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 465 };
466 466 meta = {
467 467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 468 };
469 469 };
470 470 "docutils" = super.buildPythonPackage {
471 471 name = "docutils-0.16";
472 472 doCheck = false;
473 473 src = fetchurl {
474 474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 476 };
477 477 meta = {
478 478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 479 };
480 480 };
481 481 "dogpile.cache" = super.buildPythonPackage {
482 482 name = "dogpile.cache-0.9.0";
483 483 doCheck = false;
484 484 propagatedBuildInputs = [
485 485 self."decorator"
486 486 ];
487 487 src = fetchurl {
488 488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 490 };
491 491 meta = {
492 492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 493 };
494 494 };
495 495 "dogpile.core" = super.buildPythonPackage {
496 496 name = "dogpile.core-0.4.1";
497 497 doCheck = false;
498 498 src = fetchurl {
499 499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 501 };
502 502 meta = {
503 503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 504 };
505 505 };
506 506 "ecdsa" = super.buildPythonPackage {
507 507 name = "ecdsa-0.13.2";
508 508 doCheck = false;
509 509 src = fetchurl {
510 510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 512 };
513 513 meta = {
514 514 license = [ pkgs.lib.licenses.mit ];
515 515 };
516 516 };
517 517 "elasticsearch" = super.buildPythonPackage {
518 518 name = "elasticsearch-6.3.1";
519 519 doCheck = false;
520 520 propagatedBuildInputs = [
521 521 self."urllib3"
522 522 ];
523 523 src = fetchurl {
524 524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 526 };
527 527 meta = {
528 528 license = [ pkgs.lib.licenses.asl20 ];
529 529 };
530 530 };
531 531 "elasticsearch-dsl" = super.buildPythonPackage {
532 532 name = "elasticsearch-dsl-6.3.1";
533 533 doCheck = false;
534 534 propagatedBuildInputs = [
535 535 self."six"
536 536 self."python-dateutil"
537 537 self."elasticsearch"
538 538 self."ipaddress"
539 539 ];
540 540 src = fetchurl {
541 541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
542 542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
543 543 };
544 544 meta = {
545 545 license = [ pkgs.lib.licenses.asl20 ];
546 546 };
547 547 };
548 548 "elasticsearch1" = super.buildPythonPackage {
549 549 name = "elasticsearch1-1.10.0";
550 550 doCheck = false;
551 551 propagatedBuildInputs = [
552 552 self."urllib3"
553 553 ];
554 554 src = fetchurl {
555 555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
556 556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
557 557 };
558 558 meta = {
559 559 license = [ pkgs.lib.licenses.asl20 ];
560 560 };
561 561 };
562 562 "elasticsearch1-dsl" = super.buildPythonPackage {
563 563 name = "elasticsearch1-dsl-0.0.12";
564 564 doCheck = false;
565 565 propagatedBuildInputs = [
566 566 self."six"
567 567 self."python-dateutil"
568 568 self."elasticsearch1"
569 569 ];
570 570 src = fetchurl {
571 571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
572 572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
573 573 };
574 574 meta = {
575 575 license = [ pkgs.lib.licenses.asl20 ];
576 576 };
577 577 };
578 578 "elasticsearch2" = super.buildPythonPackage {
579 579 name = "elasticsearch2-2.5.1";
580 580 doCheck = false;
581 581 propagatedBuildInputs = [
582 582 self."urllib3"
583 583 ];
584 584 src = fetchurl {
585 585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
586 586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
587 587 };
588 588 meta = {
589 589 license = [ pkgs.lib.licenses.asl20 ];
590 590 };
591 591 };
592 592 "entrypoints" = super.buildPythonPackage {
593 593 name = "entrypoints-0.2.2";
594 594 doCheck = false;
595 595 propagatedBuildInputs = [
596 596 self."configparser"
597 597 ];
598 598 src = fetchurl {
599 599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
600 600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
601 601 };
602 602 meta = {
603 603 license = [ pkgs.lib.licenses.mit ];
604 604 };
605 605 };
606 606 "enum34" = super.buildPythonPackage {
607 607 name = "enum34-1.1.10";
608 608 doCheck = false;
609 609 src = fetchurl {
610 610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
611 611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
612 612 };
613 613 meta = {
614 614 license = [ pkgs.lib.licenses.bsdOriginal ];
615 615 };
616 616 };
617 617 "formencode" = super.buildPythonPackage {
618 618 name = "formencode-1.2.4";
619 619 doCheck = false;
620 620 src = fetchurl {
621 621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
622 622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
623 623 };
624 624 meta = {
625 625 license = [ pkgs.lib.licenses.psfl ];
626 626 };
627 627 };
628 628 "funcsigs" = super.buildPythonPackage {
629 629 name = "funcsigs-1.0.2";
630 630 doCheck = false;
631 631 src = fetchurl {
632 632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
633 633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
634 634 };
635 635 meta = {
636 636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
637 637 };
638 638 };
639 639 "functools32" = super.buildPythonPackage {
640 640 name = "functools32-3.2.3.post2";
641 641 doCheck = false;
642 642 src = fetchurl {
643 643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
644 644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
645 645 };
646 646 meta = {
647 647 license = [ pkgs.lib.licenses.psfl ];
648 648 };
649 649 };
650 650 "future" = super.buildPythonPackage {
651 651 name = "future-0.14.3";
652 652 doCheck = false;
653 653 src = fetchurl {
654 654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
655 655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
656 656 };
657 657 meta = {
658 658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
659 659 };
660 660 };
661 661 "futures" = super.buildPythonPackage {
662 662 name = "futures-3.0.2";
663 663 doCheck = false;
664 664 src = fetchurl {
665 665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
666 666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
667 667 };
668 668 meta = {
669 669 license = [ pkgs.lib.licenses.bsdOriginal ];
670 670 };
671 671 };
672 672 "gevent" = super.buildPythonPackage {
673 673 name = "gevent-1.5.0";
674 674 doCheck = false;
675 675 propagatedBuildInputs = [
676 676 self."greenlet"
677 677 ];
678 678 src = fetchurl {
679 679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
680 680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
681 681 };
682 682 meta = {
683 683 license = [ pkgs.lib.licenses.mit ];
684 684 };
685 685 };
686 686 "gnureadline" = super.buildPythonPackage {
687 687 name = "gnureadline-6.3.8";
688 688 doCheck = false;
689 689 src = fetchurl {
690 690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
691 691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
692 692 };
693 693 meta = {
694 694 license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
695 695 };
696 696 };
697 697 "gprof2dot" = super.buildPythonPackage {
698 698 name = "gprof2dot-2017.9.19";
699 699 doCheck = false;
700 700 src = fetchurl {
701 701 url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
702 702 sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
703 703 };
704 704 meta = {
705 705 license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
706 706 };
707 707 };
708 708 "greenlet" = super.buildPythonPackage {
709 709 name = "greenlet-0.4.15";
710 710 doCheck = false;
711 711 src = fetchurl {
712 712 url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
713 713 sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
714 714 };
715 715 meta = {
716 716 license = [ pkgs.lib.licenses.mit ];
717 717 };
718 718 };
719 719 "gunicorn" = super.buildPythonPackage {
720 720 name = "gunicorn-19.9.0";
721 721 doCheck = false;
722 722 src = fetchurl {
723 723 url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
724 724 sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
725 725 };
726 726 meta = {
727 727 license = [ pkgs.lib.licenses.mit ];
728 728 };
729 729 };
730 730 "hupper" = super.buildPythonPackage {
731 731 name = "hupper-1.10.2";
732 732 doCheck = false;
733 733 src = fetchurl {
734 734 url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
735 735 sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
736 736 };
737 737 meta = {
738 738 license = [ pkgs.lib.licenses.mit ];
739 739 };
740 740 };
741 741 "idna" = super.buildPythonPackage {
742 742 name = "idna-2.8";
743 743 doCheck = false;
744 744 src = fetchurl {
745 745 url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
746 746 sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
747 747 };
748 748 meta = {
749 749 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
750 750 };
751 751 };
752 752 "importlib-metadata" = super.buildPythonPackage {
753 753 name = "importlib-metadata-1.6.0";
754 754 doCheck = false;
755 755 propagatedBuildInputs = [
756 756 self."zipp"
757 757 self."pathlib2"
758 758 self."contextlib2"
759 759 self."configparser"
760 760 ];
761 761 src = fetchurl {
762 762 url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
763 763 sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
764 764 };
765 765 meta = {
766 766 license = [ pkgs.lib.licenses.asl20 ];
767 767 };
768 768 };
769 769 "infrae.cache" = super.buildPythonPackage {
770 770 name = "infrae.cache-1.0.1";
771 771 doCheck = false;
772 772 propagatedBuildInputs = [
773 773 self."beaker"
774 774 self."repoze.lru"
775 775 ];
776 776 src = fetchurl {
777 777 url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
778 778 sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
779 779 };
780 780 meta = {
781 781 license = [ pkgs.lib.licenses.zpl21 ];
782 782 };
783 783 };
784 784 "invoke" = super.buildPythonPackage {
785 785 name = "invoke-0.13.0";
786 786 doCheck = false;
787 787 src = fetchurl {
788 788 url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
789 789 sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
790 790 };
791 791 meta = {
792 792 license = [ pkgs.lib.licenses.bsdOriginal ];
793 793 };
794 794 };
795 795 "ipaddress" = super.buildPythonPackage {
796 796 name = "ipaddress-1.0.23";
797 797 doCheck = false;
798 798 src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
      sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "ipdb" = super.buildPythonPackage {
    name = "ipdb-0.13.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."ipython"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
      sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython" = super.buildPythonPackage {
    name = "ipython-5.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."decorator"
      self."pickleshare"
      self."simplegeneric"
      self."traitlets"
      self."prompt-toolkit"
      self."pygments"
      self."pexpect"
      self."backports.shutil-get-terminal-size"
      self."pathlib2"
      self."pexpect"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
      sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython-genutils" = super.buildPythonPackage {
    name = "ipython-genutils-0.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
      sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "iso8601" = super.buildPythonPackage {
    name = "iso8601-0.1.12";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
      sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "isodate" = super.buildPythonPackage {
    name = "isodate-0.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
      sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "itsdangerous" = super.buildPythonPackage {
    name = "itsdangerous-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
      sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jinja2" = super.buildPythonPackage {
    name = "jinja2-2.9.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
      sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jsonschema" = super.buildPythonPackage {
    name = "jsonschema-2.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."functools32"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
      sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "jupyter-client" = super.buildPythonPackage {
    name = "jupyter-client-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
      self."jupyter-core"
      self."pyzmq"
      self."python-dateutil"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
      sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jupyter-core" = super.buildPythonPackage {
    name = "jupyter-core-4.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
      sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "kombu" = super.buildPythonPackage {
    name = "kombu-4.6.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."amqp"
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
      sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "lxml" = super.buildPythonPackage {
    name = "lxml-4.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
      sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mako" = super.buildPythonPackage {
    name = "mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
      sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "markdown" = super.buildPythonPackage {
    name = "markdown-2.6.11";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
      sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "markupsafe" = super.buildPythonPackage {
    name = "markupsafe-1.1.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
      sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
    };
  };
  "marshmallow" = super.buildPythonPackage {
    name = "marshmallow-2.18.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
      sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "mistune" = super.buildPythonPackage {
    name = "mistune-0.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
      sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mock" = super.buildPythonPackage {
    name = "mock-3.0.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."funcsigs"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
      sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
    };
  };
  "more-itertools" = super.buildPythonPackage {
    name = "more-itertools-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
      sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "msgpack-python" = super.buildPythonPackage {
    name = "msgpack-python-0.5.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
      sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "mysql-python" = super.buildPythonPackage {
    name = "mysql-python-1.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
      sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  "nbconvert" = super.buildPythonPackage {
    name = "nbconvert-5.3.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."mistune"
      self."jinja2"
      self."pygments"
      self."traitlets"
      self."jupyter-core"
      self."nbformat"
      self."entrypoints"
      self."bleach"
      self."pandocfilters"
      self."testpath"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
      sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "nbformat" = super.buildPythonPackage {
    name = "nbformat-4.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."ipython-genutils"
      self."traitlets"
      self."jsonschema"
      self."jupyter-core"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
      sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "packaging" = super.buildPythonPackage {
    name = "packaging-20.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyparsing"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
      sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
    };
  };
  "pandocfilters" = super.buildPythonPackage {
    name = "pandocfilters-1.4.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
      sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "paste" = super.buildPythonPackage {
    name = "paste-3.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
      sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastedeploy" = super.buildPythonPackage {
    name = "pastedeploy-2.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
      sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastescript" = super.buildPythonPackage {
    name = "pastescript-3.2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."paste"
      self."pastedeploy"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
      sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pathlib2" = super.buildPythonPackage {
    name = "pathlib2-2.3.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."scandir"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
      sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "peppercorn" = super.buildPythonPackage {
    name = "peppercorn-0.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
      sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pexpect" = super.buildPythonPackage {
    name = "pexpect-4.8.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."ptyprocess"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
      sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
    };
    meta = {
      license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
    };
  };
  "pickleshare" = super.buildPythonPackage {
    name = "pickleshare-0.7.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."pathlib2"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
      sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "plaster" = super.buildPythonPackage {
    name = "plaster-1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
      sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "plaster-pastedeploy" = super.buildPythonPackage {
    name = "plaster-pastedeploy-0.7";
    doCheck = false;
    propagatedBuildInputs = [
      self."pastedeploy"
      self."plaster"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
      sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pluggy" = super.buildPythonPackage {
    name = "pluggy-0.13.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
      sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "premailer" = super.buildPythonPackage {
    name = "premailer-3.6.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."lxml"
      self."cssselect"
      self."cssutils"
      self."requests"
      self."cachetools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
      sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
    };
  };
  "prompt-toolkit" = super.buildPythonPackage {
    name = "prompt-toolkit-1.0.18";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."wcwidth"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
      sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "psutil" = super.buildPythonPackage {
    name = "psutil-5.7.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
      sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "psycopg2" = super.buildPythonPackage {
    name = "psycopg2-2.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
      sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
    };
  };
  "ptyprocess" = super.buildPythonPackage {
    name = "ptyprocess-0.6.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
      sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
    };
    meta = {
      license = [ ];
    };
  };
  "py" = super.buildPythonPackage {
    name = "py-1.8.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
      sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "py-bcrypt" = super.buildPythonPackage {
    name = "py-bcrypt-0.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
      sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "py-gfm" = super.buildPythonPackage {
    name = "py-gfm-0.1.4";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."markdown"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
      sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyasn1" = super.buildPythonPackage {
    name = "pyasn1-0.4.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
      sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyasn1-modules" = super.buildPythonPackage {
    name = "pyasn1-modules-0.2.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyasn1"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
      sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
    };
  };
  "pycparser" = super.buildPythonPackage {
    name = "pycparser-2.20";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
      sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pycrypto" = super.buildPythonPackage {
    name = "pycrypto-2.6.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
      sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
    };
    meta = {
      license = [ pkgs.lib.licenses.publicDomain ];
    };
  };
  "pycurl" = super.buildPythonPackage {
    name = "pycurl-7.43.0.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
      sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
    };
  };
  "pygments" = super.buildPythonPackage {
    name = "pygments-2.4.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
      sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pymysql" = super.buildPythonPackage {
    name = "pymysql-0.8.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
      sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyotp" = super.buildPythonPackage {
    name = "pyotp-2.3.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
      sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyparsing" = super.buildPythonPackage {
    name = "pyparsing-2.4.7";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
      sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyramid" = super.buildPythonPackage {
    name = "pyramid-1.10.4";
    doCheck = false;
    propagatedBuildInputs = [
      self."hupper"
      self."plaster"
      self."plaster-pastedeploy"
      self."setuptools"
      self."translationstring"
      self."venusian"
      self."webob"
      self."zope.deprecation"
      self."zope.interface"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
      sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pyramid-debugtoolbar" = super.buildPythonPackage {
    name = "pyramid-debugtoolbar-4.6.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."pyramid-mako"
      self."repoze.lru"
      self."pygments"
      self."ipaddress"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
      sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-jinja2" = super.buildPythonPackage {
    name = "pyramid-jinja2-2.7";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."zope.deprecation"
      self."jinja2"
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
      sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pyramid-apispec" = super.buildPythonPackage {
    name = "pyramid-apispec-0.3.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."apispec"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
      sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-mailer" = super.buildPythonPackage {
    name = "pyramid-mailer-0.15.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."repoze.sendmail"
      self."transaction"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
      sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-mako" = super.buildPythonPackage {
    name = "pyramid-mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."mako"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
      sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pysqlite" = super.buildPythonPackage {
    name = "pysqlite-2.8.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
      sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
    };
    meta = {
      license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
    };
  };
  "pytest" = super.buildPythonPackage {
    name = "pytest-4.6.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."py"
      self."six"
      self."packaging"
      self."attrs"
      self."atomicwrites"
      self."pluggy"
      self."importlib-metadata"
      self."wcwidth"
      self."funcsigs"
      self."pathlib2"
      self."more-itertools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
      sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-cov" = super.buildPythonPackage {
    name = "pytest-cov-2.7.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
      self."coverage"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
      sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
    };
  };
  "pytest-profiling" = super.buildPythonPackage {
    name = "pytest-profiling-1.7.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."pytest"
      self."gprof2dot"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
      sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-runner" = super.buildPythonPackage {
    name = "pytest-runner-5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
      sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-sugar" = super.buildPythonPackage {
    name = "pytest-sugar-0.9.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
      self."termcolor"
      self."packaging"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
      sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pytest-timeout" = super.buildPythonPackage {
    name = "pytest-timeout-1.3.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1699 1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1700 1700 };
1701 1701 meta = {
1702 1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1703 1703 };
1704 1704 };
1705 1705 "python-dateutil" = super.buildPythonPackage {
1706 1706 name = "python-dateutil-2.8.1";
1707 1707 doCheck = false;
1708 1708 propagatedBuildInputs = [
1709 1709 self."six"
1710 1710 ];
1711 1711 src = fetchurl {
1712 1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1713 1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1714 1714 };
1715 1715 meta = {
1716 1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1717 1717 };
1718 1718 };
1719 1719 "python-editor" = super.buildPythonPackage {
1720 1720 name = "python-editor-1.0.4";
1721 1721 doCheck = false;
1722 1722 src = fetchurl {
1723 1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1724 1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1725 1725 };
1726 1726 meta = {
1727 1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1728 1728 };
1729 1729 };
1730 1730 "python-ldap" = super.buildPythonPackage {
1731 1731 name = "python-ldap-3.2.0";
1732 1732 doCheck = false;
1733 1733 propagatedBuildInputs = [
1734 1734 self."pyasn1"
1735 1735 self."pyasn1-modules"
1736 1736 ];
1737 1737 src = fetchurl {
1738 1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1739 1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1740 1740 };
1741 1741 meta = {
1742 1742 license = [ pkgs.lib.licenses.psfl ];
1743 1743 };
1744 1744 };
1745 1745 "python-memcached" = super.buildPythonPackage {
1746 1746 name = "python-memcached-1.59";
1747 1747 doCheck = false;
1748 1748 propagatedBuildInputs = [
1749 1749 self."six"
1750 1750 ];
1751 1751 src = fetchurl {
1752 1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1753 1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1754 1754 };
1755 1755 meta = {
1756 1756 license = [ pkgs.lib.licenses.psfl ];
1757 1757 };
1758 1758 };
1759 1759 "python-pam" = super.buildPythonPackage {
1760 1760 name = "python-pam-1.8.4";
1761 1761 doCheck = false;
1762 1762 src = fetchurl {
1763 1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1764 1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1765 1765 };
1766 1766 meta = {
1767 1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1768 1768 };
1769 1769 };
1770 1770 "python-saml" = super.buildPythonPackage {
1771 1771 name = "python-saml-2.4.2";
1772 1772 doCheck = false;
1773 1773 propagatedBuildInputs = [
1774 1774 self."dm.xmlsec.binding"
1775 1775 self."isodate"
1776 1776 self."defusedxml"
1777 1777 ];
1778 1778 src = fetchurl {
1779 1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1780 1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1781 1781 };
1782 1782 meta = {
1783 1783 license = [ pkgs.lib.licenses.mit ];
1784 1784 };
1785 1785 };
1786 1786 "pytz" = super.buildPythonPackage {
1787 1787 name = "pytz-2019.3";
1788 1788 doCheck = false;
1789 1789 src = fetchurl {
1790 1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1791 1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1792 1792 };
1793 1793 meta = {
1794 1794 license = [ pkgs.lib.licenses.mit ];
1795 1795 };
1796 1796 };
1797 1797 "pyzmq" = super.buildPythonPackage {
1798 1798 name = "pyzmq-14.6.0";
1799 1799 doCheck = false;
1800 1800 src = fetchurl {
1801 1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1802 1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1803 1803 };
1804 1804 meta = {
1805 1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1806 1806 };
1807 1807 };
1808 1808 "PyYAML" = super.buildPythonPackage {
1809 1809 name = "PyYAML-5.3.1";
1810 1810 doCheck = false;
1811 1811 src = fetchurl {
1812 1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1813 1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1814 1814 };
1815 1815 meta = {
1816 1816 license = [ pkgs.lib.licenses.mit ];
1817 1817 };
1818 1818 };
1819 1819 "regex" = super.buildPythonPackage {
1820 1820 name = "regex-2020.9.27";
1821 1821 doCheck = false;
1822 1822 src = fetchurl {
1823 1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1824 1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1825 1825 };
1826 1826 meta = {
1827 1827 license = [ pkgs.lib.licenses.psfl ];
1828 1828 };
1829 1829 };
1830 1830 "redis" = super.buildPythonPackage {
1831 1831 name = "redis-3.5.3";
1832 1832 doCheck = false;
1833 1833 src = fetchurl {
1834 1834 url = "https://files.pythonhosted.org/packages/b3/17/1e567ff78c83854e16b98694411fe6e08c3426af866ad11397cddceb80d3/redis-3.5.3.tar.gz";
1835 1835 sha256 = "0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2";
1836 1836 };
1837 1837 meta = {
1838 1838 license = [ pkgs.lib.licenses.mit ];
1839 1839 };
1840 1840 };
1841 1841 "repoze.lru" = super.buildPythonPackage {
1842 1842 name = "repoze.lru-0.7";
1843 1843 doCheck = false;
1844 1844 src = fetchurl {
1845 1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1846 1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1847 1847 };
1848 1848 meta = {
1849 1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1850 1850 };
1851 1851 };
1852 1852 "repoze.sendmail" = super.buildPythonPackage {
1853 1853 name = "repoze.sendmail-4.4.1";
1854 1854 doCheck = false;
1855 1855 propagatedBuildInputs = [
1856 1856 self."setuptools"
1857 1857 self."zope.interface"
1858 1858 self."transaction"
1859 1859 ];
1860 1860 src = fetchurl {
1861 1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1862 1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1863 1863 };
1864 1864 meta = {
1865 1865 license = [ pkgs.lib.licenses.zpl21 ];
1866 1866 };
1867 1867 };
1868 1868 "requests" = super.buildPythonPackage {
1869 1869 name = "requests-2.22.0";
1870 1870 doCheck = false;
1871 1871 propagatedBuildInputs = [
1872 1872 self."chardet"
1873 1873 self."idna"
1874 1874 self."urllib3"
1875 1875 self."certifi"
1876 1876 ];
1877 1877 src = fetchurl {
1878 1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1879 1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1880 1880 };
1881 1881 meta = {
1882 1882 license = [ pkgs.lib.licenses.asl20 ];
1883 1883 };
1884 1884 };
1885 1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1886 name = "rhodecode-enterprise-ce-4.24.1";
1886 name = "rhodecode-enterprise-ce-4.25.0";
1887 1887 buildInputs = [
1888 1888 self."pytest"
1889 1889 self."py"
1890 1890 self."pytest-cov"
1891 1891 self."pytest-sugar"
1892 1892 self."pytest-runner"
1893 1893 self."pytest-profiling"
1894 1894 self."pytest-timeout"
1895 1895 self."gprof2dot"
1896 1896 self."mock"
1897 1897 self."cov-core"
1898 1898 self."coverage"
1899 1899 self."webtest"
1900 1900 self."beautifulsoup4"
1901 1901 self."configobj"
1902 1902 ];
1903 1903 doCheck = true;
1904 1904 propagatedBuildInputs = [
1905 1905 self."amqp"
1906 1906 self."babel"
1907 1907 self."beaker"
1908 1908 self."bleach"
1909 1909 self."celery"
1910 1910 self."channelstream"
1911 1911 self."click"
1912 1912 self."colander"
1913 1913 self."configobj"
1914 1914 self."cssselect"
1915 1915 self."cryptography"
1916 1916 self."decorator"
1917 1917 self."deform"
1918 1918 self."docutils"
1919 1919 self."dogpile.cache"
1920 1920 self."dogpile.core"
1921 1921 self."formencode"
1922 1922 self."future"
1923 1923 self."futures"
1924 1924 self."infrae.cache"
1925 1925 self."iso8601"
1926 1926 self."itsdangerous"
1927 1927 self."kombu"
1928 1928 self."lxml"
1929 1929 self."mako"
1930 1930 self."markdown"
1931 1931 self."markupsafe"
1932 1932 self."msgpack-python"
1933 1933 self."pyotp"
1934 1934 self."packaging"
1935 1935 self."pathlib2"
1936 1936 self."paste"
1937 1937 self."pastedeploy"
1938 1938 self."pastescript"
1939 1939 self."peppercorn"
1940 1940 self."premailer"
1941 1941 self."psutil"
1942 1942 self."py-bcrypt"
1943 1943 self."pycurl"
1944 1944 self."pycrypto"
1945 1945 self."pygments"
1946 1946 self."pyparsing"
1947 1947 self."pyramid-debugtoolbar"
1948 1948 self."pyramid-mako"
1949 1949 self."pyramid"
1950 1950 self."pyramid-mailer"
1951 1951 self."python-dateutil"
1952 1952 self."python-ldap"
1953 1953 self."python-memcached"
1954 1954 self."python-pam"
1955 1955 self."python-saml"
1956 1956 self."pytz"
1957 1957 self."tzlocal"
1958 1958 self."pyzmq"
1959 1959 self."py-gfm"
1960 1960 self."regex"
1961 1961 self."redis"
1962 1962 self."repoze.lru"
1963 1963 self."requests"
1964 1964 self."routes"
1965 1965 self."simplejson"
1966 1966 self."six"
1967 1967 self."sqlalchemy"
1968 1968 self."sshpubkeys"
1969 1969 self."subprocess32"
1970 1970 self."supervisor"
1971 1971 self."translationstring"
1972 1972 self."urllib3"
1973 1973 self."urlobject"
1974 1974 self."venusian"
1975 1975 self."weberror"
1976 1976 self."webhelpers2"
1977 1977 self."webob"
1978 1978 self."whoosh"
1979 1979 self."wsgiref"
1980 1980 self."zope.cachedescriptors"
1981 1981 self."zope.deprecation"
1982 1982 self."zope.event"
1983 1983 self."zope.interface"
1984 1984 self."mysql-python"
1985 1985 self."pymysql"
1986 1986 self."pysqlite"
1987 1987 self."psycopg2"
1988 1988 self."nbconvert"
1989 1989 self."nbformat"
1990 1990 self."jupyter-client"
1991 1991 self."jupyter-core"
1992 1992 self."alembic"
1993 1993 self."invoke"
1994 1994 self."bumpversion"
1995 1995 self."gevent"
1996 1996 self."greenlet"
1997 1997 self."gunicorn"
1998 1998 self."waitress"
1999 1999 self."ipdb"
2000 2000 self."ipython"
2001 2001 self."rhodecode-tools"
2002 2002 self."appenlight-client"
2003 2003 self."pytest"
2004 2004 self."py"
2005 2005 self."pytest-cov"
2006 2006 self."pytest-sugar"
2007 2007 self."pytest-runner"
2008 2008 self."pytest-profiling"
2009 2009 self."pytest-timeout"
2010 2010 self."gprof2dot"
2011 2011 self."mock"
2012 2012 self."cov-core"
2013 2013 self."coverage"
2014 2014 self."webtest"
2015 2015 self."beautifulsoup4"
2016 2016 ];
2017 2017 src = ./.;
2018 2018 meta = {
2019 2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2020 2020 };
2021 2021 };
2022 2022 "rhodecode-tools" = super.buildPythonPackage {
2023 2023 name = "rhodecode-tools-1.4.0";
2024 2024 doCheck = false;
2025 2025 propagatedBuildInputs = [
2026 2026 self."click"
2027 2027 self."future"
2028 2028 self."six"
2029 2029 self."mako"
2030 2030 self."markupsafe"
2031 2031 self."requests"
2032 2032 self."urllib3"
2033 2033 self."whoosh"
2034 2034 self."elasticsearch"
2035 2035 self."elasticsearch-dsl"
2036 2036 self."elasticsearch2"
2037 2037 self."elasticsearch1-dsl"
2038 2038 ];
2039 2039 src = fetchurl {
2040 2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2041 2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2042 2042 };
2043 2043 meta = {
2044 2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2045 2045 };
2046 2046 };
2047 2047 "routes" = super.buildPythonPackage {
2048 2048 name = "routes-2.4.1";
2049 2049 doCheck = false;
2050 2050 propagatedBuildInputs = [
2051 2051 self."six"
2052 2052 self."repoze.lru"
2053 2053 ];
2054 2054 src = fetchurl {
2055 2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2056 2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2057 2057 };
2058 2058 meta = {
2059 2059 license = [ pkgs.lib.licenses.mit ];
2060 2060 };
2061 2061 };
2062 2062 "scandir" = super.buildPythonPackage {
2063 2063 name = "scandir-1.10.0";
2064 2064 doCheck = false;
2065 2065 src = fetchurl {
2066 2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2067 2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2068 2068 };
2069 2069 meta = {
2070 2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2071 2071 };
2072 2072 };
2073 2073 "setproctitle" = super.buildPythonPackage {
2074 2074 name = "setproctitle-1.1.10";
2075 2075 doCheck = false;
2076 2076 src = fetchurl {
2077 2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2078 2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2079 2079 };
2080 2080 meta = {
2081 2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2082 2082 };
2083 2083 };
2084 2084 "setuptools" = super.buildPythonPackage {
2085 2085 name = "setuptools-44.1.0";
2086 2086 doCheck = false;
2087 2087 src = fetchurl {
2088 2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2089 2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2090 2090 };
2091 2091 meta = {
2092 2092 license = [ pkgs.lib.licenses.mit ];
2093 2093 };
2094 2094 };
2095 "setuptools-scm" = super.buildPythonPackage {
2096 name = "setuptools-scm-3.5.0";
2097 doCheck = false;
2098 src = fetchurl {
2099 url = "https://files.pythonhosted.org/packages/b2/f7/60a645aae001a2e06cf4b8db2fba9d9f36b8fd378f10647e3e218b61b74b/setuptools_scm-3.5.0.tar.gz";
2100 sha256 = "5bdf21a05792903cafe7ae0c9501182ab52497614fa6b1750d9dbae7b60c1a87";
2101 };
2102 meta = {
2103 license = [ pkgs.lib.licenses.psfl ];
2104 };
2105 };
2095 2106 "simplegeneric" = super.buildPythonPackage {
2096 2107 name = "simplegeneric-0.8.1";
2097 2108 doCheck = false;
2098 2109 src = fetchurl {
2099 2110 url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
2100 2111 sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
2101 2112 };
2102 2113 meta = {
2103 2114 license = [ pkgs.lib.licenses.zpl21 ];
2104 2115 };
2105 2116 };
2106 2117 "simplejson" = super.buildPythonPackage {
2107 2118 name = "simplejson-3.16.0";
2108 2119 doCheck = false;
2109 2120 src = fetchurl {
2110 2121 url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
2111 2122 sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
2112 2123 };
2113 2124 meta = {
2114 2125 license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
2115 2126 };
2116 2127 };
2117 2128 "six" = super.buildPythonPackage {
2118 2129 name = "six-1.11.0";
2119 2130 doCheck = false;
2120 2131 src = fetchurl {
2121 2132 url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
2122 2133 sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
2123 2134 };
2124 2135 meta = {
2125 2136 license = [ pkgs.lib.licenses.mit ];
2126 2137 };
2127 2138 };
2128 2139 "sqlalchemy" = super.buildPythonPackage {
2129 2140 name = "sqlalchemy-1.3.15";
2130 2141 doCheck = false;
2131 2142 src = fetchurl {
2132 2143 url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
2133 2144 sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
2134 2145 };
2135 2146 meta = {
2136 2147 license = [ pkgs.lib.licenses.mit ];
2137 2148 };
2138 2149 };
2139 2150 "sshpubkeys" = super.buildPythonPackage {
2140 2151 name = "sshpubkeys-3.1.0";
2141 2152 doCheck = false;
2142 2153 propagatedBuildInputs = [
2143 2154 self."cryptography"
2144 2155 self."ecdsa"
2145 2156 ];
2146 2157 src = fetchurl {
2147 2158 url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
2148 2159 sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
2149 2160 };
2150 2161 meta = {
2151 2162 license = [ pkgs.lib.licenses.bsdOriginal ];
2152 2163 };
2153 2164 };
2154 2165 "subprocess32" = super.buildPythonPackage {
2155 2166 name = "subprocess32-3.5.4";
2156 2167 doCheck = false;
2157 2168 src = fetchurl {
2158 2169 url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
2159 2170 sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
2160 2171 };
2161 2172 meta = {
2162 2173 license = [ pkgs.lib.licenses.psfl ];
2163 2174 };
2164 2175 };
2165 2176 "supervisor" = super.buildPythonPackage {
2166 2177 name = "supervisor-4.1.0";
2167 2178 doCheck = false;
2168 2179 src = fetchurl {
2169 2180 url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
2170 2181 sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
2171 2182 };
2172 2183 meta = {
2173 2184 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2174 2185 };
2175 2186 };
2176 2187 "tempita" = super.buildPythonPackage {
2177 2188 name = "tempita-0.5.2";
2178 2189 doCheck = false;
2179 2190 src = fetchurl {
2180 2191 url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
2181 2192 sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
2182 2193 };
2183 2194 meta = {
2184 2195 license = [ pkgs.lib.licenses.mit ];
2185 2196 };
2186 2197 };
2187 2198 "termcolor" = super.buildPythonPackage {
2188 2199 name = "termcolor-1.1.0";
2189 2200 doCheck = false;
2190 2201 src = fetchurl {
2191 2202 url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
2192 2203 sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
2193 2204 };
2194 2205 meta = {
2195 2206 license = [ pkgs.lib.licenses.mit ];
2196 2207 };
2197 2208 };
2198 2209 "testpath" = super.buildPythonPackage {
2199 2210 name = "testpath-0.4.4";
2200 2211 doCheck = false;
2201 2212 src = fetchurl {
2202 2213 url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
2203 2214 sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
2204 2215 };
2205 2216 meta = {
2206 2217 license = [ ];
2207 2218 };
2208 2219 };
2209 2220 "traitlets" = super.buildPythonPackage {
2210 2221 name = "traitlets-4.3.3";
2211 2222 doCheck = false;
2212 2223 propagatedBuildInputs = [
2213 2224 self."ipython-genutils"
2214 2225 self."six"
2215 2226 self."decorator"
2216 2227 self."enum34"
2217 2228 ];
2218 2229 src = fetchurl {
2219 2230 url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
2220 2231 sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
2221 2232 };
2222 2233 meta = {
2223 2234 license = [ pkgs.lib.licenses.bsdOriginal ];
2224 2235 };
2225 2236 };
2226 2237 "transaction" = super.buildPythonPackage {
2227 2238 name = "transaction-2.4.0";
2228 2239 doCheck = false;
2229 2240 propagatedBuildInputs = [
2230 2241 self."zope.interface"
2231 2242 ];
2232 2243 src = fetchurl {
2233 2244 url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
2234 2245 sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
2235 2246 };
2236 2247 meta = {
2237 2248 license = [ pkgs.lib.licenses.zpl21 ];
2238 2249 };
2239 2250 };
2240 2251 "translationstring" = super.buildPythonPackage {
2241 2252 name = "translationstring-1.3";
2242 2253 doCheck = false;
2243 2254 src = fetchurl {
2244 2255 url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
2245 2256 sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
2246 2257 };
2247 2258 meta = {
2248 2259 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
2249 2260 };
2250 2261 };
2251 2262 "tzlocal" = super.buildPythonPackage {
2252 2263 name = "tzlocal-1.5.1";
2253 2264 doCheck = false;
2254 2265 propagatedBuildInputs = [
2255 2266 self."pytz"
2256 2267 ];
2257 2268 src = fetchurl {
2258 2269 url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
2259 2270 sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
2260 2271 };
2261 2272 meta = {
2262 2273 license = [ pkgs.lib.licenses.mit ];
2263 2274 };
2264 2275 };
2265 2276 "urllib3" = super.buildPythonPackage {
2266 2277 name = "urllib3-1.25.2";
2267 2278 doCheck = false;
2268 2279 src = fetchurl {
2269 2280 url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
2270 2281 sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
2271 2282 };
2272 2283 meta = {
2273 2284 license = [ pkgs.lib.licenses.mit ];
2274 2285 };
2275 2286 };
2276 2287 "urlobject" = super.buildPythonPackage {
2277 2288 name = "urlobject-2.4.3";
2278 2289 doCheck = false;
2279 2290 src = fetchurl {
2280 2291 url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
2281 2292 sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
2282 2293 };
2283 2294 meta = {
2284 2295 license = [ pkgs.lib.licenses.publicDomain ];
2285 2296 };
2286 2297 };
2287 2298 "venusian" = super.buildPythonPackage {
2288 2299 name = "venusian-1.2.0";
2289 2300 doCheck = false;
2290 2301 src = fetchurl {
2291 2302 url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
2292 2303 sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
2293 2304 };
2294 2305 meta = {
2295 2306 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2296 2307 };
2297 2308 };
2298 2309 "vine" = super.buildPythonPackage {
2299 2310 name = "vine-1.3.0";
2300 2311 doCheck = false;
2301 2312 src = fetchurl {
2302 2313 url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
2303 2314 sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
2304 2315 };
2305 2316 meta = {
2306 2317 license = [ pkgs.lib.licenses.bsdOriginal ];
2307 2318 };
2308 2319 };
2309 2320 "waitress" = super.buildPythonPackage {
2310 2321 name = "waitress-1.3.1";
2311 2322 doCheck = false;
2312 2323 src = fetchurl {
2313 2324 url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
2314 2325 sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
2315 2326 };
2316 2327 meta = {
2317 2328 license = [ pkgs.lib.licenses.zpl21 ];
2318 2329 };
2319 2330 };
2320 2331 "wcwidth" = super.buildPythonPackage {
2321 2332 name = "wcwidth-0.1.9";
2322 2333 doCheck = false;
2323 2334 src = fetchurl {
2324 2335 url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
2325 2336 sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
2326 2337 };
2327 2338 meta = {
2328 2339 license = [ pkgs.lib.licenses.mit ];
2329 2340 };
2330 2341 };
2331 2342 "webencodings" = super.buildPythonPackage {
2332 2343 name = "webencodings-0.5.1";
2333 2344 doCheck = false;
2334 2345 src = fetchurl {
2335 2346 url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
2336 2347 sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
2337 2348 };
2338 2349 meta = {
2339 2350 license = [ pkgs.lib.licenses.bsdOriginal ];
2340 2351 };
2341 2352 };
2342 2353 "weberror" = super.buildPythonPackage {
2343 2354 name = "weberror-0.13.1";
2344 2355 doCheck = false;
2345 2356 propagatedBuildInputs = [
2346 2357 self."webob"
2347 2358 self."tempita"
2348 2359 self."pygments"
2349 2360 self."paste"
2350 2361 ];
2351 2362 src = fetchurl {
2352 2363 url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
2353 2364 sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
2354 2365 };
2355 2366 meta = {
2356 2367 license = [ pkgs.lib.licenses.mit ];
2357 2368 };
2358 2369 };
2359 2370 "webhelpers2" = super.buildPythonPackage {
2360 2371 name = "webhelpers2-2.0";
2361 2372 doCheck = false;
2362 2373 propagatedBuildInputs = [
2363 2374 self."markupsafe"
2364 2375 self."six"
2365 2376 ];
2366 2377 src = fetchurl {
2367 2378 url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
2368 2379 sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
2369 2380 };
2370 2381 meta = {
2371 2382 license = [ pkgs.lib.licenses.mit ];
2372 2383 };
2373 2384 };
2374 2385 "webob" = super.buildPythonPackage {
2375 2386 name = "webob-1.8.5";
2376 2387 doCheck = false;
2377 2388 src = fetchurl {
2378 2389 url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
2379 2390 sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
2380 2391 };
2381 2392 meta = {
2382 2393 license = [ pkgs.lib.licenses.mit ];
2383 2394 };
2384 2395 };
2385 2396 "webtest" = super.buildPythonPackage {
2386 2397 name = "webtest-2.0.34";
2387 2398 doCheck = false;
2388 2399 propagatedBuildInputs = [
2389 2400 self."six"
2390 2401 self."webob"
2391 2402 self."waitress"
2392 2403 self."beautifulsoup4"
2393 2404 ];
2394 2405 src = fetchurl {
2395 2406 url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
2396 2407 sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
2397 2408 };
2398 2409 meta = {
2399 2410 license = [ pkgs.lib.licenses.mit ];
2400 2411 };
2401 2412 };
2402 2413 "whoosh" = super.buildPythonPackage {
2403 2414 name = "whoosh-2.7.4";
2404 2415 doCheck = false;
2405 2416 src = fetchurl {
2406 2417 url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
2407 2418 sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
2408 2419 };
2409 2420 meta = {
2410 2421 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
2411 2422 };
2412 2423 };
2413 2424 "ws4py" = super.buildPythonPackage {
2414 2425 name = "ws4py-0.5.1";
2415 2426 doCheck = false;
2416 2427 src = fetchurl {
2417 2428 url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
2418 2429 sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
2419 2430 };
2420 2431 meta = {
2421 2432 license = [ pkgs.lib.licenses.bsdOriginal ];
2422 2433 };
2423 2434 };
2424 2435 "wsgiref" = super.buildPythonPackage {
2425 2436 name = "wsgiref-0.1.2";
2426 2437 doCheck = false;
2427 2438 src = fetchurl {
2428 2439 url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
2429 2440 sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
2430 2441 };
2431 2442 meta = {
2432 2443 license = [ { fullName = "PSF or ZPL"; } ];
2433 2444 };
2434 2445 };
2435 2446 "zipp" = super.buildPythonPackage {
2436 2447 name = "zipp-1.2.0";
2437 2448 doCheck = false;
2438 2449 propagatedBuildInputs = [
2439 2450 self."contextlib2"
2440 2451 ];
2441 2452 src = fetchurl {
2442 2453 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2443 2454 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2444 2455 };
2445 2456 meta = {
2446 2457 license = [ pkgs.lib.licenses.mit ];
2447 2458 };
2448 2459 };
2449 2460 "zope.cachedescriptors" = super.buildPythonPackage {
2450 2461 name = "zope.cachedescriptors-4.3.1";
2451 2462 doCheck = false;
2452 2463 propagatedBuildInputs = [
2453 2464 self."setuptools"
2454 2465 ];
2455 2466 src = fetchurl {
2456 2467 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2457 2468 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2458 2469 };
2459 2470 meta = {
2460 2471 license = [ pkgs.lib.licenses.zpl21 ];
2461 2472 };
2462 2473 };
2463 2474 "zope.deprecation" = super.buildPythonPackage {
2464 2475 name = "zope.deprecation-4.4.0";
2465 2476 doCheck = false;
2466 2477 propagatedBuildInputs = [
2467 2478 self."setuptools"
2468 2479 ];
2469 2480 src = fetchurl {
2470 2481 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2471 2482 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2472 2483 };
2473 2484 meta = {
2474 2485 license = [ pkgs.lib.licenses.zpl21 ];
2475 2486 };
2476 2487 };
2477 2488 "zope.event" = super.buildPythonPackage {
2478 2489 name = "zope.event-4.4";
2479 2490 doCheck = false;
2480 2491 propagatedBuildInputs = [
2481 2492 self."setuptools"
2482 2493 ];
2483 2494 src = fetchurl {
2484 2495 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2485 2496 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2486 2497 };
2487 2498 meta = {
2488 2499 license = [ pkgs.lib.licenses.zpl21 ];
2489 2500 };
2490 2501 };
2491 2502 "zope.interface" = super.buildPythonPackage {
2492 2503 name = "zope.interface-4.6.0";
2493 2504 doCheck = false;
2494 2505 propagatedBuildInputs = [
2495 2506 self."setuptools"
2496 2507 ];
2497 2508 src = fetchurl {
2498 2509 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2499 2510 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2500 2511 };
2501 2512 meta = {
2502 2513 license = [ pkgs.lib.licenses.zpl21 ];
2503 2514 };
2504 2515 };
2505 2516
2506 2517 ### Test requirements
2507 2518
2508 2519
2509 2520 }
@@ -1,1 +1,1 b''
1 4.24.1 No newline at end of file
1 4.25.0 No newline at end of file
@@ -1,2524 +1,2524 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import time
23 23
24 24 import rhodecode
25 25 from rhodecode.api import (
26 26 jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
27 27 from rhodecode.api.utils import (
28 28 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
29 29 get_user_group_or_error, get_user_or_error, validate_repo_permissions,
30 30 get_perm_or_error, parse_args, get_origin, build_commit_data,
31 31 validate_set_owner_permissions)
32 32 from rhodecode.lib import audit_logger, rc_cache, channelstream
33 33 from rhodecode.lib import repo_maintenance
34 34 from rhodecode.lib.auth import (
35 35 HasPermissionAnyApi, HasUserGroupPermissionAnyApi,
36 36 HasRepoPermissionAnyApi)
37 37 from rhodecode.lib.celerylib.utils import get_task_id
38 38 from rhodecode.lib.utils2 import (
39 39 str2bool, time_to_datetime, safe_str, safe_int, safe_unicode)
40 40 from rhodecode.lib.ext_json import json
41 41 from rhodecode.lib.exceptions import (
42 42 StatusChangeOnClosedPullRequestError, CommentVersionMismatch)
43 43 from rhodecode.lib.vcs import RepositoryError
44 44 from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
45 45 from rhodecode.model.changeset_status import ChangesetStatusModel
46 46 from rhodecode.model.comment import CommentsModel
47 47 from rhodecode.model.db import (
48 48 Session, ChangesetStatus, RepositoryField, Repository, RepoGroup,
49 49 ChangesetComment)
50 50 from rhodecode.model.permission import PermissionModel
51 51 from rhodecode.model.pull_request import PullRequestModel
52 52 from rhodecode.model.repo import RepoModel
53 53 from rhodecode.model.scm import ScmModel, RepoList
54 54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
55 55 from rhodecode.model import validation_schema
56 56 from rhodecode.model.validation_schema.schemas import repo_schema
57 57
58 58 log = logging.getLogger(__name__)
59 59
60 60
61 61 @jsonrpc_method()
62 62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
63 63 """
64 64 Gets an existing repository by its name or repository_id.
65 65
66 66 The members section of the output returns user groups or users
67 67 associated with that repository.
68 68
69 69 This command can only be run using an |authtoken| with admin rights,
70 70 or users with at least read rights to the |repo|.
71 71
72 72 :param apiuser: This is filled automatically from the |authtoken|.
73 73 :type apiuser: AuthUser
74 74 :param repoid: The repository name or repository id.
75 75 :type repoid: str or int
76 76 :param cache: use the cached value for last changeset
77 77 :type cache: Optional(bool)
78 78
79 79 Example output:
80 80
81 81 .. code-block:: bash
82 82
83 83 {
84 84 "error": null,
85 85 "id": <repo_id>,
86 86 "result": {
87 87 "clone_uri": null,
88 88 "created_on": "timestamp",
89 89 "description": "repo description",
90 90 "enable_downloads": false,
91 91 "enable_locking": false,
92 92 "enable_statistics": false,
93 93 "followers": [
94 94 {
95 95 "active": true,
96 96 "admin": false,
97 97 "api_key": "****************************************",
98 98 "api_keys": [
99 99 "****************************************"
100 100 ],
101 101 "email": "user@example.com",
102 102 "emails": [
103 103 "user@example.com"
104 104 ],
105 105 "extern_name": "rhodecode",
106 106 "extern_type": "rhodecode",
107 107 "firstname": "username",
108 108 "ip_addresses": [],
109 109 "language": null,
110 110 "last_login": "2015-09-16T17:16:35.854",
111 111 "lastname": "surname",
112 112 "user_id": <user_id>,
113 113 "username": "name"
114 114 }
115 115 ],
116 116 "fork_of": "parent-repo",
117 117 "landing_rev": [
118 118 "rev",
119 119 "tip"
120 120 ],
121 121 "last_changeset": {
122 122 "author": "User <user@example.com>",
123 123 "branch": "default",
124 124 "date": "timestamp",
125 125 "message": "last commit message",
126 126 "parents": [
127 127 {
128 128 "raw_id": "commit-id"
129 129 }
130 130 ],
131 131 "raw_id": "commit-id",
132 132 "revision": <revision number>,
133 133 "short_id": "short id"
134 134 },
135 135 "lock_reason": null,
136 136 "locked_by": null,
137 137 "locked_date": null,
138 138 "owner": "owner-name",
139 139 "permissions": [
140 140 {
141 141 "name": "super-admin-name",
142 142 "origin": "super-admin",
143 143 "permission": "repository.admin",
144 144 "type": "user"
145 145 },
146 146 {
147 147 "name": "owner-name",
148 148 "origin": "owner",
149 149 "permission": "repository.admin",
150 150 "type": "user"
151 151 },
152 152 {
153 153 "name": "user-group-name",
154 154 "origin": "permission",
155 155 "permission": "repository.write",
156 156 "type": "user_group"
157 157 }
158 158 ],
159 159 "private": true,
160 160 "repo_id": 676,
161 161 "repo_name": "user-group/repo-name",
162 162 "repo_type": "hg"
163 163 }
164 164 }
165 165 """
166 166
167 167 repo = get_repo_or_error(repoid)
168 168 cache = Optional.extract(cache)
169 169
170 170 include_secrets = False
171 171 if has_superadmin_permission(apiuser):
172 172 include_secrets = True
173 173 else:
174 174 # check if we have at least read permission for this repo !
175 175 _perms = (
176 176 'repository.admin', 'repository.write', 'repository.read',)
177 177 validate_repo_permissions(apiuser, repoid, repo, _perms)
178 178
179 179 permissions = []
180 180 for _user in repo.permissions():
181 181 user_data = {
182 182 'name': _user.username,
183 183 'permission': _user.permission,
184 184 'origin': get_origin(_user),
185 185 'type': "user",
186 186 }
187 187 permissions.append(user_data)
188 188
189 189 for _user_group in repo.permission_user_groups():
190 190 user_group_data = {
191 191 'name': _user_group.users_group_name,
192 192 'permission': _user_group.permission,
193 193 'origin': get_origin(_user_group),
194 194 'type': "user_group",
195 195 }
196 196 permissions.append(user_group_data)
197 197
198 198 following_users = [
199 199 user.user.get_api_data(include_secrets=include_secrets)
200 200 for user in repo.followers]
201 201
202 202 if not cache:
203 203 repo.update_commit_cache()
204 204 data = repo.get_api_data(include_secrets=include_secrets)
205 205 data['permissions'] = permissions
206 206 data['followers'] = following_users
207 207 return data
208 208
209 209
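As a usage sketch for the `get_repo` call above: RhodeCode's JSON-RPC API takes a flat payload with `id`, `auth_token`, `method` and `args` keys. The helper below only builds that payload; the token value is a placeholder and the transport (an HTTP POST to the instance's API endpoint) is assumed, not shown.

```python
import json

# Hedged sketch (not part of this module): build the JSON-RPC payload for the
# `get_repo` method documented above. `args` mirrors the method's parameters;
# `repoid` may be a repository name or numeric id.
def build_get_repo_payload(auth_token, repoid, cache=True, call_id=1):
    return json.dumps({
        'id': call_id,
        'auth_token': auth_token,
        'method': 'get_repo',
        'args': {'repoid': repoid, 'cache': cache},
    })

payload = build_get_repo_payload('<auth-token>', 'user-group/repo-name')
```

The same payload shape works for the other `@jsonrpc_method()` functions in this module; only `method` and `args` change.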
210 210 @jsonrpc_method()
211 211 def get_repos(request, apiuser, root=Optional(None), traverse=Optional(True)):
212 212 """
213 213 Lists all existing repositories.
214 214
215 215 This command can only be run using an |authtoken| with admin rights,
216 216 or users with at least read rights to |repos|.
217 217
218 218 :param apiuser: This is filled automatically from the |authtoken|.
219 219 :type apiuser: AuthUser
220 220 :param root: specify root repository group to fetch repositories.
221 221 filters the returned repositories to be members of given root group.
222 222 :type root: Optional(None)
223 223 :param traverse: traverse given root into subrepositories. With this flag
224 224 set to False, it will only return top-level repositories from `root`.
225 225 If root is empty, it will return just top-level repositories.
226 226 :type traverse: Optional(True)
227 227
228 228
229 229 Example output:
230 230
231 231 .. code-block:: bash
232 232
233 233 id : <id_given_in_input>
234 234 result: [
235 235 {
236 236 "repo_id" : "<repo_id>",
237 237 "repo_name" : "<reponame>"
238 238 "repo_type" : "<repo_type>",
239 239 "clone_uri" : "<clone_uri>",
240 240 "private": : "<bool>",
241 241 "created_on" : "<datetimecreated>",
242 242 "description" : "<description>",
243 243 "landing_rev": "<landing_rev>",
244 244 "owner": "<repo_owner>",
245 245 "fork_of": "<name_of_fork_parent>",
246 246 "enable_downloads": "<bool>",
247 247 "enable_locking": "<bool>",
248 248 "enable_statistics": "<bool>",
249 249 },
250 250 ...
251 251 ]
252 252 error: null
253 253 """
254 254
255 255 include_secrets = has_superadmin_permission(apiuser)
256 256 _perms = ('repository.read', 'repository.write', 'repository.admin',)
257 257 extras = {'user': apiuser}
258 258
259 259 root = Optional.extract(root)
260 260 traverse = Optional.extract(traverse, binary=True)
261 261
262 262 if root:
263 263 # verify parent existence, if it's empty return an error
264 264 parent = RepoGroup.get_by_group_name(root)
265 265 if not parent:
266 266 raise JSONRPCError(
267 267 'Root repository group `{}` does not exist'.format(root))
268 268
269 269 if traverse:
270 270 repos = RepoModel().get_repos_for_root(root=root, traverse=traverse)
271 271 else:
272 272 repos = RepoModel().get_repos_for_root(root=parent)
273 273 else:
274 274 if traverse:
275 275 repos = RepoModel().get_all()
276 276 else:
277 277 # return just top-level
278 278 repos = RepoModel().get_repos_for_root(root=None)
279 279
280 280 repo_list = RepoList(repos, perm_set=_perms, extra_kwargs=extras)
281 281 return [repo.get_api_data(include_secrets=include_secrets)
282 282 for repo in repo_list]
283 283
284 284
285 285 @jsonrpc_method()
286 286 def get_repo_changeset(request, apiuser, repoid, revision,
287 287 details=Optional('basic')):
288 288 """
289 289 Returns information about a changeset.
290 290
291 291 Additional parameters define the amount of detail returned by
292 292 this function.
293 293
294 294 This command can only be run using an |authtoken| with admin rights,
295 295 or users with at least read rights to the |repo|.
296 296
297 297 :param apiuser: This is filled automatically from the |authtoken|.
298 298 :type apiuser: AuthUser
299 299 :param repoid: The repository name or repository id
300 300 :type repoid: str or int
301 301 :param revision: revision for which listing should be done
302 302 :type revision: str
303 303 :param details: Set the level of detail returned. Valid options are
304 304 ``basic``, ``extended`` and ``full``; ``full`` also returns the diff itself and the number of changed files.
305 305 :type details: Optional(str)
306 306
307 307 """
308 308 repo = get_repo_or_error(repoid)
309 309 if not has_superadmin_permission(apiuser):
310 310 _perms = ('repository.admin', 'repository.write', 'repository.read',)
311 311 validate_repo_permissions(apiuser, repoid, repo, _perms)
312 312
313 313 changes_details = Optional.extract(details)
314 314 _changes_details_types = ['basic', 'extended', 'full']
315 315 if changes_details not in _changes_details_types:
316 316 raise JSONRPCError(
317 317 'details must be one of %s' % (
318 318 ','.join(_changes_details_types)))
319 319
320 320 vcs_repo = repo.scm_instance()
321 321 pre_load = ['author', 'branch', 'date', 'message', 'parents',
322 322 'status', '_commit', '_file_paths']
323 323
324 324 try:
325 325 commit = repo.get_commit(commit_id=revision, pre_load=pre_load)
326 326 except TypeError as e:
327 327 raise JSONRPCError(safe_str(e))
328 328 _cs_json = commit.__json__()
329 329 _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
330 330 if changes_details == 'full':
331 331 _cs_json['refs'] = commit._get_refs()
332 332 return _cs_json
333 333
334 334
335 335 @jsonrpc_method()
336 336 def get_repo_changesets(request, apiuser, repoid, start_rev, limit,
337 337 details=Optional('basic')):
338 338 """
339 339 Returns a set of commits limited by the number starting
340 340 from the `start_rev` option.
341 341
342 342 Additional parameters define the amount of details returned by this
343 343 function.
344 344
345 345 This command can only be run using an |authtoken| with admin rights,
346 346 or users with at least read rights to |repos|.
347 347
348 348 :param apiuser: This is filled automatically from the |authtoken|.
349 349 :type apiuser: AuthUser
350 350 :param repoid: The repository name or repository ID.
351 351 :type repoid: str or int
352 352 :param start_rev: The starting revision from where to get changesets.
353 353 :type start_rev: str
354 354 :param limit: Limit the number of commits to this amount
355 355 :type limit: str or int
356 356 :param details: Set the level of detail returned. Valid options are:
357 357 ``basic``, ``extended`` and ``full``.
358 358 :type details: Optional(str)
359 359
360 360 .. note::
361 361
362 362 Setting the parameter `details` to the value ``full`` is extensive
363 363 and returns details like the diff itself, and the number
364 364 of changed files.
365 365
366 366 """
367 367 repo = get_repo_or_error(repoid)
368 368 if not has_superadmin_permission(apiuser):
369 369 _perms = ('repository.admin', 'repository.write', 'repository.read',)
370 370 validate_repo_permissions(apiuser, repoid, repo, _perms)
371 371
372 372 changes_details = Optional.extract(details)
373 373 _changes_details_types = ['basic', 'extended', 'full']
374 374 if changes_details not in _changes_details_types:
375 375 raise JSONRPCError(
376 376 'details must be one of %s' % (
377 377 ','.join(_changes_details_types)))
378 378
379 379 limit = int(limit)
380 380 pre_load = ['author', 'branch', 'date', 'message', 'parents',
381 381 'status', '_commit', '_file_paths']
382 382
383 383 vcs_repo = repo.scm_instance()
384 384 # SVN needs a special case to distinguish its index and commit id
385 385 if vcs_repo and vcs_repo.alias == 'svn' and (start_rev == '0'):
386 386 start_rev = vcs_repo.commit_ids[0]
387 387
388 388 try:
389 389 commits = vcs_repo.get_commits(
390 390 start_id=start_rev, pre_load=pre_load, translate_tags=False)
391 391 except TypeError as e:
392 392 raise JSONRPCError(safe_str(e))
393 393 except Exception:
394 394 log.exception('Fetching of commits failed')
395 395 raise JSONRPCError('Error occurred during commit fetching')
396 396
397 397 ret = []
398 398 for cnt, commit in enumerate(commits):
399 399 if cnt >= limit != -1:
400 400 break
401 401 _cs_json = commit.__json__()
402 402 _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
403 403 if changes_details == 'full':
404 404 _cs_json['refs'] = {
405 405 'branches': [commit.branch],
406 406 'bookmarks': getattr(commit, 'bookmarks', []),
407 407 'tags': commit.tags
408 408 }
409 409 ret.append(_cs_json)
410 410 return ret
411 411
412 412
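The loop in `get_repo_changesets` stops with `if cnt >= limit != -1: break`. Python chains comparisons, so this is equivalent to `(cnt >= limit) and (limit != -1)`, making `-1` mean "no limit". A minimal expansion, purely illustrative:

```python
# Expansion of the chained comparison `cnt >= limit != -1` used above:
# equivalent to (cnt >= limit) and (limit != -1), so limit == -1 never stops.
def should_stop(cnt, limit):
    return cnt >= limit and limit != -1
```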
413 413 @jsonrpc_method()
414 414 def get_repo_nodes(request, apiuser, repoid, revision, root_path,
415 415 ret_type=Optional('all'), details=Optional('basic'),
416 416 max_file_bytes=Optional(None)):
417 417 """
418 418 Returns a list of nodes and children in a flat list for a given
419 419 path at given revision.
420 420
421 421 It's possible to specify ret_type to show only `files` or `dirs`.
422 422
423 423 This command can only be run using an |authtoken| with admin rights,
424 424 or users with at least read rights to |repos|.
425 425
426 426 :param apiuser: This is filled automatically from the |authtoken|.
427 427 :type apiuser: AuthUser
428 428 :param repoid: The repository name or repository ID.
429 429 :type repoid: str or int
430 430 :param revision: The revision for which listing should be done.
431 431 :type revision: str
432 432 :param root_path: The path from which to start displaying.
433 433 :type root_path: str
434 434 :param ret_type: Set the return type. Valid options are
435 435 ``all`` (default), ``files`` and ``dirs``.
436 436 :type ret_type: Optional(str)
437 437 :param details: Returns extended information about nodes, such as
438 438 md5, binary, and/or content.
439 439 The valid options are ``basic`` and ``full``.
440 440 :type details: Optional(str)
441 441 :param max_file_bytes: Only return file content for files under this size in bytes.
442 442 :type max_file_bytes: Optional(int)
443 443
444 444 Example output:
445 445
446 446 .. code-block:: bash
447 447
448 448 id : <id_given_in_input>
449 449 result: [
450 450 {
451 451 "binary": false,
452 452 "content": "File line",
453 453 "extension": "md",
454 454 "lines": 2,
455 455 "md5": "059fa5d29b19c0657e384749480f6422",
456 456 "mimetype": "text/x-minidsrc",
457 457 "name": "file.md",
458 458 "size": 580,
459 459 "type": "file"
460 460 },
461 461 ...
462 462 ]
463 463 error: null
464 464 """
465 465
466 466 repo = get_repo_or_error(repoid)
467 467 if not has_superadmin_permission(apiuser):
468 468 _perms = ('repository.admin', 'repository.write', 'repository.read',)
469 469 validate_repo_permissions(apiuser, repoid, repo, _perms)
470 470
471 471 ret_type = Optional.extract(ret_type)
472 472 details = Optional.extract(details)
473 473 _extended_types = ['basic', 'full']
474 474 if details not in _extended_types:
475 475 raise JSONRPCError('details must be one of %s' % (','.join(_extended_types)))
476 476 extended_info = False
477 477 content = False
478 478 if details == 'basic':
479 479 extended_info = True
480 480
481 481 if details == 'full':
482 482 extended_info = content = True
483 483
484 484 _map = {}
485 485 try:
486 486 # check if repo is not empty by any chance, skip quicker if it is.
487 487 _scm = repo.scm_instance()
488 488 if _scm.is_empty():
489 489 return []
490 490
491 491 _d, _f = ScmModel().get_nodes(
492 492 repo, revision, root_path, flat=False,
493 493 extended_info=extended_info, content=content,
494 494 max_file_bytes=max_file_bytes)
495 495 _map = {
496 496 'all': _d + _f,
497 497 'files': _f,
498 498 'dirs': _d,
499 499 }
500 500 return _map[ret_type]
501 501 except KeyError:
502 502 raise JSONRPCError(
503 503 'ret_type must be one of %s' % (','.join(sorted(_map.keys()))))
504 504 except Exception:
505 505 log.exception("Exception occurred while trying to get repo nodes")
506 506 raise JSONRPCError(
507 507 'failed to get repo: `%s` nodes' % repo.repo_name
508 508 )
509 509
510 510
511 511 @jsonrpc_method()
512 512 def get_repo_file(request, apiuser, repoid, commit_id, file_path,
513 513 max_file_bytes=Optional(None), details=Optional('basic'),
514 514 cache=Optional(True)):
515 515 """
516 516 Returns a single file from repository at given revision.
517 517
518 518 This command can only be run using an |authtoken| with admin rights,
519 519 or users with at least read rights to |repos|.
520 520
521 521 :param apiuser: This is filled automatically from the |authtoken|.
522 522 :type apiuser: AuthUser
523 523 :param repoid: The repository name or repository ID.
524 524 :type repoid: str or int
525 525 :param commit_id: The revision for which listing should be done.
526 526 :type commit_id: str
527 527 :param file_path: The path from which to start displaying.
528 528 :type file_path: str
529 529 :param details: Returns a different set of information about nodes.
530 530 The valid options are ``minimal``, ``minimal+search``, ``basic`` and ``full``.
531 531 :type details: Optional(str)
532 532 :param max_file_bytes: Only return file content for files under this size in bytes.
533 533 :type max_file_bytes: Optional(int)
534 534 :param cache: Use internal caches for fetching files. If disabled fetching
535 535 files is slower but more memory efficient
536 536 :type cache: Optional(bool)
537 537
538 538 Example output:
539 539
540 540 .. code-block:: bash
541 541
542 542 id : <id_given_in_input>
543 543 result: {
544 544 "binary": false,
545 545 "extension": "py",
546 546 "lines": 35,
547 547 "content": "....",
548 548 "md5": "76318336366b0f17ee249e11b0c99c41",
549 549 "mimetype": "text/x-python",
550 550 "name": "python.py",
551 551 "size": 817,
552 552 "type": "file",
553 553 }
554 554 error: null
555 555 """
556 556
557 557 repo = get_repo_or_error(repoid)
558 558 if not has_superadmin_permission(apiuser):
559 559 _perms = ('repository.admin', 'repository.write', 'repository.read',)
560 560 validate_repo_permissions(apiuser, repoid, repo, _perms)
561 561
562 562 cache = Optional.extract(cache, binary=True)
563 563 details = Optional.extract(details)
564 564 _extended_types = ['minimal', 'minimal+search', 'basic', 'full']
565 565 if details not in _extended_types:
566 566 raise JSONRPCError(
567 567 'details must be one of %s, got %s' % (','.join(_extended_types), details))
568 568 extended_info = False
569 569 content = False
570 570
571 571 if details == 'minimal':
572 572 extended_info = False
573 573
574 574 elif details == 'basic':
575 575 extended_info = True
576 576
577 577 elif details == 'full':
578 578 extended_info = content = True
579 579
580 580 file_path = safe_unicode(file_path)
581 581 try:
582 582 # check if repo is not empty by any chance, skip quicker if it is.
583 583 _scm = repo.scm_instance()
584 584 if _scm.is_empty():
585 585 return None
586 586
587 587 node = ScmModel().get_node(
588 588 repo, commit_id, file_path, extended_info=extended_info,
589 589 content=content, max_file_bytes=max_file_bytes, cache=cache)
590 590 except NodeDoesNotExistError:
591 591 raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
592 592 repo.repo_name, file_path, commit_id))
593 593 except Exception:
594 594 log.exception(u"Exception occurred while trying to get repo %s file",
595 595 repo.repo_name)
596 596 raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
597 597 repo.repo_name, file_path))
598 598
599 599 return node
600 600
601 601
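The branching in `get_repo_file` below maps the `details` level onto two internal fetch flags. A standalone mirror of that mapping (illustrative only, not a public API; the error message follows the handler's own wording):

```python
# Illustrative mapping of get_repo_file's `details` levels to its internal
# fetch flags: `extended_info` adds node metadata, `content` adds file content.
def detail_flags(details):
    levels = ('minimal', 'minimal+search', 'basic', 'full')
    if details not in levels:
        raise ValueError(
            'details must be one of %s, got %s' % (','.join(levels), details))
    extended_info = details in ('basic', 'full')
    content = details == 'full'
    return extended_info, content
```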
602 602 @jsonrpc_method()
603 603 def get_repo_fts_tree(request, apiuser, repoid, commit_id, root_path):
604 604 """
605 605 Returns a list of tree nodes for the path at the given revision. This API is
606 606 built strictly for full-text search indexing and shouldn't be consumed directly.
607 607
608 608 This command can only be run using an |authtoken| with admin rights,
609 609 or users with at least read rights to |repos|.
610 610
611 611 """
612 612
613 613 repo = get_repo_or_error(repoid)
614 614 if not has_superadmin_permission(apiuser):
615 615 _perms = ('repository.admin', 'repository.write', 'repository.read',)
616 616 validate_repo_permissions(apiuser, repoid, repo, _perms)
617 617
618 618 repo_id = repo.repo_id
619 619 cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
620 620 cache_on = cache_seconds > 0
621 621
622 622 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
623 623 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
624 624
625 625 def compute_fts_tree(cache_ver, repo_id, commit_id, root_path):
626 626 return ScmModel().get_fts_data(repo_id, commit_id, root_path)
627 627
628 628 try:
629 629 # check if repo is not empty by any chance, skip quicker if it is.
630 630 _scm = repo.scm_instance()
631 631 if _scm.is_empty():
632 632 return []
633 633 except RepositoryError:
634 634 log.exception("Exception occurred while trying to get repo nodes")
635 635 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
636 636
637 637 try:
638 638 # we need to resolve commit_id to a FULL sha for cache to work correctly.
639 639 # sending 'master' is a pointer that needs to be translated to current commit.
640 640 commit_id = _scm.get_commit(commit_id=commit_id).raw_id
641 641 log.debug(
642 642 'Computing FTS REPO TREE for repo_id %s commit_id `%s` '
643 643 'with caching: %s[TTL: %ss]' % (
644 644 repo_id, commit_id, cache_on, cache_seconds or 0))
645 645
646 646 tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path)
647 647 return tree_files
648 648
649 649 except Exception:
650 650 log.exception("Exception occurred while trying to get repo nodes")
651 651 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
652 652
653 653
654 654 @jsonrpc_method()
655 655 def get_repo_refs(request, apiuser, repoid):
656 656 """
657 657 Returns a dictionary of current references. It returns
658 658 bookmarks, branches, closed_branches, and tags for the given repository.
659 659
661 661
662 662 This command can only be run using an |authtoken| with admin rights,
663 663 or users with at least read rights to |repos|.
664 664
665 665 :param apiuser: This is filled automatically from the |authtoken|.
666 666 :type apiuser: AuthUser
667 667 :param repoid: The repository name or repository ID.
668 668 :type repoid: str or int
669 669
670 670 Example output:
671 671
672 672 .. code-block:: bash
673 673
674 674 id : <id_given_in_input>
675 675 "result": {
676 676 "bookmarks": {
677 677 "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
678 678 "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
679 679 },
680 680 "branches": {
681 681 "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
682 682 "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
683 683 },
684 684 "branches_closed": {},
685 685 "tags": {
686 686 "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
687 687 "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
688 688 "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
689 689 "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17",
690 690 }
691 691 }
692 692 error: null
693 693 """
694 694
695 695 repo = get_repo_or_error(repoid)
696 696 if not has_superadmin_permission(apiuser):
697 697 _perms = ('repository.admin', 'repository.write', 'repository.read',)
698 698 validate_repo_permissions(apiuser, repoid, repo, _perms)
699 699
700 700 try:
701 701 # check if repo is not empty by any chance, skip quicker if it is.
702 702 vcs_instance = repo.scm_instance()
703 703 refs = vcs_instance.refs()
704 704 return refs
705 705 except Exception:
706 706 log.exception("Exception occurred while trying to get repo refs")
707 707 raise JSONRPCError(
708 708 'failed to get repo: `%s` references' % repo.repo_name
709 709 )
710 710
711 711
712 712 @jsonrpc_method()
713 713 def create_repo(
714 714 request, apiuser, repo_name, repo_type,
715 715 owner=Optional(OAttr('apiuser')),
716 716 description=Optional(''),
717 717 private=Optional(False),
718 718 clone_uri=Optional(None),
719 719 push_uri=Optional(None),
720 720 landing_rev=Optional(None),
721 721 enable_statistics=Optional(False),
722 722 enable_locking=Optional(False),
723 723 enable_downloads=Optional(False),
724 724 copy_permissions=Optional(False)):
725 725 """
726 726 Creates a repository.
727 727
728 728 * If the repository name contains "/", the repository will be created inside
729 729 a repository group or nested repository groups
730 730
731 731 For example "foo/bar/repo1" will create |repo| called "repo1" inside
732 732 group "foo/bar". You have to have permissions to access and write to
733 733 the last repository group ("bar" in this example)
734 734
735 735 This command can only be run using an |authtoken| with at least
736 736 permissions to create repositories, or write permissions to
737 737 parent repository groups.
738 738
739 739 :param apiuser: This is filled automatically from the |authtoken|.
740 740 :type apiuser: AuthUser
741 741 :param repo_name: Set the repository name.
742 742 :type repo_name: str
743 743 :param repo_type: Set the repository type; 'hg','git', or 'svn'.
744 744 :type repo_type: str
745 745 :param owner: user_id or username
746 746 :type owner: Optional(str)
747 747 :param description: Set the repository description.
748 748 :type description: Optional(str)
749 749 :param private: set repository as private
750 750 :type private: bool
751 751 :param clone_uri: set clone_uri
752 752 :type clone_uri: str
753 753 :param push_uri: set push_uri
754 754 :type push_uri: str
755 755 :param landing_rev: <rev_type>:<rev>, e.g. branch:default, book:dev, rev:abcd
756 756 :type landing_rev: str
757 757 :param enable_locking:
758 758 :type enable_locking: bool
759 759 :param enable_downloads:
760 760 :type enable_downloads: bool
761 761 :param enable_statistics:
762 762 :type enable_statistics: bool
763 763 :param copy_permissions: Copy permission from group in which the
764 764 repository is being created.
765 765 :type copy_permissions: bool
766 766
767 767
768 768 Example output:
769 769
770 770 .. code-block:: bash
771 771
772 772 id : <id_given_in_input>
773 773 result: {
774 774 "msg": "Created new repository `<reponame>`",
775 775 "success": true,
776 776 "task": "<celery task id or None if done sync>"
777 777 }
778 778 error: null
779 779
780 780
781 781 Example error output:
782 782
783 783 .. code-block:: bash
784 784
785 785 id : <id_given_in_input>
786 786 result : null
787 787 error : {
788 788 'failed to create repository `<repo_name>`'
789 789 }
790 790
791 791 """

    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)
    push_uri = Optional.extract(push_uri)

    defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
    if isinstance(private, Optional):
        private = defs.get('repo_private') or Optional.extract(private)
    if isinstance(repo_type, Optional):
        repo_type = defs.get('repo_type')
    if isinstance(enable_statistics, Optional):
        enable_statistics = defs.get('repo_enable_statistics')
    if isinstance(enable_locking, Optional):
        enable_locking = defs.get('repo_enable_locking')
    if isinstance(enable_downloads, Optional):
        enable_downloads = defs.get('repo_enable_downloads')

    landing_ref, _label = ScmModel.backend_landing_ref(repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))

    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=repo_name,
            repo_type=repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_push_uri=push_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions,
            repo_enable_statistics=enable_statistics,
            repo_enable_downloads=enable_downloads,
            repo_enable_locking=enable_locking))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'owner': owner,
            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'repo_description': schema_data['repo_description'],
            'repo_private': schema_data['repo_private'],
            'clone_uri': schema_data['repo_clone_uri'],
            'push_uri': schema_data['repo_push_uri'],
            'repo_landing_rev': schema_data['repo_landing_commit_ref'],
            'enable_statistics': schema_data['repo_enable_statistics'],
            'enable_locking': schema_data['repo_enable_locking'],
            'enable_downloads': schema_data['repo_enable_downloads'],
            'repo_copy_permissions': schema_data['repo_copy_permissions'],
        }

        task = RepoModel().create(form_data=data, cur_user=owner.user_id)
        task_id = get_task_id(task)
        # no commit, it's done in RepoModel, or async via celery
        return {
            'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
            'success': True,  # cannot return the repo data here since fork
                              # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create the repository %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to create repository `%s`' % (schema_data['repo_name'],))


@jsonrpc_method()
def add_field_to_repo(request, apiuser, repoid, key, label=Optional(''),
                      description=Optional('')):
    """
    Adds an extra field to a repository.

    This command can only be run using an |authtoken| with
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository id.
    :type repoid: str or int
    :param key: Create a unique field key for this repository.
    :type key: str
    :param label: Optional label for the field; defaults to the key.
    :type label: Optional(str)
    :param description: Optional description of the field.
    :type description: Optional(str)
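
    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : {
        "msg": "Added new repository field `<field_key>`",
        "success": true
      }
      error : null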
    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    label = Optional.extract(label) or key
    description = Optional.extract(description)

    field = RepositoryField.get_by_key_name(key, repo)
    if field:
        raise JSONRPCError('Field with key '
                           '`%s` already exists for repo `%s`' % (key, repoid))

    try:
        RepoModel().add_repo_field(repo, key, field_label=label,
                                   field_desc=description)
        Session().commit()
        return {
            'msg': "Added new repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception("Exception occurred while trying to add field to repo")
        raise JSONRPCError(
            'failed to create new field for repository `%s`' % (repoid,))


@jsonrpc_method()
def remove_field_from_repo(request, apiuser, repoid, key):
    """
    Removes an extra field from a repository.

    This command can only be run using an |authtoken| with
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param key: Set the unique field key for this repository.
    :type key: str
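
    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : {
        "msg": "Deleted repository field `<field_key>`",
        "success": true
      }
      error : null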
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    field = RepositoryField.get_by_key_name(key, repo)
    if not field:
        raise JSONRPCError('Field with key `%s` does not '
                           'exist for repo `%s`' % (key, repoid))

    try:
        RepoModel().delete_repo_field(repo, field_key=key)
        Session().commit()
        return {
            'msg': "Deleted repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to delete field from repo")
        raise JSONRPCError(
            'failed to delete field for repository `%s`' % (repoid,))


@jsonrpc_method()
def update_repo(
        request, apiuser, repoid, repo_name=Optional(None),
        owner=Optional(OAttr('apiuser')), description=Optional(''),
        private=Optional(False),
        clone_uri=Optional(None), push_uri=Optional(None),
        landing_rev=Optional(None), fork_of=Optional(None),
        enable_statistics=Optional(False),
        enable_locking=Optional(False),
        enable_downloads=Optional(False), fields=Optional('')):
    """
    Updates a repository with the given information.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    * If the repository name contains "/", the repository will be updated
      accordingly with a repository group or nested repository groups.

      For example repoid=repo-test name="foo/bar/repo-test" will update the
      |repo| called "repo-test" and place it inside group "foo/bar".
      You have to have permissions to access and write to the last repository
      group ("bar" in this example).

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: repository name or repository ID.
    :type repoid: str or int
    :param repo_name: Update the |repo| name, including the
        repository group it's in.
    :type repo_name: str
    :param owner: Set the |repo| owner.
    :type owner: str
    :param fork_of: Set the |repo| as a fork of another |repo|.
    :type fork_of: str
    :param description: Update the |repo| description.
    :type description: str
    :param private: Set the |repo| as private. (True | False)
    :type private: bool
    :param clone_uri: Update the |repo| clone URI.
    :type clone_uri: str
    :param landing_rev: Set the |repo| landing revision, e.g. branch:default, book:dev, rev:abcd
    :type landing_rev: str
    :param enable_statistics: Enable statistics on the |repo|, (True | False).
    :type enable_statistics: bool
    :param enable_locking: Enable |repo| locking.
    :type enable_locking: bool
    :param enable_downloads: Enable downloads from the |repo|, (True | False).
    :type enable_downloads: bool
    :param fields: Add extra fields to the |repo|. Use the following
        example format: ``field_key=field_val,field_key2=fieldval2``.
        Escape ', ' with \,
    :type fields: str
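
    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : {
        "msg": "updated repo ID:<repo_id> <reponame>",
        "repository": <updated repository object>
      }
      error : null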
    """

    repo = get_repo_or_error(repoid)

    include_secrets = False
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    else:
        include_secrets = True

    updates = dict(
        repo_name=repo_name
        if not isinstance(repo_name, Optional) else repo.repo_name,

        fork_id=fork_of
        if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,

        user=owner
        if not isinstance(owner, Optional) else repo.user.username,

        repo_description=description
        if not isinstance(description, Optional) else repo.description,

        repo_private=private
        if not isinstance(private, Optional) else repo.private,

        clone_uri=clone_uri
        if not isinstance(clone_uri, Optional) else repo.clone_uri,

        push_uri=push_uri
        if not isinstance(push_uri, Optional) else repo.push_uri,

        repo_landing_rev=landing_rev
        if not isinstance(landing_rev, Optional) else repo._landing_revision,

        repo_enable_statistics=enable_statistics
        if not isinstance(enable_statistics, Optional) else repo.enable_statistics,

        repo_enable_locking=enable_locking
        if not isinstance(enable_locking, Optional) else repo.enable_locking,

        repo_enable_downloads=enable_downloads
        if not isinstance(enable_downloads, Optional) else repo.enable_downloads)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(
        request.translate, repo=repo)
    ref_choices = list(set(ref_choices + [landing_ref]))

    old_values = repo.get_api_data()
    repo_type = repo.repo_type
    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser,
        old_values=old_values)
    try:
        schema_data = schema.deserialize(dict(
            # we save old value, users cannot change type
            repo_type=repo_type,

            repo_name=updates['repo_name'],
            repo_owner=updates['user'],
            repo_description=updates['repo_description'],
            repo_clone_uri=updates['clone_uri'],
            repo_push_uri=updates['push_uri'],
            repo_fork_of=updates['fork_id'],
            repo_private=updates['repo_private'],
            repo_landing_commit_ref=updates['repo_landing_rev'],
            repo_enable_statistics=updates['repo_enable_statistics'],
            repo_enable_downloads=updates['repo_enable_downloads'],
            repo_enable_locking=updates['repo_enable_locking']))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    # save validated data back into the updates dict
    validated_updates = dict(
        repo_name=schema_data['repo_group']['repo_name_without_group'],
        repo_group=schema_data['repo_group']['repo_group_id'],

        user=schema_data['repo_owner'],
        repo_description=schema_data['repo_description'],
        repo_private=schema_data['repo_private'],
        clone_uri=schema_data['repo_clone_uri'],
        push_uri=schema_data['repo_push_uri'],
        repo_landing_rev=schema_data['repo_landing_commit_ref'],
        repo_enable_statistics=schema_data['repo_enable_statistics'],
        repo_enable_locking=schema_data['repo_enable_locking'],
        repo_enable_downloads=schema_data['repo_enable_downloads'],
    )

    if schema_data['repo_fork_of']:
        fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
        validated_updates['fork_id'] = fork_repo.repo_id

    # extra fields
    fields = parse_args(Optional.extract(fields), key_prefix='ex_')
    if fields:
        validated_updates.update(fields)

    try:
        RepoModel().update(repo, **validated_updates)
        audit_logger.store_api(
            'repo.edit', action_data={'old_data': old_values},
            user=apiuser, repo=repo)
        Session().commit()
        return {
            'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
            'repository': repo.get_api_data(include_secrets=include_secrets)
        }
    except Exception:
        log.exception(
            u"Exception while trying to update the repository %s",
            repoid)
        raise JSONRPCError('failed to update repo `%s`' % repoid)


@jsonrpc_method()
def fork_repo(request, apiuser, repoid, fork_name,
              owner=Optional(OAttr('apiuser')),
              description=Optional(''),
              private=Optional(False),
              clone_uri=Optional(None),
              landing_rev=Optional(None),
              copy_permissions=Optional(False)):
    """
    Creates a fork of the specified |repo|.

    * If the fork_name contains "/", the fork will be created inside
      a repository group or nested repository groups.

      For example "foo/bar/fork-repo" will create a fork called "fork-repo"
      inside group "foo/bar". You have to have permissions to access and
      write to the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least
    read permissions on the forked repo, plus fork permissions for the user.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set repository name or repository ID.
    :type repoid: str or int
    :param fork_name: Set the fork name, including its repository group membership.
    :type fork_name: str
    :param owner: Set the fork owner.
    :type owner: str
    :param description: Set the fork description.
    :type description: str
    :param copy_permissions: Copy permissions from parent |repo|. The
        default is False.
    :type copy_permissions: bool
    :param private: Make the fork private. The default is False.
    :type private: bool
    :param landing_rev: Set the landing revision, e.g. branch:default, book:dev, rev:abcd

    Example input:

    .. code-block:: bash

      id : <id_for_response>
      api_key : "<api_key>"
      args: {
        "repoid" : "<reponame or repo_id>",
        "fork_name": "<forkname>",
        "owner": "<username or user_id = Optional(=apiuser)>",
        "description": "<description>",
        "copy_permissions": "<bool>",
        "private": "<bool>",
        "landing_rev": "<landing_rev>"
      }

    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result: {
        "msg": "Created fork of `<reponame>` as `<forkname>`",
        "success": true,
        "task": "<celery task id or None if done sync>"
      }
      error: null

    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name

    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for
        # this repo that we fork !
        _perms = ('repository.admin', 'repository.write', 'repository.read')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

        # check if the regular user has at least fork permissions as well
        if not HasPermissionAnyApi(PermissionModel.FORKING_ENABLED)(user=apiuser):
            raise JSONRPCForbidden()

    # check if user can set owner parameter
    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))
    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    private = Optional.extract(private)

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo.repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=fork_name,
            repo_type=repo.repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'fork_parent_id': repo.repo_id,

            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'description': schema_data['repo_description'],
            'private': schema_data['repo_private'],
            'copy_permissions': schema_data['repo_copy_permissions'],
            'landing_rev': schema_data['repo_landing_commit_ref'],
        }

        task = RepoModel().create_fork(data, cur_user=owner.user_id)
        # no commit, it's done in RepoModel, or async via celery
        task_id = get_task_id(task)

        return {
            'msg': 'Created fork of `%s` as `%s`' % (
                repo.repo_name, schema_data['repo_name']),
            'success': True,  # cannot return the repo data here since fork
                              # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create fork %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to fork repository `%s` as `%s`' % (
                repo_name, schema_data['repo_name']))


@jsonrpc_method()
def delete_repo(request, apiuser, repoid, forks=Optional('')):
    """
    Deletes a repository.

    * When the `forks` parameter is set, it is possible to detach or delete
      forks of the deleted repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param forks: Set to `detach` or `delete` forks from the |repo|.
    :type forks: Optional(str)

    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result: {
        "msg": "Deleted repository `<reponame>`",
        "success": true
      }
      error: null
    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        handle_forks = Optional.extract(forks)
        _forks_msg = ''
        _forks = [f for f in repo.forks]
        if handle_forks == 'detach':
            _forks_msg = ' ' + 'Detached %s forks' % len(_forks)
        elif handle_forks == 'delete':
            _forks_msg = ' ' + 'Deleted %s forks' % len(_forks)
        elif _forks:
            raise JSONRPCError(
                'Cannot delete `%s` it still contains attached forks' %
                (repo.repo_name,)
            )
        old_data = repo.get_api_data()
        RepoModel().delete(repo, forks=forks)

        repo = audit_logger.RepoWrap(repo_id=None,
                                     repo_name=repo.repo_name)

        audit_logger.store_api(
            'repo.delete', action_data={'old_data': old_data},
            user=apiuser, repo=repo)

        ScmModel().mark_for_invalidation(repo_name, delete=True)
        Session().commit()
        return {
            'msg': 'Deleted repository `%s`%s' % (repo_name, _forks_msg),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to delete repo")
        raise JSONRPCError(
            'failed to delete repository `%s`' % (repo_name,)
        )


# TODO: marcink, change name?
@jsonrpc_method()
def invalidate_cache(request, apiuser, repoid, delete_keys=Optional(False)):
    """
    Invalidates the cache for the specified repository.

    This command can only be run using an |authtoken| with at least
    write rights to the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param delete_keys: This deletes the invalidated keys instead of
        just flagging them.
    :type delete_keys: Optional(``True`` | ``False``)

    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : {
        'msg': 'Cache for repository `<repository name>` was invalidated',
        'repository': '<repository name>'
      }
      error : null

    Example error output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : null
      error : {
         'Error occurred during cache invalidation action'
      }

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    delete = Optional.extract(delete_keys)
    try:
        ScmModel().mark_for_invalidation(repo.repo_name, delete=delete)
        return {
            'msg': 'Cache for repository `%s` was invalidated' % (repoid,),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to invalidate repo cache")
        raise JSONRPCError(
            'Error occurred during cache invalidation action'
        )


# TODO: marcink, change name?
@jsonrpc_method()
def lock(request, apiuser, repoid, locked=Optional(None),
         userid=Optional(OAttr('apiuser'))):
    """
    Sets the lock state of the specified |repo| by the given user.
    For more information, see :ref:`repo-locking`.

    * If the ``userid`` option is not set, the repository is locked to the
      user who called the method.
    * If the ``locked`` parameter is not set, the current lock state of the
      repository is displayed.

    This command can only be run using an |authtoken| with at least
    write rights to the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param locked: Sets the lock state.
    :type locked: Optional(``True`` | ``False``)
    :param userid: Set the repository lock to this user.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : {
        'repo': '<reponame>',
        'locked': <bool: lock state>,
        'locked_since': <int: lock timestamp>,
        'locked_by': <username of person who made the lock>,
        'lock_reason': <str: reason for locking>,
        'lock_state_changed': <bool: True if lock state has been changed in this request>,
        'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
        or
        'msg': 'Repo `<repository name>` not locked.'
        or
        'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
      }
      error : null

    Example error output:

    .. code-block:: bash

      id : <id_given_in_input>
      result : null
      error : {
         'Error occurred locking repository `<reponame>`'
      }
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least write permission for this repo !
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    # make sure normal user does not pass someone else userid,
    # he is not allowed to do that
    if not isinstance(userid, Optional) and userid != apiuser.user_id:
        raise JSONRPCError('userid is not the same as your user')

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)

    if isinstance(locked, Optional):
        lockobj = repo.locked

        if lockobj[0] is None:
            _d = {
                'repo': repo.repo_name,
                'locked': False,
                'locked_since': None,
                'locked_by': None,
                'lock_reason': None,
                'lock_state_changed': False,
                'msg': 'Repo `%s` not locked.' % repo.repo_name
            }
            return _d
        else:
            _user_id, _time, _reason = lockobj
            # resolve the user that actually holds the lock
            lock_user = get_user_or_error(_user_id)
            _d = {
                'repo': repo.repo_name,
                'locked': True,
                'locked_since': _time,
                'locked_by': lock_user.username,
                'lock_reason': _reason,
                'lock_state_changed': False,
                'msg': ('Repo `%s` locked by `%s` on `%s`.'
                        % (repo.repo_name, lock_user.username,
                           json.dumps(time_to_datetime(_time))))
            }
            return _d

    # force locked state through a flag
    else:
        locked = str2bool(locked)
        lock_reason = Repository.LOCK_API
        try:
            if locked:
                lock_time = time.time()
                Repository.lock(repo, user.user_id, lock_time, lock_reason)
            else:
                lock_time = None
                Repository.unlock(repo)
            _d = {
                'repo': repo.repo_name,
                'locked': locked,
                'locked_since': lock_time,
                'locked_by': user.username,
                'lock_reason': lock_reason,
                'lock_state_changed': True,
                'msg': ('User `%s` set lock state for repo `%s` to `%s`'
                        % (user.username, repo.repo_name, locked))
            }
            return _d
        except Exception:
            log.exception(
                "Exception occurred while trying to lock repository")
            raise JSONRPCError(
                'Error occurred locking repository `%s`' % repo.repo_name
            )


@jsonrpc_method()
def comment_commit(
        request, apiuser, repoid, commit_id, message, status=Optional(None),
        comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
        userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
    """
    Set a commit comment, and optionally change the status of the commit.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Specify the commit_id for which to set a comment.
    :type commit_id: str
    :param message: The comment text.
    :type message: str
    :param status: (**Optional**) status of commit, one of: 'not_reviewed',
        'approved', 'rejected', 'under_review'
    :type status: str
    :param comment_type: Comment type, one of: 'note', 'todo'
    :type comment_type: Optional(str), default: 'note'
    :param resolves_comment_id: id of comment which this one will resolve
    :type resolves_comment_id: Optional(int)
    :param extra_recipients: list of user ids or usernames to add
        notifications for this comment. Acts like a CC for notification
    :type extra_recipients: Optional(list)
    :param userid: Set the user name of the comment creator.
    :type userid: Optional(str or int)
    :param send_email: Define if this comment should also send email notification
    :type send_email: Optional(bool)

    Example output:

    .. code-block:: bash

      {
        "id" : <id_given_in_input>,
        "result" : {
          "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
          "status_change": null or <status>,
          "success": true
        },
        "error" : null
      }

    """
1601 1601 _ = request.translate
1602 1602
1603 1603 repo = get_repo_or_error(repoid)
1604 1604 if not has_superadmin_permission(apiuser):
1605 1605 _perms = ('repository.read', 'repository.write', 'repository.admin')
1606 1606 validate_repo_permissions(apiuser, repoid, repo, _perms)
1607 1607 db_repo_name = repo.repo_name
1608 1608
1609 1609 try:
1610 1610 commit = repo.scm_instance().get_commit(commit_id=commit_id)
1611 1611 commit_id = commit.raw_id
1612 1612 except Exception as e:
1613 1613 log.exception('Failed to fetch commit')
1614 1614 raise JSONRPCError(safe_str(e))
1615 1615
1616 1616 if isinstance(userid, Optional):
1617 1617 userid = apiuser.user_id
1618 1618
1619 1619 user = get_user_or_error(userid)
1620 1620 status = Optional.extract(status)
1621 1621 comment_type = Optional.extract(comment_type)
1622 1622 resolves_comment_id = Optional.extract(resolves_comment_id)
1623 1623 extra_recipients = Optional.extract(extra_recipients)
1624 1624 send_email = Optional.extract(send_email, binary=True)
1625 1625
1626 1626 allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
1627 1627 if status and status not in allowed_statuses:
1628 1628 raise JSONRPCError('Bad status, must be one '
1629 1629 'of %s got %s' % (allowed_statuses, status,))
1630 1630
1631 1631 if resolves_comment_id:
1632 1632 comment = ChangesetComment.get(resolves_comment_id)
1633 1633 if not comment:
1634 1634 raise JSONRPCError(
1635 1635 'Invalid resolves_comment_id `%s` for this commit.'
1636 1636 % resolves_comment_id)
1637 1637 if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
1638 1638 raise JSONRPCError(
1639 1639 'Comment `%s` is wrong type for setting status to resolved.'
1640 1640 % resolves_comment_id)
1641 1641
1642 1642 try:
1643 1643 rc_config = SettingsModel().get_all_settings()
1644 1644 renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
1645 1645 status_change_label = ChangesetStatus.get_status_lbl(status)
1646 1646 comment = CommentsModel().create(
1647 1647 message, repo, user, commit_id=commit_id,
1648 1648 status_change=status_change_label,
1649 1649 status_change_type=status,
1650 1650 renderer=renderer,
1651 1651 comment_type=comment_type,
1652 1652 resolves_comment_id=resolves_comment_id,
1653 1653 auth_user=apiuser,
1654 1654 extra_recipients=extra_recipients,
1655 1655 send_email=send_email
1656 1656 )
1657 1657 is_inline = comment.is_inline
1658 1658
1659 1659 if status:
1660 1660 # also do a status change
1661 1661 try:
1662 1662 ChangesetStatusModel().set_status(
1663 1663 repo, status, user, comment, revision=commit_id,
1664 1664 dont_allow_on_closed_pull_request=True
1665 1665 )
1666 1666 except StatusChangeOnClosedPullRequestError:
1667 1667 log.exception(
1668 1668 "Exception occurred while trying to change repo commit status")
1669 1669 msg = ('Changing status on a commit associated with '
1670 1670 'a closed pull request is not allowed')
1671 1671 raise JSONRPCError(msg)
1672 1672
1673 1673 CommentsModel().trigger_commit_comment_hook(
1674 1674 repo, apiuser, 'create',
1675 1675 data={'comment': comment, 'commit': commit})
1676 1676
1677 1677 Session().commit()
1678 1678
1679 1679 comment_broadcast_channel = channelstream.comment_channel(
1680 1680 db_repo_name, commit_obj=commit)
1681 1681
1682 1682 comment_data = {'comment': comment, 'comment_id': comment.comment_id}
1683 1683 comment_type = 'inline' if is_inline else 'general'
1684 1684 channelstream.comment_channelstream_push(
1685 1685 request, comment_broadcast_channel, apiuser,
1686 1686 _('posted a new {} comment').format(comment_type),
1687 1687 comment_data=comment_data)
1688 1688
1689 1689 return {
1690 1690 'msg': (
1691 1691 'Commented on commit `%s` for repository `%s`' % (
1692 1692 comment.revision, repo.repo_name)),
1693 1693 'status_change': status,
1694 1694 'success': True,
1695 1695 }
1696 1696 except JSONRPCError:
1697 1697 # re-raise internal JSONRPC errors so the global
1698 1698 # exception handler below does not silence them
1699 1699 raise
1700 1700 except Exception:
1701 1701 log.exception("Exception occurred while trying to comment on commit")
1702 1702 raise JSONRPCError(
1703 1703 'failed to set comment on repository `%s`' % (repo.repo_name,)
1704 1704 )
1705 1705
1706 1706
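The `comment_commit` method above is invoked over RhodeCode's JSON-RPC endpoint. The sketch below only builds the request body; the token, repo name, and commit id are placeholder assumptions, and the wire shape (`id`/`auth_token`/`method`/`args`) follows the documented API convention rather than anything in this file.

```python
import json

# Minimal sketch of a `comment_commit` JSON-RPC request body.
# 'SECRET_TOKEN', 'my-repo' and the commit id are placeholders (assumptions).
def build_comment_commit_payload(auth_token, repoid, commit_id, message,
                                 status=None, comment_type='note'):
    args = {'repoid': repoid, 'commit_id': commit_id,
            'message': message, 'comment_type': comment_type}
    if status is not None:
        # Must be one of the allowed statuses checked by the method above,
        # e.g. 'approved', 'rejected', 'under_review', 'not_reviewed'.
        args['status'] = status
    return {'id': 1, 'auth_token': auth_token,
            'method': 'comment_commit', 'args': args}

payload = build_comment_commit_payload(
    'SECRET_TOKEN', 'my-repo', 'deadbeef' * 5, 'Looks good', status='approved')
print(json.dumps(payload, sort_keys=True))
```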
1707 1707 @jsonrpc_method()
1708 1708 def get_repo_comments(request, apiuser, repoid,
1709 1709 commit_id=Optional(None), comment_type=Optional(None),
1710 1710 userid=Optional(None)):
1711 1711 """
1712 1712 Get all comments for a repository
1713 1713
1714 1714 :param apiuser: This is filled automatically from the |authtoken|.
1715 1715 :type apiuser: AuthUser
1716 1716 :param repoid: Set the repository name or repository ID.
1717 1717 :type repoid: str or int
1718 1718 :param commit_id: Optionally filter the comments by the commit_id
1719 1719 :type commit_id: Optional(str), default: None
1720 1720 :param comment_type: Optionally filter the comments by the comment_type
1721 1721 one of: 'note', 'todo'
1722 1722 :type comment_type: Optional(str), default: None
1723 1723 :param userid: Optionally filter the comments by the comment author
1724 1724 :type userid: Optional(str or int), Default: None
1725 1725
1726 1726 Example error output:
1727 1727
1728 1728 .. code-block:: bash
1729 1729
1730 1730 {
1731 1731 "id" : <id_given_in_input>,
1732 1732 "result" : [
1733 1733 {
1734 1734 "comment_author": <USER_DETAILS>,
1735 1735 "comment_created_on": "2017-02-01T14:38:16.309",
1736 1736 "comment_f_path": "file.txt",
1737 1737 "comment_id": 282,
1738 1738 "comment_lineno": "n1",
1739 1739 "comment_resolved_by": null,
1740 1740 "comment_status": [],
1741 1741 "comment_text": "This file needs a header",
1742 1742 "comment_type": "todo",
1743 1743 "comment_last_version: 0
1744 1744 }
1745 1745 ],
1746 1746 "error" : null
1747 1747 }
1748 1748
1749 1749 """
1750 1750 repo = get_repo_or_error(repoid)
1751 1751 if not has_superadmin_permission(apiuser):
1752 1752 _perms = ('repository.read', 'repository.write', 'repository.admin')
1753 1753 validate_repo_permissions(apiuser, repoid, repo, _perms)
1754 1754
1755 1755 commit_id = Optional.extract(commit_id)
1756 1756
1757 1757 userid = Optional.extract(userid)
1758 1758 if userid:
1759 1759 user = get_user_or_error(userid)
1760 1760 else:
1761 1761 user = None
1762 1762
1763 1763 comment_type = Optional.extract(comment_type)
1764 1764 if comment_type and comment_type not in ChangesetComment.COMMENT_TYPES:
1765 1765 raise JSONRPCError(
1766 1766 'comment_type must be one of `{}` got {}'.format(
1767 1767 ChangesetComment.COMMENT_TYPES, comment_type)
1768 1768 )
1769 1769
1770 1770 comments = CommentsModel().get_repository_comments(
1771 1771 repo=repo, comment_type=comment_type, user=user, commit_id=commit_id)
1772 1772 return comments
1773 1773
1774 1774
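Like the other methods in this module, `get_repo_comments` is called via JSON-RPC. A hedged sketch of building such a request, including the optional filters only when they are set (mirroring the `Optional(None)` defaults in the signature above); the token and repo name are placeholders:

```python
import json

def build_get_repo_comments_payload(auth_token, repoid, commit_id=None,
                                    comment_type=None, userid=None):
    # Include only the filters that are actually set, matching the
    # Optional(None) defaults in the method signature above.
    args = {'repoid': repoid}
    for key, value in (('commit_id', commit_id),
                       ('comment_type', comment_type),
                       ('userid', userid)):
        if value is not None:
            args[key] = value
    return {'id': 1, 'auth_token': auth_token,
            'method': 'get_repo_comments', 'args': args}

payload = build_get_repo_comments_payload('TOKEN', 'my-repo',
                                          comment_type='todo')
print(json.dumps(payload))
```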
1775 1775 @jsonrpc_method()
1776 1776 def get_comment(request, apiuser, comment_id):
1777 1777 """
1778 1778 Get single comment from repository or pull_request
1779 1779
1780 1780 :param apiuser: This is filled automatically from the |authtoken|.
1781 1781 :type apiuser: AuthUser
1782 1782 :param comment_id: comment id found in the URL of comment
1783 1783 :type comment_id: str or int
1784 1784
1785 1785 Example error output:
1786 1786
1787 1787 .. code-block:: bash
1788 1788
1789 1789 {
1790 1790 "id" : <id_given_in_input>,
1791 1791 "result" : {
1792 1792 "comment_author": <USER_DETAILS>,
1793 1793 "comment_created_on": "2017-02-01T14:38:16.309",
1794 1794 "comment_f_path": "file.txt",
1795 1795 "comment_id": 282,
1796 1796 "comment_lineno": "n1",
1797 1797 "comment_resolved_by": null,
1798 1798 "comment_status": [],
1799 1799 "comment_text": "This file needs a header",
1800 1800 "comment_type": "todo",
1801 1801 "comment_last_version: 0
1802 1802 },
1803 1803 "error" : null
1804 1804 }
1805 1805
1806 1806 """
1807 1807
1808 1808 comment = ChangesetComment.get(comment_id)
1809 1809 if not comment:
1810 1810 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1811 1811
1812 1812 perms = ('repository.read', 'repository.write', 'repository.admin')
1813 1813 has_comment_perm = HasRepoPermissionAnyApi(*perms)\
1814 1814 (user=apiuser, repo_name=comment.repo.repo_name)
1815 1815
1816 1816 if not has_comment_perm:
1817 1817 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1818 1818
1819 1819 return comment
1820 1820
1821 1821
1822 1822 @jsonrpc_method()
1823 1823 def edit_comment(request, apiuser, message, comment_id, version,
1824 1824 userid=Optional(OAttr('apiuser'))):
1825 1825 """
1826 1826 Edit a comment on a pull request or commit, specified by the
1827 1827 `comment_id` and `version`. The initial version of a comment is 0.
1828 1828
1829 1829 :param apiuser: This is filled automatically from the |authtoken|.
1830 1830 :type apiuser: AuthUser
1831 1831 :param comment_id: Specify the comment_id for editing
1832 1832 :type comment_id: int
1833 1833 :param version: version of the comment that will be created, starts from 0
1834 1834 :type version: int
1835 1835 :param message: The text content of the comment.
1836 1836 :type message: str
1837 1837 :param userid: Edit the comment as this user
1838 1838 :type userid: Optional(str or int)
1839 1839
1840 1840 Example output:
1841 1841
1842 1842 .. code-block:: bash
1843 1843
1844 1844 id : <id_given_in_input>
1845 1845 result : {
1846 1846 "comment": "<comment data>",
1847 1847 "version": "<Integer>",
1848 1848 },
1849 1849 error : null
1850 1850 """
1851 1851
1852 1852 auth_user = apiuser
1853 1853 comment = ChangesetComment.get(comment_id)
1854 1854 if not comment:
1855 1855 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1856 1856
1857 1857 is_super_admin = has_superadmin_permission(apiuser)
1858 1858 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1859 1859 (user=apiuser, repo_name=comment.repo.repo_name)
1860 1860
1861 1861 if not isinstance(userid, Optional):
1862 1862 if is_super_admin or is_repo_admin:
1863 1863 apiuser = get_user_or_error(userid)
1864 1864 auth_user = apiuser.AuthUser()
1865 1865 else:
1866 1866 raise JSONRPCError('userid is not the same as your user')
1867 1867
1868 1868 comment_author = comment.author.user_id == auth_user.user_id
1869 1869 if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1870 1870 raise JSONRPCError("you don't have access to edit this comment")
1871 1871
1872 1872 try:
1873 1873 comment_history = CommentsModel().edit(
1874 1874 comment_id=comment_id,
1875 1875 text=message,
1876 1876 auth_user=auth_user,
1877 1877 version=version,
1878 1878 )
1879 1879 Session().commit()
1880 1880 except CommentVersionMismatch:
1881 1881 raise JSONRPCError(
1882 1882 'comment ({}) version ({}) mismatch'.format(comment_id, version)
1883 1883 )
1884 1884 if not comment_history and not message:
1885 1885 raise JSONRPCError(
1886 1886 "comment ({}) can't be changed with empty string".format(comment_id)
1887 1887 )
1888 1888
1889 1889 if comment.pull_request:
1890 1890 pull_request = comment.pull_request
1891 1891 PullRequestModel().trigger_pull_request_hook(
1892 1892 pull_request, apiuser, 'comment_edit',
1893 1893 data={'comment': comment})
1894 1894 else:
1895 1895 db_repo = comment.repo
1896 1896 commit_id = comment.revision
1897 1897 commit = db_repo.get_commit(commit_id)
1898 1898 CommentsModel().trigger_commit_comment_hook(
1899 1899 db_repo, apiuser, 'edit',
1900 1900 data={'comment': comment, 'commit': commit})
1901 1901
1902 1902 data = {
1903 1903 'comment': comment,
1904 1904 'version': comment_history.version if comment_history else None,
1905 1905 }
1906 1906 return data
1907 1907
1908 1908
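The `version` argument of `edit_comment` implements optimistic locking: every edit must quote the comment's current version, and a stale version raises `CommentVersionMismatch`. A toy in-memory sketch of that check (an illustration only, not RhodeCode's actual `CommentsModel`):

```python
class CommentVersionMismatch(Exception):
    """Raised when an edit quotes a stale comment version."""

class Comment:
    def __init__(self, text):
        self.text = text
        self.version = 0  # initial version, as the docstring above notes

    def edit(self, text, version):
        # Reject edits based on an outdated version (optimistic locking).
        if version != self.version:
            raise CommentVersionMismatch(
                'comment version ({}) mismatch'.format(version))
        self.text = text
        self.version += 1
        return self.version

c = Comment('first draft')
new_version = c.edit('final text', version=0)
print(new_version)  # → 1
```

A second edit that still quotes version 0 would now fail, which is exactly how concurrent edits are detected.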
1909 1909 # TODO(marcink): write this with all required logic for deleting comments in PRs or commits
1910 1910 # @jsonrpc_method()
1911 1911 # def delete_comment(request, apiuser, comment_id):
1912 1912 # auth_user = apiuser
1913 1913 #
1914 1914 # comment = ChangesetComment.get(comment_id)
1915 1915 # if not comment:
1916 1916 # raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1917 1917 #
1918 1918 # is_super_admin = has_superadmin_permission(apiuser)
1919 1919 # is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1920 1920 # (user=apiuser, repo_name=comment.repo.repo_name)
1921 1921 #
1922 1922 # comment_author = comment.author.user_id == auth_user.user_id
1923 1923 # if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1924 1924 # raise JSONRPCError("you don't have access to edit this comment")
1925 1925
1926 1926 @jsonrpc_method()
1927 1927 def grant_user_permission(request, apiuser, repoid, userid, perm):
1928 1928 """
1929 1929 Grant permissions for the specified user on the given repository,
1930 1930 or update existing permissions if found.
1931 1931
1932 1932 This command can only be run using an |authtoken| with admin
1933 1933 permissions on the |repo|.
1934 1934
1935 1935 :param apiuser: This is filled automatically from the |authtoken|.
1936 1936 :type apiuser: AuthUser
1937 1937 :param repoid: Set the repository name or repository ID.
1938 1938 :type repoid: str or int
1939 1939 :param userid: Set the user name.
1940 1940 :type userid: str
1941 1941 :param perm: Set the user permissions, using the following format
1942 1942 ``(repository.(none|read|write|admin))``
1943 1943 :type perm: str
1944 1944
1945 1945 Example output:
1946 1946
1947 1947 .. code-block:: bash
1948 1948
1949 1949 id : <id_given_in_input>
1950 1950 result: {
1951 1951 "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
1952 1952 "success": true
1953 1953 }
1954 1954 error: null
1955 1955 """
1956 1956
1957 1957 repo = get_repo_or_error(repoid)
1958 1958 user = get_user_or_error(userid)
1959 1959 perm = get_perm_or_error(perm)
1960 1960 if not has_superadmin_permission(apiuser):
1961 1961 _perms = ('repository.admin',)
1962 1962 validate_repo_permissions(apiuser, repoid, repo, _perms)
1963 1963
1964 1964 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1965 1965 try:
1966 1966 changes = RepoModel().update_permissions(
1967 1967 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1968 1968
1969 1969 action_data = {
1970 1970 'added': changes['added'],
1971 1971 'updated': changes['updated'],
1972 1972 'deleted': changes['deleted'],
1973 1973 }
1974 1974 audit_logger.store_api(
1975 1975 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1976 1976 Session().commit()
1977 1977 PermissionModel().flush_user_permission_caches(changes)
1978 1978
1979 1979 return {
1980 1980 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1981 1981 perm.permission_name, user.username, repo.repo_name
1982 1982 ),
1983 1983 'success': True
1984 1984 }
1985 1985 except Exception:
1986 1986 log.exception("Exception occurred while trying to edit permissions for repo")
1987 1987 raise JSONRPCError(
1988 1988 'failed to edit permission for user: `%s` in repo: `%s`' % (
1989 1989 userid, repoid
1990 1990 )
1991 1991 )
1992 1992
1993 1993
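Since `grant_user_permission` only accepts the `repository.(none|read|write|admin)` names from its docstring, a caller can validate the `perm` string before sending the request. A hedged sketch; the permission names come from the docstring above, while the token and identifiers are placeholder assumptions:

```python
# Valid values per the `(repository.(none|read|write|admin))` format above.
VALID_REPO_PERMS = {'repository.none', 'repository.read',
                    'repository.write', 'repository.admin'}

def build_grant_payload(auth_token, repoid, userid, perm):
    # Client-side guard mirroring the server's get_perm_or_error() check.
    if perm not in VALID_REPO_PERMS:
        raise ValueError('perm must be one of %s, got %r'
                         % (sorted(VALID_REPO_PERMS), perm))
    return {'id': 1, 'auth_token': auth_token,
            'method': 'grant_user_permission',
            'args': {'repoid': repoid, 'userid': userid, 'perm': perm}}

payload = build_grant_payload('TOKEN', 'my-repo', 'bob', 'repository.write')
print(payload['args']['perm'])  # → repository.write
```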
1994 1994 @jsonrpc_method()
1995 1995 def revoke_user_permission(request, apiuser, repoid, userid):
1996 1996 """
1997 1997 Revoke permission for a user on the specified repository.
1998 1998
1999 1999 This command can only be run using an |authtoken| with admin
2000 2000 permissions on the |repo|.
2001 2001
2002 2002 :param apiuser: This is filled automatically from the |authtoken|.
2003 2003 :type apiuser: AuthUser
2004 2004 :param repoid: Set the repository name or repository ID.
2005 2005 :type repoid: str or int
2006 2006 :param userid: Set the user name of revoked user.
2007 2007 :type userid: str or int
2008 2008
2009 2009 Example error output:
2010 2010
2011 2011 .. code-block:: bash
2012 2012
2013 2013 id : <id_given_in_input>
2014 2014 result: {
2015 2015 "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
2016 2016 "success": true
2017 2017 }
2018 2018 error: null
2019 2019 """
2020 2020
2021 2021 repo = get_repo_or_error(repoid)
2022 2022 user = get_user_or_error(userid)
2023 2023 if not has_superadmin_permission(apiuser):
2024 2024 _perms = ('repository.admin',)
2025 2025 validate_repo_permissions(apiuser, repoid, repo, _perms)
2026 2026
2027 2027 perm_deletions = [[user.user_id, None, "user"]]
2028 2028 try:
2029 2029 changes = RepoModel().update_permissions(
2030 2030 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2031 2031
2032 2032 action_data = {
2033 2033 'added': changes['added'],
2034 2034 'updated': changes['updated'],
2035 2035 'deleted': changes['deleted'],
2036 2036 }
2037 2037 audit_logger.store_api(
2038 2038 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2039 2039 Session().commit()
2040 2040 PermissionModel().flush_user_permission_caches(changes)
2041 2041
2042 2042 return {
2043 2043 'msg': 'Revoked perm for user: `%s` in repo: `%s`' % (
2044 2044 user.username, repo.repo_name
2045 2045 ),
2046 2046 'success': True
2047 2047 }
2048 2048 except Exception:
2049 2049 log.exception("Exception occurred while trying to revoke permissions on repo")
2050 2050 raise JSONRPCError(
2051 2051 'failed to edit permission for user: `%s` in repo: `%s`' % (
2052 2052 userid, repoid
2053 2053 )
2054 2054 )
2055 2055
2056 2056
2057 2057 @jsonrpc_method()
2058 2058 def grant_user_group_permission(request, apiuser, repoid, usergroupid, perm):
2059 2059 """
2060 2060 Grant permission for a user group on the specified repository,
2061 2061 or update existing permissions.
2062 2062
2063 2063 This command can only be run using an |authtoken| with admin
2064 2064 permissions on the |repo|.
2065 2065
2066 2066 :param apiuser: This is filled automatically from the |authtoken|.
2067 2067 :type apiuser: AuthUser
2068 2068 :param repoid: Set the repository name or repository ID.
2069 2069 :type repoid: str or int
2070 2070 :param usergroupid: Specify the ID of the user group.
2071 2071 :type usergroupid: str or int
2072 2072 :param perm: Set the user group permissions using the following
2073 2073 format: (repository.(none|read|write|admin))
2074 2074 :type perm: str
2075 2075
2076 2076 Example output:
2077 2077
2078 2078 .. code-block:: bash
2079 2079
2080 2080 id : <id_given_in_input>
2081 2081 result : {
2082 2082 "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
2083 2083 "success": true
2084 2084
2085 2085 }
2086 2086 error : null
2087 2087
2088 2088 Example error output:
2089 2089
2090 2090 .. code-block:: bash
2091 2091
2092 2092 id : <id_given_in_input>
2093 2093 result : null
2094 2094 error : {
2095 2095 "failed to edit permission for user group: `<usergroup>` in repo `<repo>`'
2096 2096 }
2097 2097
2098 2098 """
2099 2099
2100 2100 repo = get_repo_or_error(repoid)
2101 2101 perm = get_perm_or_error(perm)
2102 2102 if not has_superadmin_permission(apiuser):
2103 2103 _perms = ('repository.admin',)
2104 2104 validate_repo_permissions(apiuser, repoid, repo, _perms)
2105 2105
2106 2106 user_group = get_user_group_or_error(usergroupid)
2107 2107 if not has_superadmin_permission(apiuser):
2108 2108 # check if we have at least read permission for this user group !
2109 2109 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2110 2110 if not HasUserGroupPermissionAnyApi(*_perms)(
2111 2111 user=apiuser, user_group_name=user_group.users_group_name):
2112 2112 raise JSONRPCError(
2113 2113 'user group `%s` does not exist' % (usergroupid,))
2114 2114
2115 2115 perm_additions = [[user_group.users_group_id, perm.permission_name, "user_group"]]
2116 2116 try:
2117 2117 changes = RepoModel().update_permissions(
2118 2118 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
2119 2119 action_data = {
2120 2120 'added': changes['added'],
2121 2121 'updated': changes['updated'],
2122 2122 'deleted': changes['deleted'],
2123 2123 }
2124 2124 audit_logger.store_api(
2125 2125 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2126 2126 Session().commit()
2127 2127 PermissionModel().flush_user_permission_caches(changes)
2128 2128
2129 2129 return {
2130 2130 'msg': 'Granted perm: `%s` for user group: `%s` in '
2131 2131 'repo: `%s`' % (
2132 2132 perm.permission_name, user_group.users_group_name,
2133 2133 repo.repo_name
2134 2134 ),
2135 2135 'success': True
2136 2136 }
2137 2137 except Exception:
2138 2138 log.exception(
2139 2139 "Exception occurred while trying change permission on repo")
2140 2140 raise JSONRPCError(
2141 2141 'failed to edit permission for user group: `%s` in '
2142 2142 'repo: `%s`' % (
2143 2143 usergroupid, repo.repo_name
2144 2144 )
2145 2145 )
2146 2146
2147 2147
2148 2148 @jsonrpc_method()
2149 2149 def revoke_user_group_permission(request, apiuser, repoid, usergroupid):
2150 2150 """
2151 2151 Revoke the permissions of a user group on a given repository.
2152 2152
2153 2153 This command can only be run using an |authtoken| with admin
2154 2154 permissions on the |repo|.
2155 2155
2156 2156 :param apiuser: This is filled automatically from the |authtoken|.
2157 2157 :type apiuser: AuthUser
2158 2158 :param repoid: Set the repository name or repository ID.
2159 2159 :type repoid: str or int
2160 2160 :param usergroupid: Specify the user group ID.
2161 2161 :type usergroupid: str or int
2162 2162
2163 2163 Example output:
2164 2164
2165 2165 .. code-block:: bash
2166 2166
2167 2167 id : <id_given_in_input>
2168 2168 result: {
2169 2169 "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
2170 2170 "success": true
2171 2171 }
2172 2172 error: null
2173 2173 """
2174 2174
2175 2175 repo = get_repo_or_error(repoid)
2176 2176 if not has_superadmin_permission(apiuser):
2177 2177 _perms = ('repository.admin',)
2178 2178 validate_repo_permissions(apiuser, repoid, repo, _perms)
2179 2179
2180 2180 user_group = get_user_group_or_error(usergroupid)
2181 2181 if not has_superadmin_permission(apiuser):
2182 2182 # check if we have at least read permission for this user group !
2183 2183 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2184 2184 if not HasUserGroupPermissionAnyApi(*_perms)(
2185 2185 user=apiuser, user_group_name=user_group.users_group_name):
2186 2186 raise JSONRPCError(
2187 2187 'user group `%s` does not exist' % (usergroupid,))
2188 2188
2189 2189 perm_deletions = [[user_group.users_group_id, None, "user_group"]]
2190 2190 try:
2191 2191 changes = RepoModel().update_permissions(
2192 2192 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2193 2193 action_data = {
2194 2194 'added': changes['added'],
2195 2195 'updated': changes['updated'],
2196 2196 'deleted': changes['deleted'],
2197 2197 }
2198 2198 audit_logger.store_api(
2199 2199 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2200 2200 Session().commit()
2201 2201 PermissionModel().flush_user_permission_caches(changes)
2202 2202
2203 2203 return {
2204 2204 'msg': 'Revoked perm for user group: `%s` in repo: `%s`' % (
2205 2205 user_group.users_group_name, repo.repo_name
2206 2206 ),
2207 2207 'success': True
2208 2208 }
2209 2209 except Exception:
2210 2210 log.exception("Exception occurred while trying to revoke "
2211 2211 "user group permission on repo")
2212 2212 raise JSONRPCError(
2213 2213 'failed to edit permission for user group: `%s` in '
2214 2214 'repo: `%s`' % (
2215 2215 user_group.users_group_name, repo.repo_name
2216 2216 )
2217 2217 )
2218 2218
2219 2219
2220 2220 @jsonrpc_method()
2221 2221 def pull(request, apiuser, repoid, remote_uri=Optional(None)):
2222 2222 """
2223 2223 Triggers a pull on the given repository from a remote location. You
2224 2224 can use this to keep remote repositories up-to-date.
2225 2225
2226 2226 This command can only be run using an |authtoken| with admin
2227 2227 rights to the specified repository. For more information,
2228 2228 see :ref:`config-token-ref`.
2229 2229
2230 2230 This command takes the following options:
2231 2231
2232 2232 :param apiuser: This is filled automatically from the |authtoken|.
2233 2233 :type apiuser: AuthUser
2234 2234 :param repoid: The repository name or repository ID.
2235 2235 :type repoid: str or int
2236 2236 :param remote_uri: Optional remote URI to pass in for pull
2237 2237 :type remote_uri: str
2238 2238
2239 2239 Example output:
2240 2240
2241 2241 .. code-block:: bash
2242 2242
2243 2243 id : <id_given_in_input>
2244 2244 result : {
2245 2245 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`"
2246 2246 "repository": "<repository name>"
2247 2247 }
2248 2248 error : null
2249 2249
2250 2250 Example error output:
2251 2251
2252 2252 .. code-block:: bash
2253 2253
2254 2254 id : <id_given_in_input>
2255 2255 result : null
2256 2256 error : {
2257 2257 "Unable to push changes from `<remote_url>`"
2258 2258 }
2259 2259
2260 2260 """
2261 2261
2262 2262 repo = get_repo_or_error(repoid)
2263 2263 remote_uri = Optional.extract(remote_uri)
2264 2264 remote_uri_display = remote_uri or repo.clone_uri_hidden
2265 2265 if not has_superadmin_permission(apiuser):
2266 2266 _perms = ('repository.admin',)
2267 2267 validate_repo_permissions(apiuser, repoid, repo, _perms)
2268 2268
2269 2269 try:
2270 2270 ScmModel().pull_changes(
2271 2271 repo.repo_name, apiuser.username, remote_uri=remote_uri)
2272 2272 return {
2273 2273 'msg': 'Pulled from url `%s` on repo `%s`' % (
2274 2274 remote_uri_display, repo.repo_name),
2275 2275 'repository': repo.repo_name
2276 2276 }
2277 2277 except Exception:
2278 2278 log.exception("Exception occurred while trying to "
2279 2279 "pull changes from remote location")
2280 2280 raise JSONRPCError(
2281 2281 'Unable to pull changes from `%s`' % remote_uri_display
2282 2282 )
2283 2283
2284 2284
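`remote_uri` in `pull` is optional: when it is omitted, the server falls back to the repository's stored clone URI (`remote_uri or repo.clone_uri_hidden` in the method body above). A sketch of building the request either way; the token, repo name, and URL are placeholder assumptions:

```python
def build_pull_payload(auth_token, repoid, remote_uri=None):
    # Omit remote_uri to let the server use the repository's stored
    # clone URI instead.
    args = {'repoid': repoid}
    if remote_uri is not None:
        args['remote_uri'] = remote_uri
    return {'id': 1, 'auth_token': auth_token,
            'method': 'pull', 'args': args}

payload = build_pull_payload('TOKEN', 'my-repo',
                             remote_uri='https://example.com/upstream.git')
print(payload['args']['remote_uri'])
```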
2285 2285 @jsonrpc_method()
2286 2286 def strip(request, apiuser, repoid, revision, branch):
2287 2287 """
2288 2288 Strips the given revision from the specified repository.
2289 2289
2290 2290 * This will remove the revision and all of its descendants.
2291 2291
2292 2292 This command can only be run using an |authtoken| with admin rights to
2293 2293 the specified repository.
2294 2294
2295 2295 This command takes the following options:
2296 2296
2297 2297 :param apiuser: This is filled automatically from the |authtoken|.
2298 2298 :type apiuser: AuthUser
2299 2299 :param repoid: The repository name or repository ID.
2300 2300 :type repoid: str or int
2301 2301 :param revision: The revision you wish to strip.
2302 2302 :type revision: str
2303 2303 :param branch: The branch from which to strip the revision.
2304 2304 :type branch: str
2305 2305
2306 2306 Example output:
2307 2307
2308 2308 .. code-block:: bash
2309 2309
2310 2310 id : <id_given_in_input>
2311 2311 result : {
2312 2312 "msg": "'Stripped commit <commit_hash> from repo `<repository name>`'"
2313 2313 "repository": "<repository name>"
2314 2314 }
2315 2315 error : null
2316 2316
2317 2317 Example error output:
2318 2318
2319 2319 .. code-block:: bash
2320 2320
2321 2321 id : <id_given_in_input>
2322 2322 result : null
2323 2323 error : {
2324 2324 "Unable to strip commit <commit_hash> from repo `<repository name>`"
2325 2325 }
2326 2326
2327 2327 """
2328 2328
2329 2329 repo = get_repo_or_error(repoid)
2330 2330 if not has_superadmin_permission(apiuser):
2331 2331 _perms = ('repository.admin',)
2332 2332 validate_repo_permissions(apiuser, repoid, repo, _perms)
2333 2333
2334 2334 try:
2335 2335 ScmModel().strip(repo, revision, branch)
2336 2336 audit_logger.store_api(
2337 2337 'repo.commit.strip', action_data={'commit_id': revision},
2338 2338 repo=repo,
2339 2339 user=apiuser, commit=True)
2340 2340
2341 2341 return {
2342 2342 'msg': 'Stripped commit %s from repo `%s`' % (
2343 2343 revision, repo.repo_name),
2344 2344 'repository': repo.repo_name
2345 2345 }
2346 2346 except Exception:
2347 2347 log.exception("Exception while trying to strip")
2348 2348 raise JSONRPCError(
2349 2349 'Unable to strip commit %s from repo `%s`' % (
2350 2350 revision, repo.repo_name)
2351 2351 )
2352 2352
2353 2353
2354 2354 @jsonrpc_method()
2355 2355 def get_repo_settings(request, apiuser, repoid, key=Optional(None)):
2356 2356 """
2357 2357 Returns all settings for a repository. If a key is given, only the
2358 2358 setting identified by that key is returned (or null if not found).
2359 2359
2360 2360 :param apiuser: This is filled automatically from the |authtoken|.
2361 2361 :type apiuser: AuthUser
2362 2362 :param repoid: The repository name or repository id.
2363 2363 :type repoid: str or int
2364 2364 :param key: Key of the setting to return.
2365 2365 :type key: Optional(str)
2366 2366
2367 2367 Example output:
2368 2368
2369 2369 .. code-block:: bash
2370 2370
2371 2371 {
2372 2372 "error": null,
2373 2373 "id": 237,
2374 2374 "result": {
2375 2375 "extensions_largefiles": true,
2376 2376 "extensions_evolve": true,
2377 2377 "hooks_changegroup_push_logger": true,
2378 2378 "hooks_changegroup_repo_size": false,
2379 2379 "hooks_outgoing_pull_logger": true,
2380 2380 "phases_publish": "True",
2381 2381 "rhodecode_hg_use_rebase_for_merging": true,
2382 2382 "rhodecode_pr_merge_enabled": true,
2383 2383 "rhodecode_use_outdated_comments": true
2384 2384 }
2385 2385 }
2386 2386 """
2387 2387
2388 2388 # Restrict access to this api method to super-admins, and repo admins only.
2389 2389 repo = get_repo_or_error(repoid)
2390 2390 if not has_superadmin_permission(apiuser):
2391 2391 _perms = ('repository.admin',)
2392 2392 validate_repo_permissions(apiuser, repoid, repo, _perms)
2393 2393
2394 2394 try:
2395 2395 settings_model = VcsSettingsModel(repo=repo)
2396 2396 settings = settings_model.get_global_settings()
2397 2397 settings.update(settings_model.get_repo_settings())
2398 2398
2399 2399 # If only a single setting is requested fetch it from all settings.
2400 2400 key = Optional.extract(key)
2401 2401 if key is not None:
2402 2402 settings = settings.get(key, None)
2403 2403 except Exception:
2404 2404 msg = 'Failed to fetch settings for repository `{}`'.format(repoid)
2405 2405 log.exception(msg)
2406 2406 raise JSONRPCError(msg)
2407 2407
2408 2408 return settings
2409 2409
2410 2410
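The lookup in `get_repo_settings` merges the global settings with the per-repo overrides, then optionally narrows to a single key. With plain dicts the precedence looks like this (illustrative values, not real settings fetched from a server):

```python
# Global defaults, then per-repo overrides win — the same order as the
# settings.update(settings_model.get_repo_settings()) call above.
global_settings = {'phases_publish': 'True',
                   'rhodecode_pr_merge_enabled': True}
repo_settings = {'phases_publish': 'False'}  # per-repo override

settings = dict(global_settings)
settings.update(repo_settings)

key = 'phases_publish'
value = settings.get(key, None)  # narrow to one key, None if absent
print(value)  # → False
```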
2411 2411 @jsonrpc_method()
2412 2412 def set_repo_settings(request, apiuser, repoid, settings):
2413 2413 """
2414 2414 Update repository settings. Returns true on success.
2415 2415
2416 2416 :param apiuser: This is filled automatically from the |authtoken|.
2417 2417 :type apiuser: AuthUser
2418 2418 :param repoid: The repository name or repository id.
2419 2419 :type repoid: str or int
2420 2420 :param settings: The new settings for the repository.
2421 2421 :type settings: dict
2422 2422
2423 2423 Example output:
2424 2424
2425 2425 .. code-block:: bash
2426 2426
2427 2427 {
2428 2428 "error": null,
2429 2429 "id": 237,
2430 2430 "result": true
2431 2431 }
2432 2432 """
2433 2433 # Restrict access to this api method to super-admins, and repo admins only.
2434 2434 repo = get_repo_or_error(repoid)
2435 2435 if not has_superadmin_permission(apiuser):
2436 2436 _perms = ('repository.admin',)
2437 2437 validate_repo_permissions(apiuser, repoid, repo, _perms)
2438 2438
2439 2439 if type(settings) is not dict:
2440 2440 raise JSONRPCError('Settings have to be a JSON Object.')
2441 2441
2442 2442 try:
2443 2443 settings_model = VcsSettingsModel(repo=repoid)
2444 2444
2445 2445 # Merge global, repo and incoming settings.
2446 2446 new_settings = settings_model.get_global_settings()
2447 2447 new_settings.update(settings_model.get_repo_settings())
2448 2448 new_settings.update(settings)
2449 2449
2450 2450 # Update the settings.
2451 2451 inherit_global_settings = new_settings.get(
2452 2452 'inherit_global_settings', False)
2453 2453 settings_model.create_or_update_repo_settings(
2454 2454 new_settings, inherit_global_settings=inherit_global_settings)
2455 2455 Session().commit()
2456 2456 except Exception:
2457 2457 msg = 'Failed to update settings for repository `{}`'.format(repoid)
2458 2458 log.exception(msg)
2459 2459 raise JSONRPCError(msg)
2460 2460
2461 2461 # Indicate success.
2462 2462 return True
2463 2463
2464 2464
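`set_repo_settings` merges three layers before persisting: global defaults, existing repo overrides, and the incoming `settings` dict, with later layers winning. A dict-only sketch of that precedence chain (the keys and values are made up for illustration):

```python
def merge_settings(global_settings, repo_settings, incoming):
    """Later layers override earlier ones, mirroring the chain of
    update() calls in set_repo_settings above."""
    merged = dict(global_settings)
    merged.update(repo_settings)
    merged.update(incoming)
    return merged

merged = merge_settings({'a': 1, 'b': 2}, {'b': 3}, {'c': 4})
print(merged)  # → {'a': 1, 'b': 3, 'c': 4}
```

Because the incoming dict is applied last, a partial update never silently drops existing repo-level overrides — they are carried forward unless explicitly overwritten.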
2465 2465 @jsonrpc_method()
2466 2466 def maintenance(request, apiuser, repoid):
2467 2467 """
2468 2468 Triggers a maintenance on the given repository.
2469 2469
2470 2470 This command can only be run using an |authtoken| with admin
2471 2471 rights to the specified repository. For more information,
2472 2472 see :ref:`config-token-ref`.
2473 2473
2474 2474 This command takes the following options:
2475 2475
2476 2476 :param apiuser: This is filled automatically from the |authtoken|.
2477 2477 :type apiuser: AuthUser
2478 2478 :param repoid: The repository name or repository ID.
2479 2479 :type repoid: str or int
2480 2480
2481 2481 Example output:
2482 2482
2483 2483 .. code-block:: bash
2484 2484
2485 2485 id : <id_given_in_input>
2486 2486 result : {
2487 2487 "msg": "executed maintenance command",
2488 2488 "executed_actions": [
2489 2489 <action_message>, <action_message2>...
2490 2490 ],
2491 2491 "repository": "<repository name>"
2492 2492 }
2493 2493 error : null
2494 2494
2495 2495 Example error output:
2496 2496
2497 2497 .. code-block:: bash
2498 2498
2499 2499 id : <id_given_in_input>
2500 2500 result : null
2501 2501 error : {
2502 2502 "Unable to execute maintenance on `<reponame>`"
2503 2503 }
2504 2504
2505 2505 """
2506 2506
2507 2507 repo = get_repo_or_error(repoid)
2508 2508 if not has_superadmin_permission(apiuser):
2509 2509 _perms = ('repository.admin',)
2510 2510 validate_repo_permissions(apiuser, repoid, repo, _perms)
2511 2511
2512 2512 try:
2513 2513 maintenance = repo_maintenance.RepoMaintenance()
2514 2514 executed_actions = maintenance.execute(repo)
2515 2515
2516 2516 return {
2517 2517 'msg': 'executed maintenance command',
2518 2518 'executed_actions': executed_actions,
2519 2519 'repository': repo.repo_name
2520 2520 }
2521 2521 except Exception:
2522 2522 log.exception("Exception occurred while trying to run maintenance")
2523 2523 raise JSONRPCError(
2524 2524 'Unable to execute maintenance on `%s`' % repo.repo_name)
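The `maintenance` method above is reached through RhodeCode's JSON-RPC API endpoint. As a sketch, this is roughly the payload shape such a call would carry; the token value and repo id are placeholders, and posting it to `/_admin/api` with an HTTP client is left out:

```python
import json

def build_rpc_payload(method, auth_token, request_id=1, **args):
    # JSON-RPC style payload as accepted by the RhodeCode API endpoint
    return json.dumps({
        "id": request_id,
        "auth_token": auth_token,
        "method": method,
        "args": args,
    })

payload = build_rpc_payload("maintenance", "SECRET_TOKEN", repoid="my-repo")
# POST this payload to https://<server>/_admin/api with any HTTP client
```

The response then mirrors the documented example output: a `result` dict with `msg`, `executed_actions` and `repository`, or an `error` string on failure.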
@@ -1,269 +1,270 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 import time
23 23 import errno
24 24 import hashlib
25 25
26 26 from rhodecode.lib.ext_json import json
27 27 from rhodecode.apps.file_store import utils
28 28 from rhodecode.apps.file_store.extensions import resolve_extensions
29 29 from rhodecode.apps.file_store.exceptions import (
30 30 FileNotAllowedException, FileOverSizeException)
31 31
32 32 METADATA_VER = 'v1'
33 33
34 34
35 35 def safe_make_dirs(dir_path):
36 36 if not os.path.exists(dir_path):
37 37 try:
38 38 os.makedirs(dir_path)
39 39 except OSError as e:
40 40 if e.errno != errno.EEXIST:
41 41 raise
42 42 return
43 43
44 44
45 45 class LocalFileStorage(object):
46 46
47 47 @classmethod
48 48 def apply_counter(cls, counter, filename):
49 49 name_counted = '%d-%s' % (counter, filename)
50 50 return name_counted
51 51
52 52 @classmethod
53 53 def resolve_name(cls, name, directory):
54 54 """
55 55 Resolves a unique name and the correct path. New files initially get a
56 56 ``0-`` prefix; if a file with that name already exists, the numeric
57 57 prefix is incremented until a free name is found, e.g. test.jpg -> 1-test.jpg.
58 58
59 59 :param name: base name of file
60 60 :param directory: absolute directory path
61 61 """
62 62
63 63 counter = 0
64 64 while True:
65 65 name_counted = cls.apply_counter(counter, name)
66 66
67 67 # sub_store prefix to optimize disk usage, e.g. some_path/ab/final_file
68 68 sub_store = cls._sub_store_from_filename(name_counted)
69 69 sub_store_path = os.path.join(directory, sub_store)
70 70 safe_make_dirs(sub_store_path)
71 71
72 72 path = os.path.join(sub_store_path, name_counted)
73 73 if not os.path.exists(path):
74 74 return name_counted, path
75 75 counter += 1
76 76
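The counter loop in `resolve_name` can be illustrated without touching the filesystem. This sketch swaps `os.path.exists` for an in-memory set of taken names but mirrors the same `%d-%s` naming scheme:

```python
def resolve_name_in_memory(name, existing):
    # mimic LocalFileStorage.resolve_name: bump the numeric prefix until
    # a name is found that is not already taken
    counter = 0
    while True:
        name_counted = '%d-%s' % (counter, name)
        if name_counted not in existing:
            return name_counted
        counter += 1

print(resolve_name_in_memory('test.jpg', set()))            # 0-test.jpg
print(resolve_name_in_memory('test.jpg', {'0-test.jpg'}))   # 1-test.jpg
```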
77 77 @classmethod
78 78 def _sub_store_from_filename(cls, filename):
79 79 return filename[:2]
80 80
81 81 @classmethod
82 82 def calculate_path_hash(cls, file_path):
83 83 """
84 84 Efficient calculation of file_path sha256 sum
85 85
86 86 :param file_path:
87 87 :return: sha256sum
88 88 """
89 89 digest = hashlib.sha256()
90 90 with open(file_path, 'rb') as f:
91 91 for chunk in iter(lambda: f.read(1024 * 100), b""):
92 92 digest.update(chunk)
93 93
94 94 return digest.hexdigest()
95 95
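The chunked digest in `calculate_path_hash` streams the file in 100 KB blocks so large files never load fully into memory; the result is identical to a one-shot `hashlib.sha256` over the whole content. A self-contained check using a temporary file:

```python
import hashlib
import tempfile

def sha256_of_file(path, chunk_size=1024 * 100):
    # stream the file in fixed-size chunks, updating the digest incrementally
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

data = b'hello world' * 50000
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(data)
    tmp_path = tmp.name

# chunked and one-shot digests agree
assert sha256_of_file(tmp_path) == hashlib.sha256(data).hexdigest()
```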
96 96 def __init__(self, base_path, extension_groups=None):
97 97
98 98 """
99 99 Local file storage
100 100
101 101 :param base_path: the absolute base path where uploads are stored
102 102 :param extension_groups: list of extension group names used to resolve allowed extensions
103 103 """
104 104
105 105 extension_groups = extension_groups or ['any']
106 106 self.base_path = base_path
107 107 self.extensions = resolve_extensions([], groups=extension_groups)
108 108
109 109 def __repr__(self):
110 110 return '{}@{}'.format(self.__class__, self.base_path)
111 111
112 112 def store_path(self, filename):
113 113 """
114 114 Returns absolute file path of the filename, joined to the
115 115 base_path.
116 116
117 117 :param filename: base name of file
118 118 """
119 119 prefix_dir = ''
120 120 if '/' in filename:
121 121 prefix_dir, filename = filename.split('/')
122 122 sub_store = self._sub_store_from_filename(filename)
123 123 else:
124 124 sub_store = self._sub_store_from_filename(filename)
125 125 return os.path.join(self.base_path, prefix_dir, sub_store, filename)
126 126
127 127 def delete(self, filename):
128 128 """
129 129 Deletes the given file. The filename is resolved to an absolute
130 130 path based on base_path. Returns **False** if the file does not
131 131 exist, otherwise **True**.
132 132
133 133 :param filename: base name of file
134 134 """
135 135 if self.exists(filename):
136 136 os.remove(self.store_path(filename))
137 137 return True
138 138 return False
139 139
140 140 def exists(self, filename):
141 141 """
142 142 Checks if file exists. Resolves filename's absolute
143 143 path based on base_path.
144 144
145 145 :param filename: file_uid name of file, e.g. 0-f62b2b2d-9708-4079-a071-ec3f958448d4.svg
146 146 """
147 147 return os.path.exists(self.store_path(filename))
148 148
149 149 def filename_allowed(self, filename, extensions=None):
150 150 """Checks if a filename has an allowed extension
151 151
152 152 :param filename: base name of file
153 153 :param extensions: iterable of extensions (or self.extensions)
154 154 """
155 155 _, ext = os.path.splitext(filename)
156 156 return self.extension_allowed(ext, extensions)
157 157
158 158 def extension_allowed(self, ext, extensions=None):
159 159 """
160 160 Checks if an extension is permitted. Both e.g. ".jpg" and
161 161 "jpg" can be passed in. Extension lookup is case-insensitive.
162 162
163 163 :param ext: extension to check
164 164 :param extensions: iterable of extensions to validate against (or self.extensions)
165 165 """
166 166 def normalize_ext(_ext):
167 167 if _ext.startswith('.'):
168 168 _ext = _ext[1:]
169 169 return _ext.lower()
170 170
171 171 extensions = extensions or self.extensions
172 172 if not extensions:
173 173 return True
174 174
175 175 ext = normalize_ext(ext)
176 176
177 177 return ext in [normalize_ext(x) for x in extensions]
178 178
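The normalization inside `extension_allowed` is what makes both ".jpg" and "jpg" acceptable and the lookup case-insensitive; an empty extension list means everything passes. A standalone sketch of the same two functions:

```python
def normalize_ext(ext):
    # strip a leading dot and lower-case, as extension_allowed does internally
    if ext.startswith('.'):
        ext = ext[1:]
    return ext.lower()

def extension_allowed(ext, extensions):
    if not extensions:
        return True  # empty whitelist: every extension is permitted
    return normalize_ext(ext) in [normalize_ext(x) for x in extensions]

print(extension_allowed('.JPG', ['jpg', 'png']))  # True
print(extension_allowed('gif', ['jpg', 'png']))   # False
```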
179 179 def save_file(self, file_obj, filename, directory=None, extensions=None,
180 180 extra_metadata=None, max_filesize=None, randomized_name=True, **kwargs):
181 181 """
182 182 Saves a file object to the uploads location.
183 183 Returns the resolved filename, i.e. the directory +
184 184 the (randomized/incremented) base name.
185 185
186 186 :param file_obj: **cgi.FieldStorage** object (or similar)
187 187 :param filename: original filename
188 188 :param directory: relative path of sub-directory
189 189 :param extensions: iterable of allowed extensions, if not default
190 190 :param max_filesize: maximum size of file that should be allowed
191 191 :param randomized_name: if True, generate a randomized UID for the stored file; otherwise derive a fixed name from the filename
192 192 :param extra_metadata: extra JSON metadata to store next to the file with .meta suffix
193 193
194 194 """
195 195
196 196 extensions = extensions or self.extensions
197 197
198 198 if not self.filename_allowed(filename, extensions):
199 199 raise FileNotAllowedException()
200 200
201 201 if directory:
202 202 dest_directory = os.path.join(self.base_path, directory)
203 203 else:
204 204 dest_directory = self.base_path
205 205
206 206 safe_make_dirs(dest_directory)
207 207
208 208 uid_filename = utils.uid_filename(filename, randomized=randomized_name)
209 209
210 210 # resolve_name also produces the special sub-dir used for the optimized store
211 211 filename, path = self.resolve_name(uid_filename, dest_directory)
212 212 stored_file_dir = os.path.dirname(path)
213 213
214 214 no_body_seek = kwargs.pop('no_body_seek', False)
215 215 if not no_body_seek:
216 216 file_obj.seek(0)
219 219
220 220 with open(path, "wb") as dest:
221 221 length = 256 * 1024
222 222 while True:
223 223 buf = file_obj.read(length)
224 224 if not buf:
225 225 break
226 226 dest.write(buf)
227 227
228 228 metadata = {}
229 229 if extra_metadata:
230 230 metadata = extra_metadata
231 231
232 232 size = os.stat(path).st_size
233 233
234 234 if max_filesize and size > max_filesize:
235 235 # free up the copied file, and raise exc
236 236 os.remove(path)
237 237 raise FileOverSizeException()
238 238
239 239 file_hash = self.calculate_path_hash(path)
240 240
241 241 metadata.update({
242 242 "filename": filename,
243 243 "size": size,
244 244 "time": time.time(),
245 245 "sha256": file_hash,
246 246 "meta_ver": METADATA_VER
247 247 })
248 248
249 249 filename_meta = filename + '.meta'
250 250 with open(os.path.join(stored_file_dir, filename_meta), "wb") as dest_meta:
251 251 dest_meta.write(json.dumps(metadata))
252 252
253 253 if directory:
254 254 filename = os.path.join(directory, filename)
255 255
256 256 return filename, metadata
257 257
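The copy loop in `save_file` streams the upload in 256 KB chunks; the original code checks `max_filesize` only after the full copy (via `os.stat`, then removes the oversize file). This sketch uses in-memory streams and, as a simplification, aborts during the copy instead of after it; the `ValueError` is a stand-in for `FileOverSizeException`:

```python
import io

CHUNK = 256 * 1024

def copy_stream(src, dest, max_size=None):
    # copy in fixed-size chunks, tracking total bytes written
    total = 0
    while True:
        buf = src.read(CHUNK)
        if not buf:
            break
        dest.write(buf)
        total += len(buf)
        if max_size and total > max_size:
            raise ValueError('file over size limit')  # stand-in exception
    return total

size = copy_stream(io.BytesIO(b'x' * 1000), io.BytesIO(), max_size=2000)
print(size)  # 1000
```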
258 def get_metadata(self, filename):
258 def get_metadata(self, filename, ignore_missing=False):
259 259 """
260 260 Reads JSON stored metadata for a file
261 261
262 262 :param filename:
263 263 :return:
264 264 """
265 265 filename = self.store_path(filename)
266 266 filename_meta = filename + '.meta'
267
267 if ignore_missing and not os.path.isfile(filename_meta):
268 return {}
268 269 with open(filename_meta, "rb") as source_meta:
269 270 return json.loads(source_meta.read())
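The new `ignore_missing` flag added to `get_metadata` above turns a missing `.meta` file into an empty dict instead of an `IOError`. A minimal standalone sketch of the same logic against a temporary directory:

```python
import json
import os
import tempfile

def read_metadata(meta_path, ignore_missing=False):
    # mirror get_metadata: optionally return {} when the .meta file is absent
    if ignore_missing and not os.path.isfile(meta_path):
        return {}
    with open(meta_path, 'rb') as source_meta:
        return json.loads(source_meta.read())

tmp_dir = tempfile.mkdtemp()
meta_path = os.path.join(tmp_dir, 'file.meta')

assert read_metadata(meta_path, ignore_missing=True) == {}
with open(meta_path, 'w') as f:
    json.dump({'size': 42, 'meta_ver': 'v1'}, f)
assert read_metadata(meta_path)['size'] == 42
```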
@@ -1,1227 +1,1227 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20 from rhodecode.apps._base import add_route_with_slash
21 21
22 22
23 23 def includeme(config):
24 24 from rhodecode.apps.repository.views.repo_artifacts import RepoArtifactsView
25 25 from rhodecode.apps.repository.views.repo_audit_logs import AuditLogsView
26 26 from rhodecode.apps.repository.views.repo_automation import RepoAutomationView
27 27 from rhodecode.apps.repository.views.repo_bookmarks import RepoBookmarksView
28 28 from rhodecode.apps.repository.views.repo_branch_permissions import RepoSettingsBranchPermissionsView
29 29 from rhodecode.apps.repository.views.repo_branches import RepoBranchesView
30 30 from rhodecode.apps.repository.views.repo_caches import RepoCachesView
31 31 from rhodecode.apps.repository.views.repo_changelog import RepoChangelogView
32 32 from rhodecode.apps.repository.views.repo_checks import RepoChecksView
33 33 from rhodecode.apps.repository.views.repo_commits import RepoCommitsView
34 34 from rhodecode.apps.repository.views.repo_compare import RepoCompareView
35 35 from rhodecode.apps.repository.views.repo_feed import RepoFeedView
36 36 from rhodecode.apps.repository.views.repo_files import RepoFilesView
37 37 from rhodecode.apps.repository.views.repo_forks import RepoForksView
38 38 from rhodecode.apps.repository.views.repo_maintainance import RepoMaintenanceView
39 39 from rhodecode.apps.repository.views.repo_permissions import RepoSettingsPermissionsView
40 40 from rhodecode.apps.repository.views.repo_pull_requests import RepoPullRequestsView
41 41 from rhodecode.apps.repository.views.repo_review_rules import RepoReviewRulesView
42 42 from rhodecode.apps.repository.views.repo_settings import RepoSettingsView
43 43 from rhodecode.apps.repository.views.repo_settings_advanced import RepoSettingsAdvancedView
44 44 from rhodecode.apps.repository.views.repo_settings_fields import RepoSettingsFieldsView
45 45 from rhodecode.apps.repository.views.repo_settings_issue_trackers import RepoSettingsIssueTrackersView
46 46 from rhodecode.apps.repository.views.repo_settings_remote import RepoSettingsRemoteView
47 47 from rhodecode.apps.repository.views.repo_settings_vcs import RepoSettingsVcsView
48 48 from rhodecode.apps.repository.views.repo_strip import RepoStripView
49 49 from rhodecode.apps.repository.views.repo_summary import RepoSummaryView
50 50 from rhodecode.apps.repository.views.repo_tags import RepoTagsView
51 51
52 52 # repo creating checks, special cases that aren't repo routes
53 53 config.add_route(
54 54 name='repo_creating',
55 55 pattern='/{repo_name:.*?[^/]}/repo_creating')
56 56 config.add_view(
57 57 RepoChecksView,
58 58 attr='repo_creating',
59 59 route_name='repo_creating', request_method='GET',
60 60 renderer='rhodecode:templates/admin/repos/repo_creating.mako')
61 61
62 62 config.add_route(
63 63 name='repo_creating_check',
64 64 pattern='/{repo_name:.*?[^/]}/repo_creating_check')
65 65 config.add_view(
66 66 RepoChecksView,
67 67 attr='repo_creating_check',
68 68 route_name='repo_creating_check', request_method='GET',
69 69 renderer='json_ext')
70 70
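The `{repo_name:.*?[^/]}` placeholder used throughout these routes is a Pyramid custom-regex marker: non-greedy, but required to end with a non-slash character, so nested repository names like `group/sub/repo` match while trailing slashes do not. An equivalent check with plain `re` (the translated pattern is an assumption based on how Pyramid expands placeholder regexes):

```python
import re

# approximate regex Pyramid generates for '/{repo_name:.*?[^/]}/repo_creating'
pattern = re.compile(r'^/(?P<repo_name>.*?[^/])/repo_creating$')

m = pattern.match('/group/sub/my-repo/repo_creating')
print(m.group('repo_name'))  # group/sub/my-repo

# a repo_name ending in '/' cannot match, because of the [^/] suffix
print(pattern.match('/repo//repo_creating'))  # None
```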
71 71 # Summary
72 72 # NOTE(marcink): one additional route is defined in very bottom, catch
73 73 # all pattern
74 74 config.add_route(
75 75 name='repo_summary_explicit',
76 76 pattern='/{repo_name:.*?[^/]}/summary', repo_route=True)
77 77 config.add_view(
78 78 RepoSummaryView,
79 79 attr='summary',
80 80 route_name='repo_summary_explicit', request_method='GET',
81 81 renderer='rhodecode:templates/summary/summary.mako')
82 82
83 83 config.add_route(
84 84 name='repo_summary_commits',
85 85 pattern='/{repo_name:.*?[^/]}/summary-commits', repo_route=True)
86 86 config.add_view(
87 87 RepoSummaryView,
88 88 attr='summary_commits',
89 89 route_name='repo_summary_commits', request_method='GET',
90 90 renderer='rhodecode:templates/summary/summary_commits.mako')
91 91
92 92 # Commits
93 93 config.add_route(
94 94 name='repo_commit',
95 95 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}', repo_route=True)
96 96 config.add_view(
97 97 RepoCommitsView,
98 98 attr='repo_commit_show',
99 99 route_name='repo_commit', request_method='GET',
100 100 renderer=None)
101 101
102 102 config.add_route(
103 103 name='repo_commit_children',
104 104 pattern='/{repo_name:.*?[^/]}/changeset_children/{commit_id}', repo_route=True)
105 105 config.add_view(
106 106 RepoCommitsView,
107 107 attr='repo_commit_children',
108 108 route_name='repo_commit_children', request_method='GET',
109 109 renderer='json_ext', xhr=True)
110 110
111 111 config.add_route(
112 112 name='repo_commit_parents',
113 113 pattern='/{repo_name:.*?[^/]}/changeset_parents/{commit_id}', repo_route=True)
114 114 config.add_view(
115 115 RepoCommitsView,
116 116 attr='repo_commit_parents',
117 117 route_name='repo_commit_parents', request_method='GET',
118 118 renderer='json_ext')
119 119
120 120 config.add_route(
121 121 name='repo_commit_raw',
122 122 pattern='/{repo_name:.*?[^/]}/changeset-diff/{commit_id}', repo_route=True)
123 123 config.add_view(
124 124 RepoCommitsView,
125 125 attr='repo_commit_raw',
126 126 route_name='repo_commit_raw', request_method='GET',
127 127 renderer=None)
128 128
129 129 config.add_route(
130 130 name='repo_commit_patch',
131 131 pattern='/{repo_name:.*?[^/]}/changeset-patch/{commit_id}', repo_route=True)
132 132 config.add_view(
133 133 RepoCommitsView,
134 134 attr='repo_commit_patch',
135 135 route_name='repo_commit_patch', request_method='GET',
136 136 renderer=None)
137 137
138 138 config.add_route(
139 139 name='repo_commit_download',
140 140 pattern='/{repo_name:.*?[^/]}/changeset-download/{commit_id}', repo_route=True)
141 141 config.add_view(
142 142 RepoCommitsView,
143 143 attr='repo_commit_download',
144 144 route_name='repo_commit_download', request_method='GET',
145 145 renderer=None)
146 146
147 147 config.add_route(
148 148 name='repo_commit_data',
149 149 pattern='/{repo_name:.*?[^/]}/changeset-data/{commit_id}', repo_route=True)
150 150 config.add_view(
151 151 RepoCommitsView,
152 152 attr='repo_commit_data',
153 153 route_name='repo_commit_data', request_method='GET',
154 154 renderer='json_ext', xhr=True)
155 155
156 156 config.add_route(
157 157 name='repo_commit_comment_create',
158 158 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/create', repo_route=True)
159 159 config.add_view(
160 160 RepoCommitsView,
161 161 attr='repo_commit_comment_create',
162 162 route_name='repo_commit_comment_create', request_method='POST',
163 163 renderer='json_ext')
164 164
165 165 config.add_route(
166 166 name='repo_commit_comment_preview',
167 167 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True)
168 168 config.add_view(
169 169 RepoCommitsView,
170 170 attr='repo_commit_comment_preview',
171 171 route_name='repo_commit_comment_preview', request_method='POST',
172 172 renderer='string', xhr=True)
173 173
174 174 config.add_route(
175 175 name='repo_commit_comment_history_view',
176 176 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_history_id}/history_view', repo_route=True)
177 177 config.add_view(
178 178 RepoCommitsView,
179 179 attr='repo_commit_comment_history_view',
180 180 route_name='repo_commit_comment_history_view', request_method='POST',
181 181 renderer='string', xhr=True)
182 182
183 183 config.add_route(
184 184 name='repo_commit_comment_attachment_upload',
185 185 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True)
186 186 config.add_view(
187 187 RepoCommitsView,
188 188 attr='repo_commit_comment_attachment_upload',
189 189 route_name='repo_commit_comment_attachment_upload', request_method='POST',
190 190 renderer='json_ext', xhr=True)
191 191
192 192 config.add_route(
193 193 name='repo_commit_comment_delete',
194 194 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True)
195 195 config.add_view(
196 196 RepoCommitsView,
197 197 attr='repo_commit_comment_delete',
198 198 route_name='repo_commit_comment_delete', request_method='POST',
199 199 renderer='json_ext')
200 200
201 201 config.add_route(
202 202 name='repo_commit_comment_edit',
203 203 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/edit', repo_route=True)
204 204 config.add_view(
205 205 RepoCommitsView,
206 206 attr='repo_commit_comment_edit',
207 207 route_name='repo_commit_comment_edit', request_method='POST',
208 208 renderer='json_ext')
209 209
210 210 # still-working URL kept for backward compatibility
211 211 config.add_route(
212 212 name='repo_commit_raw_deprecated',
213 213 pattern='/{repo_name:.*?[^/]}/raw-changeset/{commit_id}', repo_route=True)
214 214 config.add_view(
215 215 RepoCommitsView,
216 216 attr='repo_commit_raw',
217 217 route_name='repo_commit_raw_deprecated', request_method='GET',
218 218 renderer=None)
219 219
220 220 # Files
221 221 config.add_route(
222 222 name='repo_archivefile',
223 223 pattern='/{repo_name:.*?[^/]}/archive/{fname:.*}', repo_route=True)
224 224 config.add_view(
225 225 RepoFilesView,
226 226 attr='repo_archivefile',
227 227 route_name='repo_archivefile', request_method='GET',
228 228 renderer=None)
229 229
230 230 config.add_route(
231 231 name='repo_files_diff',
232 232 pattern='/{repo_name:.*?[^/]}/diff/{f_path:.*}', repo_route=True)
233 233 config.add_view(
234 234 RepoFilesView,
235 235 attr='repo_files_diff',
236 236 route_name='repo_files_diff', request_method='GET',
237 237 renderer=None)
238 238
239 239 config.add_route( # legacy route to make old links work
240 240 name='repo_files_diff_2way_redirect',
241 241 pattern='/{repo_name:.*?[^/]}/diff-2way/{f_path:.*}', repo_route=True)
242 242 config.add_view(
243 243 RepoFilesView,
244 244 attr='repo_files_diff_2way_redirect',
245 245 route_name='repo_files_diff_2way_redirect', request_method='GET',
246 246 renderer=None)
247 247
248 248 config.add_route(
249 249 name='repo_files',
250 250 pattern='/{repo_name:.*?[^/]}/files/{commit_id}/{f_path:.*}', repo_route=True)
251 251 config.add_view(
252 252 RepoFilesView,
253 253 attr='repo_files',
254 254 route_name='repo_files', request_method='GET',
255 255 renderer=None)
256 256
257 257 config.add_route(
258 258 name='repo_files:default_path',
259 259 pattern='/{repo_name:.*?[^/]}/files/{commit_id}/', repo_route=True)
260 260 config.add_view(
261 261 RepoFilesView,
262 262 attr='repo_files',
263 263 route_name='repo_files:default_path', request_method='GET',
264 264 renderer=None)
265 265
266 266 config.add_route(
267 267 name='repo_files:default_commit',
268 268 pattern='/{repo_name:.*?[^/]}/files', repo_route=True)
269 269 config.add_view(
270 270 RepoFilesView,
271 271 attr='repo_files',
272 272 route_name='repo_files:default_commit', request_method='GET',
273 273 renderer=None)
274 274
275 275 config.add_route(
276 276 name='repo_files:rendered',
277 277 pattern='/{repo_name:.*?[^/]}/render/{commit_id}/{f_path:.*}', repo_route=True)
278 278 config.add_view(
279 279 RepoFilesView,
280 280 attr='repo_files',
281 281 route_name='repo_files:rendered', request_method='GET',
282 282 renderer=None)
283 283
284 284 config.add_route(
285 285 name='repo_files:annotated',
286 286 pattern='/{repo_name:.*?[^/]}/annotate/{commit_id}/{f_path:.*}', repo_route=True)
287 287 config.add_view(
288 288 RepoFilesView,
289 289 attr='repo_files',
290 290 route_name='repo_files:annotated', request_method='GET',
291 291 renderer=None)
292 292
293 293 config.add_route(
294 294 name='repo_files:annotated_previous',
295 295 pattern='/{repo_name:.*?[^/]}/annotate-previous/{commit_id}/{f_path:.*}', repo_route=True)
296 296 config.add_view(
297 297 RepoFilesView,
298 298 attr='repo_files_annotated_previous',
299 299 route_name='repo_files:annotated_previous', request_method='GET',
300 300 renderer=None)
301 301
302 302 config.add_route(
303 303 name='repo_nodetree_full',
304 304 pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/{f_path:.*}', repo_route=True)
305 305 config.add_view(
306 306 RepoFilesView,
307 307 attr='repo_nodetree_full',
308 308 route_name='repo_nodetree_full', request_method='GET',
309 309 renderer=None, xhr=True)
310 310
311 311 config.add_route(
312 312 name='repo_nodetree_full:default_path',
313 313 pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/', repo_route=True)
314 314 config.add_view(
315 315 RepoFilesView,
316 316 attr='repo_nodetree_full',
317 317 route_name='repo_nodetree_full:default_path', request_method='GET',
318 318 renderer=None, xhr=True)
319 319
320 320 config.add_route(
321 321 name='repo_files_nodelist',
322 322 pattern='/{repo_name:.*?[^/]}/nodelist/{commit_id}/{f_path:.*}', repo_route=True)
323 323 config.add_view(
324 324 RepoFilesView,
325 325 attr='repo_nodelist',
326 326 route_name='repo_files_nodelist', request_method='GET',
327 327 renderer='json_ext', xhr=True)
328 328
329 329 config.add_route(
330 330 name='repo_file_raw',
331 331 pattern='/{repo_name:.*?[^/]}/raw/{commit_id}/{f_path:.*}', repo_route=True)
332 332 config.add_view(
333 333 RepoFilesView,
334 334 attr='repo_file_raw',
335 335 route_name='repo_file_raw', request_method='GET',
336 336 renderer=None)
337 337
338 338 config.add_route(
339 339 name='repo_file_download',
340 340 pattern='/{repo_name:.*?[^/]}/download/{commit_id}/{f_path:.*}', repo_route=True)
341 341 config.add_view(
342 342 RepoFilesView,
343 343 attr='repo_file_download',
344 344 route_name='repo_file_download', request_method='GET',
345 345 renderer=None)
346 346
347 347 config.add_route( # backward compat to keep old links working
348 348 name='repo_file_download:legacy',
349 349 pattern='/{repo_name:.*?[^/]}/rawfile/{commit_id}/{f_path:.*}',
350 350 repo_route=True)
351 351 config.add_view(
352 352 RepoFilesView,
353 353 attr='repo_file_download',
354 354 route_name='repo_file_download:legacy', request_method='GET',
355 355 renderer=None)
356 356
357 357 config.add_route(
358 358 name='repo_file_history',
359 359 pattern='/{repo_name:.*?[^/]}/history/{commit_id}/{f_path:.*}', repo_route=True)
360 360 config.add_view(
361 361 RepoFilesView,
362 362 attr='repo_file_history',
363 363 route_name='repo_file_history', request_method='GET',
364 364 renderer='json_ext')
365 365
366 366 config.add_route(
367 367 name='repo_file_authors',
368 368 pattern='/{repo_name:.*?[^/]}/authors/{commit_id}/{f_path:.*}', repo_route=True)
369 369 config.add_view(
370 370 RepoFilesView,
371 371 attr='repo_file_authors',
372 372 route_name='repo_file_authors', request_method='GET',
373 373 renderer='rhodecode:templates/files/file_authors_box.mako')
374 374
375 375 config.add_route(
376 376 name='repo_files_check_head',
377 377 pattern='/{repo_name:.*?[^/]}/check_head/{commit_id}/{f_path:.*}',
378 378 repo_route=True)
379 379 config.add_view(
380 380 RepoFilesView,
381 381 attr='repo_files_check_head',
382 382 route_name='repo_files_check_head', request_method='POST',
383 383 renderer='json_ext', xhr=True)
384 384
385 385 config.add_route(
386 386 name='repo_files_remove_file',
387 387 pattern='/{repo_name:.*?[^/]}/remove_file/{commit_id}/{f_path:.*}',
388 388 repo_route=True)
389 389 config.add_view(
390 390 RepoFilesView,
391 391 attr='repo_files_remove_file',
392 392 route_name='repo_files_remove_file', request_method='GET',
393 393 renderer='rhodecode:templates/files/files_delete.mako')
394 394
395 395 config.add_route(
396 396 name='repo_files_delete_file',
397 397 pattern='/{repo_name:.*?[^/]}/delete_file/{commit_id}/{f_path:.*}',
398 398 repo_route=True)
399 399 config.add_view(
400 400 RepoFilesView,
401 401 attr='repo_files_delete_file',
402 402 route_name='repo_files_delete_file', request_method='POST',
403 403 renderer=None)
404 404
405 405 config.add_route(
406 406 name='repo_files_edit_file',
407 407 pattern='/{repo_name:.*?[^/]}/edit_file/{commit_id}/{f_path:.*}',
408 408 repo_route=True)
409 409 config.add_view(
410 410 RepoFilesView,
411 411 attr='repo_files_edit_file',
412 412 route_name='repo_files_edit_file', request_method='GET',
413 413 renderer='rhodecode:templates/files/files_edit.mako')
414 414
415 415 config.add_route(
416 416 name='repo_files_update_file',
417 417 pattern='/{repo_name:.*?[^/]}/update_file/{commit_id}/{f_path:.*}',
418 418 repo_route=True)
419 419 config.add_view(
420 420 RepoFilesView,
421 421 attr='repo_files_update_file',
422 422 route_name='repo_files_update_file', request_method='POST',
423 423 renderer=None)
424 424
425 425 config.add_route(
426 426 name='repo_files_add_file',
427 427 pattern='/{repo_name:.*?[^/]}/add_file/{commit_id}/{f_path:.*}',
428 428 repo_route=True)
429 429 config.add_view(
430 430 RepoFilesView,
431 431 attr='repo_files_add_file',
432 432 route_name='repo_files_add_file', request_method='GET',
433 433 renderer='rhodecode:templates/files/files_add.mako')
434 434
435 435 config.add_route(
436 436 name='repo_files_upload_file',
437 437 pattern='/{repo_name:.*?[^/]}/upload_file/{commit_id}/{f_path:.*}',
438 438 repo_route=True)
439 439 config.add_view(
440 440 RepoFilesView,
441 441 attr='repo_files_add_file',
442 442 route_name='repo_files_upload_file', request_method='GET',
443 443 renderer='rhodecode:templates/files/files_upload.mako')
444 444 config.add_view( # POST creates
445 445 RepoFilesView,
446 446 attr='repo_files_upload_file',
447 447 route_name='repo_files_upload_file', request_method='POST',
448 448 renderer='json_ext')
449 449
450 450 config.add_route(
451 451 name='repo_files_create_file',
452 452 pattern='/{repo_name:.*?[^/]}/create_file/{commit_id}/{f_path:.*}',
453 453 repo_route=True)
454 454 config.add_view( # POST creates
455 455 RepoFilesView,
456 456 attr='repo_files_create_file',
457 457 route_name='repo_files_create_file', request_method='POST',
458 458 renderer=None)
459 459
460 460 # Refs data
461 461 config.add_route(
462 462 name='repo_refs_data',
463 463 pattern='/{repo_name:.*?[^/]}/refs-data', repo_route=True)
464 464 config.add_view(
465 465 RepoSummaryView,
466 466 attr='repo_refs_data',
467 467 route_name='repo_refs_data', request_method='GET',
468 468 renderer='json_ext')
469 469
470 470 config.add_route(
471 471 name='repo_refs_changelog_data',
472 472 pattern='/{repo_name:.*?[^/]}/refs-data-changelog', repo_route=True)
473 473 config.add_view(
474 474 RepoSummaryView,
475 475 attr='repo_refs_changelog_data',
476 476 route_name='repo_refs_changelog_data', request_method='GET',
477 477 renderer='json_ext')
478 478
479 479 config.add_route(
480 480 name='repo_stats',
481 481 pattern='/{repo_name:.*?[^/]}/repo_stats/{commit_id}', repo_route=True)
482 482 config.add_view(
483 483 RepoSummaryView,
484 484 attr='repo_stats',
485 485 route_name='repo_stats', request_method='GET',
486 486 renderer='json_ext')
487 487
488 488 # Commits
489 489 config.add_route(
490 490 name='repo_commits',
491 491 pattern='/{repo_name:.*?[^/]}/commits', repo_route=True)
492 492 config.add_view(
493 493 RepoChangelogView,
494 494 attr='repo_changelog',
495 495 route_name='repo_commits', request_method='GET',
496 496 renderer='rhodecode:templates/commits/changelog.mako')
497 497 # old routes for backward compat
498 498 config.add_view(
499 499 RepoChangelogView,
500 500 attr='repo_changelog',
501 501 route_name='repo_changelog', request_method='GET',
502 502 renderer='rhodecode:templates/commits/changelog.mako')
503 503
504 504 config.add_route(
505 505 name='repo_commits_elements',
506 506 pattern='/{repo_name:.*?[^/]}/commits_elements', repo_route=True)
507 507 config.add_view(
508 508 RepoChangelogView,
509 509 attr='repo_commits_elements',
510 510 route_name='repo_commits_elements', request_method=('GET', 'POST'),
511 511 renderer='rhodecode:templates/commits/changelog_elements.mako',
512 512 xhr=True)
513 513
514 514 config.add_route(
515 515 name='repo_commits_elements_file',
516 516 pattern='/{repo_name:.*?[^/]}/commits_elements/{commit_id}/{f_path:.*}', repo_route=True)
517 517 config.add_view(
518 518 RepoChangelogView,
519 519 attr='repo_commits_elements',
520 520 route_name='repo_commits_elements_file', request_method=('GET', 'POST'),
521 521 renderer='rhodecode:templates/commits/changelog_elements.mako',
522 522 xhr=True)
523 523
524 524 config.add_route(
525 525 name='repo_commits_file',
526 526 pattern='/{repo_name:.*?[^/]}/commits/{commit_id}/{f_path:.*}', repo_route=True)
527 527 config.add_view(
528 528 RepoChangelogView,
529 529 attr='repo_changelog',
530 530 route_name='repo_commits_file', request_method='GET',
531 531 renderer='rhodecode:templates/commits/changelog.mako')
532 532 # old routes for backward compat
533 533 config.add_view(
534 534 RepoChangelogView,
535 535 attr='repo_changelog',
536 536 route_name='repo_changelog_file', request_method='GET',
537 537 renderer='rhodecode:templates/commits/changelog.mako')
538 538
539 539 # Changelog (old deprecated name for commits page)
540 540 config.add_route(
541 541 name='repo_changelog',
542 542 pattern='/{repo_name:.*?[^/]}/changelog', repo_route=True)
543 543 config.add_route(
544 544 name='repo_changelog_file',
545 545 pattern='/{repo_name:.*?[^/]}/changelog/{commit_id}/{f_path:.*}', repo_route=True)
546 546
547 547 # Compare
548 548 config.add_route(
549 549 name='repo_compare_select',
550 550 pattern='/{repo_name:.*?[^/]}/compare', repo_route=True)
551 551 config.add_view(
552 552 RepoCompareView,
553 553 attr='compare_select',
554 554 route_name='repo_compare_select', request_method='GET',
555 555 renderer='rhodecode:templates/compare/compare_diff.mako')
556 556
557 557 config.add_route(
558 558 name='repo_compare',
559 559 pattern='/{repo_name:.*?[^/]}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', repo_route=True)
560 560 config.add_view(
561 561 RepoCompareView,
562 562 attr='compare',
563 563 route_name='repo_compare', request_method='GET',
564 564 renderer=None)
565 565
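The compare pattern above packs five placeholders into one path, with literal `@` and `...` separators between ref segments. As a rough sketch of how Pyramid resolves such a pattern (each `{name:regex}` placeholder becomes a named group; the regex below is a hand-written standalone approximation, not Pyramid's actual compiler output):

```python
import re

# Standalone approximation of the 'repo_compare' pattern:
# '/{repo_name:.*?[^/]}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}'
COMPARE_URL = re.compile(
    r'^/(?P<repo_name>.*?[^/])/compare/'
    r'(?P<source_ref_type>[^@/]+)@(?P<source_ref>.*?)'
    r'\.\.\.'
    r'(?P<target_ref_type>[^@/]+)@(?P<target_ref>.*?)$'
)

# The non-greedy repo_name group still matches nested repo paths,
# because the required '/compare/' segment forces it to expand.
m = COMPARE_URL.match('/my/repo/compare/branch@default...tag@v1.0.0')
print(m.group('repo_name'))         # my/repo
print(m.group('source_ref_type'), m.group('source_ref'))   # branch default
print(m.group('target_ref_type'), m.group('target_ref'))   # tag v1.0.0
```

The `[^/]` suffix on `repo_name` guarantees the captured name never ends with a slash, which is what keeps the separator segments unambiguous.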
566 566 # Tags
567 567 config.add_route(
568 568 name='tags_home',
569 569 pattern='/{repo_name:.*?[^/]}/tags', repo_route=True)
570 570 config.add_view(
571 571 RepoTagsView,
572 572 attr='tags',
573 573 route_name='tags_home', request_method='GET',
574 574 renderer='rhodecode:templates/tags/tags.mako')
575 575
576 576 # Branches
577 577 config.add_route(
578 578 name='branches_home',
579 579 pattern='/{repo_name:.*?[^/]}/branches', repo_route=True)
580 580 config.add_view(
581 581 RepoBranchesView,
582 582 attr='branches',
583 583 route_name='branches_home', request_method='GET',
584 584 renderer='rhodecode:templates/branches/branches.mako')
585 585
586 586 # Bookmarks
587 587 config.add_route(
588 588 name='bookmarks_home',
589 589 pattern='/{repo_name:.*?[^/]}/bookmarks', repo_route=True)
590 590 config.add_view(
591 591 RepoBookmarksView,
592 592 attr='bookmarks',
593 593 route_name='bookmarks_home', request_method='GET',
594 594 renderer='rhodecode:templates/bookmarks/bookmarks.mako')
595 595
596 596 # Forks
597 597 config.add_route(
598 598 name='repo_fork_new',
599 599 pattern='/{repo_name:.*?[^/]}/fork', repo_route=True,
600 600 repo_forbid_when_archived=True,
601 601 repo_accepted_types=['hg', 'git'])
602 602 config.add_view(
603 603 RepoForksView,
604 604 attr='repo_fork_new',
605 605 route_name='repo_fork_new', request_method='GET',
606 606 renderer='rhodecode:templates/forks/forks.mako')
607 607
608 608 config.add_route(
609 609 name='repo_fork_create',
610 610 pattern='/{repo_name:.*?[^/]}/fork/create', repo_route=True,
611 611 repo_forbid_when_archived=True,
612 612 repo_accepted_types=['hg', 'git'])
613 613 config.add_view(
614 614 RepoForksView,
615 615 attr='repo_fork_create',
616 616 route_name='repo_fork_create', request_method='POST',
617 617 renderer='rhodecode:templates/forks/fork.mako')
618 618
619 619 config.add_route(
620 620 name='repo_forks_show_all',
621 621 pattern='/{repo_name:.*?[^/]}/forks', repo_route=True,
622 622 repo_accepted_types=['hg', 'git'])
623 623 config.add_view(
624 624 RepoForksView,
625 625 attr='repo_forks_show_all',
626 626 route_name='repo_forks_show_all', request_method='GET',
627 627 renderer='rhodecode:templates/forks/forks.mako')
628 628
629 629 config.add_route(
630 630 name='repo_forks_data',
631 631 pattern='/{repo_name:.*?[^/]}/forks/data', repo_route=True,
632 632 repo_accepted_types=['hg', 'git'])
633 633 config.add_view(
634 634 RepoForksView,
635 635 attr='repo_forks_data',
636 636 route_name='repo_forks_data', request_method='GET',
637 637 renderer='json_ext', xhr=True)
638 638
639 639 # Pull Requests
640 640 config.add_route(
641 641 name='pullrequest_show',
642 642 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}',
643 643 repo_route=True)
644 644 config.add_view(
645 645 RepoPullRequestsView,
646 646 attr='pull_request_show',
647 647 route_name='pullrequest_show', request_method='GET',
648 648 renderer='rhodecode:templates/pullrequests/pullrequest_show.mako')
649 649
650 650 config.add_route(
651 651 name='pullrequest_show_all',
652 652 pattern='/{repo_name:.*?[^/]}/pull-request',
653 653 repo_route=True, repo_accepted_types=['hg', 'git'])
654 654 config.add_view(
655 655 RepoPullRequestsView,
656 656 attr='pull_request_list',
657 657 route_name='pullrequest_show_all', request_method='GET',
658 658 renderer='rhodecode:templates/pullrequests/pullrequests.mako')
659 659
660 660 config.add_route(
661 661 name='pullrequest_show_all_data',
662 662 pattern='/{repo_name:.*?[^/]}/pull-request-data',
663 663 repo_route=True, repo_accepted_types=['hg', 'git'])
664 664 config.add_view(
665 665 RepoPullRequestsView,
666 666 attr='pull_request_list_data',
667 667 route_name='pullrequest_show_all_data', request_method='GET',
668 668 renderer='json_ext', xhr=True)
669 669
670 670 config.add_route(
671 671 name='pullrequest_repo_refs',
672 672 pattern='/{repo_name:.*?[^/]}/pull-request/refs/{target_repo_name:.*?[^/]}',
673 673 repo_route=True)
674 674 config.add_view(
675 675 RepoPullRequestsView,
676 676 attr='pull_request_repo_refs',
677 677 route_name='pullrequest_repo_refs', request_method='GET',
678 678 renderer='json_ext', xhr=True)
679 679
680 680 config.add_route(
681 681 name='pullrequest_repo_targets',
682 682 pattern='/{repo_name:.*?[^/]}/pull-request/repo-targets',
683 683 repo_route=True)
684 684 config.add_view(
685 685 RepoPullRequestsView,
686 686 attr='pullrequest_repo_targets',
687 687 route_name='pullrequest_repo_targets', request_method='GET',
688 688 renderer='json_ext', xhr=True)
689 689
690 690 config.add_route(
691 691 name='pullrequest_new',
692 692 pattern='/{repo_name:.*?[^/]}/pull-request/new',
693 693 repo_route=True, repo_accepted_types=['hg', 'git'],
694 694 repo_forbid_when_archived=True)
695 695 config.add_view(
696 696 RepoPullRequestsView,
697 697 attr='pull_request_new',
698 698 route_name='pullrequest_new', request_method='GET',
699 699 renderer='rhodecode:templates/pullrequests/pullrequest.mako')
700 700
701 701 config.add_route(
702 702 name='pullrequest_create',
703 703 pattern='/{repo_name:.*?[^/]}/pull-request/create',
704 704 repo_route=True, repo_accepted_types=['hg', 'git'],
705 705 repo_forbid_when_archived=True)
706 706 config.add_view(
707 707 RepoPullRequestsView,
708 708 attr='pull_request_create',
709 709 route_name='pullrequest_create', request_method='POST',
710 710 renderer=None)
711 711
712 712 config.add_route(
713 713 name='pullrequest_update',
714 714 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/update',
715 715 repo_route=True, repo_forbid_when_archived=True)
716 716 config.add_view(
717 717 RepoPullRequestsView,
718 718 attr='pull_request_update',
719 719 route_name='pullrequest_update', request_method='POST',
720 720 renderer='json_ext')
721 721
722 722 config.add_route(
723 723 name='pullrequest_merge',
724 724 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/merge',
725 725 repo_route=True, repo_forbid_when_archived=True)
726 726 config.add_view(
727 727 RepoPullRequestsView,
728 728 attr='pull_request_merge',
729 729 route_name='pullrequest_merge', request_method='POST',
730 730 renderer='json_ext')
731 731
732 732 config.add_route(
733 733 name='pullrequest_delete',
734 734 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/delete',
735 735 repo_route=True, repo_forbid_when_archived=True)
736 736 config.add_view(
737 737 RepoPullRequestsView,
738 738 attr='pull_request_delete',
739 739 route_name='pullrequest_delete', request_method='POST',
740 740 renderer='json_ext')
741 741
742 742 config.add_route(
743 743 name='pullrequest_comment_create',
744 744 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment',
745 745 repo_route=True)
746 746 config.add_view(
747 747 RepoPullRequestsView,
748 748 attr='pull_request_comment_create',
749 749 route_name='pullrequest_comment_create', request_method='POST',
750 750 renderer='json_ext')
751 751
752 752 config.add_route(
753 753 name='pullrequest_comment_edit',
754 754 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/edit',
755 755 repo_route=True, repo_accepted_types=['hg', 'git'])
756 756 config.add_view(
757 757 RepoPullRequestsView,
758 758 attr='pull_request_comment_edit',
759 759 route_name='pullrequest_comment_edit', request_method='POST',
760 760 renderer='json_ext')
761 761
762 762 config.add_route(
763 763 name='pullrequest_comment_delete',
764 764 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete',
765 765 repo_route=True, repo_accepted_types=['hg', 'git'])
766 766 config.add_view(
767 767 RepoPullRequestsView,
768 768 attr='pull_request_comment_delete',
769 769 route_name='pullrequest_comment_delete', request_method='POST',
770 770 renderer='json_ext')
771 771
772 772 config.add_route(
773 773 name='pullrequest_comments',
774 774 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comments',
775 775 repo_route=True)
776 776 config.add_view(
777 777 RepoPullRequestsView,
778 778 attr='pullrequest_comments',
779 779 route_name='pullrequest_comments', request_method='POST',
780 780 renderer='string_html', xhr=True)
781 781
782 782 config.add_route(
783 783 name='pullrequest_todos',
784 784 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/todos',
785 785 repo_route=True)
786 786 config.add_view(
787 787 RepoPullRequestsView,
788 788 attr='pullrequest_todos',
789 789 route_name='pullrequest_todos', request_method='POST',
790 790 renderer='string_html', xhr=True)
791 791
792 792 config.add_route(
793 793 name='pullrequest_drafts',
794 794 pattern=r'/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/drafts',
795 795 repo_route=True)
796 796 config.add_view(
797 797 RepoPullRequestsView,
798 798 attr='pullrequest_drafts',
799 799 route_name='pullrequest_drafts', request_method='POST',
800 800 renderer='string_html', xhr=True)
801 801
802 802 # Artifacts (EE feature)
803 803 config.add_route(
804 804 name='repo_artifacts_list',
805 805 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
806 806 config.add_view(
807 807 RepoArtifactsView,
808 808 attr='repo_artifacts',
809 809 route_name='repo_artifacts_list', request_method='GET',
810 810 renderer='rhodecode:templates/artifacts/artifact_list.mako')
811 811
812 812 # Settings
813 813 config.add_route(
814 814 name='edit_repo',
815 815 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
816 816 config.add_view(
817 817 RepoSettingsView,
818 818 attr='edit_settings',
819 819 route_name='edit_repo', request_method='GET',
820 820 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
821 821 # update is POST on edit_repo
822 822 config.add_view(
823 823 RepoSettingsView,
824 824 attr='edit_settings_update',
825 825 route_name='edit_repo', request_method='POST',
826 826 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
827 827
828 828 # Settings advanced
829 829 config.add_route(
830 830 name='edit_repo_advanced',
831 831 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
832 832 config.add_view(
833 833 RepoSettingsAdvancedView,
834 834 attr='edit_advanced',
835 835 route_name='edit_repo_advanced', request_method='GET',
836 836 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
837 837
838 838 config.add_route(
839 839 name='edit_repo_advanced_archive',
840 840 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
841 841 config.add_view(
842 842 RepoSettingsAdvancedView,
843 843 attr='edit_advanced_archive',
844 844 route_name='edit_repo_advanced_archive', request_method='POST',
845 845 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
846 846
847 847 config.add_route(
848 848 name='edit_repo_advanced_delete',
849 849 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
850 850 config.add_view(
851 851 RepoSettingsAdvancedView,
852 852 attr='edit_advanced_delete',
853 853 route_name='edit_repo_advanced_delete', request_method='POST',
854 854 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
855 855
856 856 config.add_route(
857 857 name='edit_repo_advanced_locking',
858 858 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
859 859 config.add_view(
860 860 RepoSettingsAdvancedView,
861 861 attr='edit_advanced_toggle_locking',
862 862 route_name='edit_repo_advanced_locking', request_method='POST',
863 863 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
864 864
865 865 config.add_route(
866 866 name='edit_repo_advanced_journal',
867 867 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
868 868 config.add_view(
869 869 RepoSettingsAdvancedView,
870 870 attr='edit_advanced_journal',
871 871 route_name='edit_repo_advanced_journal', request_method='POST',
872 872 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
873 873
874 874 config.add_route(
875 875 name='edit_repo_advanced_fork',
876 876 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
877 877 config.add_view(
878 878 RepoSettingsAdvancedView,
879 879 attr='edit_advanced_fork',
880 880 route_name='edit_repo_advanced_fork', request_method='POST',
881 881 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
882 882
883 883 config.add_route(
884 884 name='edit_repo_advanced_hooks',
885 885 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
886 886 config.add_view(
887 887 RepoSettingsAdvancedView,
888 888 attr='edit_advanced_install_hooks',
889 889 route_name='edit_repo_advanced_hooks', request_method='GET',
890 890 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
891 891
892 892 # Caches
893 893 config.add_route(
894 894 name='edit_repo_caches',
895 895 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
896 896 config.add_view(
897 897 RepoCachesView,
898 898 attr='repo_caches',
899 899 route_name='edit_repo_caches', request_method='GET',
900 900 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
901 901 config.add_view(
902 902 RepoCachesView,
903 903 attr='repo_caches_purge',
904 904 route_name='edit_repo_caches', request_method='POST')
905 905
906 906 # Permissions
907 907 config.add_route(
908 908 name='edit_repo_perms',
909 909 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
910 910 config.add_view(
911 911 RepoSettingsPermissionsView,
912 912 attr='edit_permissions',
913 913 route_name='edit_repo_perms', request_method='GET',
914 914 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
915 915 config.add_view(
916 916 RepoSettingsPermissionsView,
917 917 attr='edit_permissions_update',
918 918 route_name='edit_repo_perms', request_method='POST',
919 919 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
920 920
921 921 config.add_route(
922 922 name='edit_repo_perms_set_private',
923 923 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
924 924 config.add_view(
925 925 RepoSettingsPermissionsView,
926 926 attr='edit_permissions_set_private_repo',
927 927 route_name='edit_repo_perms_set_private', request_method='POST',
928 928 renderer='json_ext')
929 929
930 930 # Permissions Branch (EE feature)
931 931 config.add_route(
932 932 name='edit_repo_perms_branch',
933 933 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
934 934 config.add_view(
935 RepoBranchesView,
936 attr='branches',
935 RepoSettingsBranchPermissionsView,
936 attr='branch_permissions',
937 937 route_name='edit_repo_perms_branch', request_method='GET',
938 938 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
939 939
940 940 config.add_route(
941 941 name='edit_repo_perms_branch_delete',
942 942 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
943 943 repo_route=True)
944 944 # NOTE: delete view only implemented in EE
945 945
946 946 # Maintenance
947 947 config.add_route(
948 948 name='edit_repo_maintenance',
949 949 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
950 950 config.add_view(
951 951 RepoMaintenanceView,
952 952 attr='repo_maintenance',
953 route_name='edit_repo_maintenance_execute', request_method='GET',
954 renderer='json', xhr=True)
953 route_name='edit_repo_maintenance', request_method='GET',
954 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
955 955
956 956 config.add_route(
957 957 name='edit_repo_maintenance_execute',
958 958 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
959 959 config.add_view(
960 960 RepoMaintenanceView,
961 961 attr='repo_maintenance_execute',
962 route_name='edit_repo_maintenance', request_method='GET',
963 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
962 route_name='edit_repo_maintenance_execute', request_method='GET',
963 renderer='json', xhr=True)
964 964
965 965 # Fields
966 966 config.add_route(
967 967 name='edit_repo_fields',
968 968 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
969 969 config.add_view(
970 970 RepoSettingsFieldsView,
971 971 attr='repo_field_edit',
972 972 route_name='edit_repo_fields', request_method='GET',
973 973 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
974 974
975 975 config.add_route(
976 976 name='edit_repo_fields_create',
977 977 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
978 978 config.add_view(
979 979 RepoSettingsFieldsView,
980 980 attr='repo_field_create',
981 981 route_name='edit_repo_fields_create', request_method='POST',
982 982 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
983 983
984 984 config.add_route(
985 985 name='edit_repo_fields_delete',
986 986 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
987 987 config.add_view(
988 988 RepoSettingsFieldsView,
989 989 attr='repo_field_delete',
990 990 route_name='edit_repo_fields_delete', request_method='POST',
991 991 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
992 992
993 993 # Locking
994 994 config.add_route(
995 995 name='repo_edit_toggle_locking',
996 996 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
997 997 config.add_view(
998 998 RepoSettingsView,
999 999 attr='edit_advanced_toggle_locking',
1000 1000 route_name='repo_edit_toggle_locking', request_method='GET',
1001 1001 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1002 1002
1003 1003 # Remote
1004 1004 config.add_route(
1005 1005 name='edit_repo_remote',
1006 1006 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
1007 1007 config.add_view(
1008 1008 RepoSettingsRemoteView,
1009 1009 attr='repo_remote_edit_form',
1010 1010 route_name='edit_repo_remote', request_method='GET',
1011 1011 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1012 1012
1013 1013 config.add_route(
1014 1014 name='edit_repo_remote_pull',
1015 1015 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
1016 1016 config.add_view(
1017 1017 RepoSettingsRemoteView,
1018 1018 attr='repo_remote_pull_changes',
1019 1019 route_name='edit_repo_remote_pull', request_method='POST',
1020 1020 renderer=None)
1021 1021
1022 1022 config.add_route(
1023 1023 name='edit_repo_remote_push',
1024 1024 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
1025 1025
1026 1026 # Statistics
1027 1027 config.add_route(
1028 1028 name='edit_repo_statistics',
1029 1029 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
1030 1030 config.add_view(
1031 1031 RepoSettingsView,
1032 1032 attr='edit_statistics_form',
1033 1033 route_name='edit_repo_statistics', request_method='GET',
1034 1034 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1035 1035
1036 1036 config.add_route(
1037 1037 name='edit_repo_statistics_reset',
1038 1038 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
1039 1039 config.add_view(
1040 1040 RepoSettingsView,
1041 1041 attr='repo_statistics_reset',
1042 1042 route_name='edit_repo_statistics_reset', request_method='POST',
1043 1043 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1044 1044
1045 1045 # Issue trackers
1046 1046 config.add_route(
1047 1047 name='edit_repo_issuetracker',
1048 1048 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers', repo_route=True)
1049 1049 config.add_view(
1050 1050 RepoSettingsIssueTrackersView,
1051 1051 attr='repo_issuetracker',
1052 1052 route_name='edit_repo_issuetracker', request_method='GET',
1053 1053 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1054 1054
1055 1055 config.add_route(
1056 1056 name='edit_repo_issuetracker_test',
1057 1057 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/test', repo_route=True)
1058 1058 config.add_view(
1059 1059 RepoSettingsIssueTrackersView,
1060 1060 attr='repo_issuetracker_test',
1061 1061 route_name='edit_repo_issuetracker_test', request_method='POST',
1062 1062 renderer='string', xhr=True)
1063 1063
1064 1064 config.add_route(
1065 1065 name='edit_repo_issuetracker_delete',
1066 1066 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/delete', repo_route=True)
1067 1067 config.add_view(
1068 1068 RepoSettingsIssueTrackersView,
1069 1069 attr='repo_issuetracker_delete',
1070 1070 route_name='edit_repo_issuetracker_delete', request_method='POST',
1071 1071 renderer='json_ext', xhr=True)
1072 1072
1073 1073 config.add_route(
1074 1074 name='edit_repo_issuetracker_update',
1075 1075 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/update', repo_route=True)
1076 1076 config.add_view(
1077 1077 RepoSettingsIssueTrackersView,
1078 1078 attr='repo_issuetracker_update',
1079 1079 route_name='edit_repo_issuetracker_update', request_method='POST',
1080 1080 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1081 1081
1082 1082 # VCS Settings
1083 1083 config.add_route(
1084 1084 name='edit_repo_vcs',
1085 1085 pattern='/{repo_name:.*?[^/]}/settings/vcs', repo_route=True)
1086 1086 config.add_view(
1087 1087 RepoSettingsVcsView,
1088 1088 attr='repo_vcs_settings',
1089 1089 route_name='edit_repo_vcs', request_method='GET',
1090 1090 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1091 1091
1092 1092 config.add_route(
1093 1093 name='edit_repo_vcs_update',
1094 1094 pattern='/{repo_name:.*?[^/]}/settings/vcs/update', repo_route=True)
1095 1095 config.add_view(
1096 1096 RepoSettingsVcsView,
1097 1097 attr='repo_settings_vcs_update',
1098 1098 route_name='edit_repo_vcs_update', request_method='POST',
1099 1099 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1100 1100
1101 1101 # svn pattern
1102 1102 config.add_route(
1103 1103 name='edit_repo_vcs_svn_pattern_delete',
1104 1104 pattern='/{repo_name:.*?[^/]}/settings/vcs/svn_pattern/delete', repo_route=True)
1105 1105 config.add_view(
1106 1106 RepoSettingsVcsView,
1107 1107 attr='repo_settings_delete_svn_pattern',
1108 1108 route_name='edit_repo_vcs_svn_pattern_delete', request_method='POST',
1109 1109 renderer='json_ext', xhr=True)
1110 1110
1111 1111 # Repo Review Rules (EE feature)
1112 1112 config.add_route(
1113 1113 name='repo_reviewers',
1114 1114 pattern='/{repo_name:.*?[^/]}/settings/review/rules', repo_route=True)
1115 1115 config.add_view(
1116 1116 RepoReviewRulesView,
1117 1117 attr='repo_review_rules',
1118 1118 route_name='repo_reviewers', request_method='GET',
1119 1119 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1120 1120
1121 1121 config.add_route(
1122 1122 name='repo_default_reviewers_data',
1123 1123 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
1124 1124 config.add_view(
1125 1125 RepoReviewRulesView,
1126 1126 attr='repo_default_reviewers_data',
1127 1127 route_name='repo_default_reviewers_data', request_method='GET',
1128 1128 renderer='json_ext')
1129 1129
1130 1130 # Repo Automation (EE feature)
1131 1131 config.add_route(
1132 1132 name='repo_automation',
1133 1133 pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
1134 1134 config.add_view(
1135 1135 RepoAutomationView,
1136 1136 attr='repo_automation',
1137 1137 route_name='repo_automation', request_method='GET',
1138 1138 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1139 1139
1140 1140 # Strip
1141 1141 config.add_route(
1142 1142 name='edit_repo_strip',
1143 1143 pattern='/{repo_name:.*?[^/]}/settings/strip', repo_route=True)
1144 1144 config.add_view(
1145 1145 RepoStripView,
1146 1146 attr='strip',
1147 1147 route_name='edit_repo_strip', request_method='GET',
1148 1148 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1149 1149
1150 1150 config.add_route(
1151 1151 name='strip_check',
1152 1152 pattern='/{repo_name:.*?[^/]}/settings/strip_check', repo_route=True)
1153 1153 config.add_view(
1154 1154 RepoStripView,
1155 1155 attr='strip_check',
1156 1156 route_name='strip_check', request_method='POST',
1157 1157 renderer='json', xhr=True)
1158 1158
1159 1159 config.add_route(
1160 1160 name='strip_execute',
1161 1161 pattern='/{repo_name:.*?[^/]}/settings/strip_execute', repo_route=True)
1162 1162 config.add_view(
1163 1163 RepoStripView,
1164 1164 attr='strip_execute',
1165 1165 route_name='strip_execute', request_method='POST',
1166 1166 renderer='json', xhr=True)
1167 1167
1168 1168 # Audit logs
1169 1169 config.add_route(
1170 1170 name='edit_repo_audit_logs',
1171 1171 pattern='/{repo_name:.*?[^/]}/settings/audit_logs', repo_route=True)
1172 1172 config.add_view(
1173 1173 AuditLogsView,
1174 1174 attr='repo_audit_logs',
1175 1175 route_name='edit_repo_audit_logs', request_method='GET',
1176 1176 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1177 1177
1178 1178 # ATOM/RSS Feed, shouldn't contain slashes for outlook compatibility
1179 1179 config.add_route(
1180 1180 name='rss_feed_home',
1181 1181 pattern='/{repo_name:.*?[^/]}/feed-rss', repo_route=True)
1182 1182 config.add_view(
1183 1183 RepoFeedView,
1184 1184 attr='rss',
1185 1185 route_name='rss_feed_home', request_method='GET', renderer=None)
1186 1186
1187 1187 config.add_route(
1188 1188 name='rss_feed_home_old',
1189 1189 pattern='/{repo_name:.*?[^/]}/feed/rss', repo_route=True)
1190 1190 config.add_view(
1191 1191 RepoFeedView,
1192 1192 attr='rss',
1193 1193 route_name='rss_feed_home_old', request_method='GET', renderer=None)
1194 1194
1195 1195 config.add_route(
1196 1196 name='atom_feed_home',
1197 1197 pattern='/{repo_name:.*?[^/]}/feed-atom', repo_route=True)
1198 1198 config.add_view(
1199 1199 RepoFeedView,
1200 1200 attr='atom',
1201 1201 route_name='atom_feed_home', request_method='GET', renderer=None)
1202 1202
1203 1203 config.add_route(
1204 1204 name='atom_feed_home_old',
1205 1205 pattern='/{repo_name:.*?[^/]}/feed/atom', repo_route=True)
1206 1206 config.add_view(
1207 1207 RepoFeedView,
1208 1208 attr='atom',
1209 1209 route_name='atom_feed_home_old', request_method='GET', renderer=None)
1210 1210
1211 1211 # NOTE(marcink): needs to be at the end for catch-all
1212 1212 add_route_with_slash(
1213 1213 config,
1214 1214 name='repo_summary',
1215 1215 pattern='/{repo_name:.*?[^/]}', repo_route=True)
1216 1216 config.add_view(
1217 1217 RepoSummaryView,
1218 1218 attr='summary',
1219 1219 route_name='repo_summary', request_method='GET',
1220 1220 renderer='rhodecode:templates/summary/summary.mako')
1221 1221
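The NOTE above matters because Pyramid tries routes in registration order and the first pattern that matches wins; the `repo_summary` pattern matches essentially any path, so registering it earlier would shadow every more specific repo route. A minimal standalone illustration of that first-match-wins behavior (plain `re`, not Pyramid's actual matcher):

```python
import re

# Routes are tried in registration order; the catch-all comes last.
routes = [
    ('repo_commits', re.compile(r'^/(?P<repo_name>.*?[^/])/commits$')),
    ('repo_summary', re.compile(r'^/(?P<repo_name>.*?[^/])$')),  # catch-all
]

def match(path):
    for name, rx in routes:
        m = rx.match(path)
        if m:
            return name, m.groupdict()
    return None

print(match('/my/repo/commits'))  # ('repo_commits', {'repo_name': 'my/repo'})
print(match('/my/repo'))          # ('repo_summary', {'repo_name': 'my/repo'})
```

If the two entries were swapped, the catch-all would consume `/my/repo/commits` with `repo_name='my/repo/commits'` and the commits view would never be reached.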
1222 1222 # TODO(marcink): there's no such route??
1223 1223 config.add_view(
1224 1224 RepoSummaryView,
1225 1225 attr='summary',
1226 1226 route_name='repo_summary_slash', request_method='GET',
1227 1227 renderer='rhodecode:templates/summary/summary.mako') No newline at end of file
@@ -1,1070 +1,1092 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22
23 23 import mock
24 24 import pytest
25 25
26 26 from rhodecode.apps.repository.tests.test_repo_compare import ComparePage
27 27 from rhodecode.apps.repository.views.repo_files import RepoFilesView
28 28 from rhodecode.lib import helpers as h
29 29 from rhodecode.lib.compat import OrderedDict
30 30 from rhodecode.lib.ext_json import json
31 31 from rhodecode.lib.vcs import nodes
32 32
33 33 from rhodecode.lib.vcs.conf import settings
34 34 from rhodecode.tests import assert_session_flash
35 35 from rhodecode.tests.fixture import Fixture
36 36 from rhodecode.model.db import Session
37 37
38 38 fixture = Fixture()
39 39
40 40
41 41 def get_node_history(backend_type):
42 42 return {
43 43 'hg': json.loads(fixture.load_resource('hg_node_history_response.json')),
44 44 'git': json.loads(fixture.load_resource('git_node_history_response.json')),
45 45 'svn': json.loads(fixture.load_resource('svn_node_history_response.json')),
46 46 }[backend_type]
47 47
48 48
49 49 def route_path(name, params=None, **kwargs):
50 50 import urllib
51 51
52 52 base_url = {
53 53 'repo_summary': '/{repo_name}',
54 54 'repo_archivefile': '/{repo_name}/archive/{fname}',
55 55 'repo_files_diff': '/{repo_name}/diff/{f_path}',
56 56 'repo_files_diff_2way_redirect': '/{repo_name}/diff-2way/{f_path}',
57 57 'repo_files': '/{repo_name}/files/{commit_id}/{f_path}',
58 58 'repo_files:default_path': '/{repo_name}/files/{commit_id}/',
59 59 'repo_files:default_commit': '/{repo_name}/files',
60 60 'repo_files:rendered': '/{repo_name}/render/{commit_id}/{f_path}',
61 61 'repo_files:annotated': '/{repo_name}/annotate/{commit_id}/{f_path}',
62 62 'repo_files:annotated_previous': '/{repo_name}/annotate-previous/{commit_id}/{f_path}',
63 63 'repo_files_nodelist': '/{repo_name}/nodelist/{commit_id}/{f_path}',
64 64 'repo_file_raw': '/{repo_name}/raw/{commit_id}/{f_path}',
65 65 'repo_file_download': '/{repo_name}/download/{commit_id}/{f_path}',
66 66 'repo_file_history': '/{repo_name}/history/{commit_id}/{f_path}',
67 67 'repo_file_authors': '/{repo_name}/authors/{commit_id}/{f_path}',
68 68 'repo_files_remove_file': '/{repo_name}/remove_file/{commit_id}/{f_path}',
69 69 'repo_files_delete_file': '/{repo_name}/delete_file/{commit_id}/{f_path}',
70 70 'repo_files_edit_file': '/{repo_name}/edit_file/{commit_id}/{f_path}',
71 71 'repo_files_update_file': '/{repo_name}/update_file/{commit_id}/{f_path}',
72 72 'repo_files_add_file': '/{repo_name}/add_file/{commit_id}/{f_path}',
73 73 'repo_files_create_file': '/{repo_name}/create_file/{commit_id}/{f_path}',
74 74 'repo_nodetree_full': '/{repo_name}/nodetree_full/{commit_id}/{f_path}',
75 75 'repo_nodetree_full:default_path': '/{repo_name}/nodetree_full/{commit_id}/',
76 76 }[name].format(**kwargs)
77 77
78 78 if params:
79 79 base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
80 80 return base_url
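The helper above builds test URLs by looking up a template by route name, formatting in the keyword arguments, and appending an urlencoded query string. A self-contained sketch of the same idiom (trimmed to two hypothetical entries, and using the Python 3 `urllib.parse` spelling rather than the Python 2 `urllib.urlencode` the test module itself uses):

```python
import urllib.parse

# Minimal standalone version of the route_path test-helper idiom:
# select a URL template by name, fill its placeholders, add a query string.
def route_path(name, params=None, **kwargs):
    base_url = {
        'repo_files': '/{repo_name}/files/{commit_id}/{f_path}',
        'repo_file_raw': '/{repo_name}/raw/{commit_id}/{f_path}',
    }[name].format(**kwargs)
    if params:
        base_url = '{}?{}'.format(base_url, urllib.parse.urlencode(params))
    return base_url

print(route_path('repo_files', repo_name='my-repo', commit_id='tip',
                 f_path='docs/index.rst', params={'at': 'default'}))
# -> /my-repo/files/tip/docs/index.rst?at=default
```

Keeping the templates in one dict means the tests exercise literal URL shapes instead of depending on the application's route generator, so a route regression shows up as a 404 in the test rather than being silently papered over.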
81 81
82 82
83 83 def assert_files_in_response(response, files, params):
84 84 template = (
85 85 'href="/%(repo_name)s/files/%(commit_id)s/%(name)s"')
86 86 _assert_items_in_response(response, files, template, params)
87 87
88 88
89 89 def assert_dirs_in_response(response, dirs, params):
90 90 template = (
91 91 'href="/%(repo_name)s/files/%(commit_id)s/%(name)s"')
92 92 _assert_items_in_response(response, dirs, template, params)
93 93
94 94
95 95 def _assert_items_in_response(response, items, template, params):
96 96 for item in items:
97 97 item_params = {'name': item}
98 98 item_params.update(params)
99 99 response.mustcontain(template % item_params)
100 100
101 101
102 102 def assert_timeago_in_response(response, items, params):
103 103 for item in items:
104 104 response.mustcontain(h.age_component(params['date']))
105 105
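For illustration only, this expands the href template used by the assertion helpers above ('foo' and 'abc123' are made-up values, not fixtures from this suite):

```python
# How _assert_items_in_response builds the href it expects in the response:
# per-item params are merged over the shared params, then %-formatted.
template = 'href="/%(repo_name)s/files/%(commit_id)s/%(name)s"'
item_params = {'name': 'README.rst'}
item_params.update({'repo_name': 'foo', 'commit_id': 'abc123'})
print(template % item_params)
# -> href="/foo/files/abc123/README.rst"
```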
106 106
107 107 @pytest.mark.usefixtures("app")
108 108 class TestFilesViews(object):
109 109
110 110 def test_show_files(self, backend):
111 111 response = self.app.get(
112 112 route_path('repo_files',
113 113 repo_name=backend.repo_name,
114 114 commit_id='tip', f_path='/'))
115 115 commit = backend.repo.get_commit()
116 116
117 117 params = {
118 118 'repo_name': backend.repo_name,
119 119 'commit_id': commit.raw_id,
120 120 'date': commit.date
121 121 }
122 122 assert_dirs_in_response(response, ['docs', 'vcs'], params)
123 123 files = [
124 124 '.gitignore',
125 125 '.hgignore',
126 126 '.hgtags',
127 127 # TODO: missing in Git
128 128 # '.travis.yml',
129 129 'MANIFEST.in',
130 130 'README.rst',
131 131 # TODO: File is missing in svn repository
132 132 # 'run_test_and_report.sh',
133 133 'setup.cfg',
134 134 'setup.py',
135 135 'test_and_report.sh',
136 136 'tox.ini',
137 137 ]
138 138 assert_files_in_response(response, files, params)
139 139 assert_timeago_in_response(response, files, params)
140 140
141 141 def test_show_files_links_submodules_with_absolute_url(self, backend_hg):
142 142 repo = backend_hg['subrepos']
143 143 response = self.app.get(
144 144 route_path('repo_files',
145 145 repo_name=repo.repo_name,
146 146 commit_id='tip', f_path='/'))
147 147 assert_response = response.assert_response()
148 148 assert_response.contains_one_link(
149 149 'absolute-path @ 000000000000', 'http://example.com/absolute-path')
150 150
151 151 def test_show_files_links_submodules_with_absolute_url_subpaths(
152 152 self, backend_hg):
153 153 repo = backend_hg['subrepos']
154 154 response = self.app.get(
155 155 route_path('repo_files',
156 156 repo_name=repo.repo_name,
157 157 commit_id='tip', f_path='/'))
158 158 assert_response = response.assert_response()
159 159 assert_response.contains_one_link(
160 160 'subpaths-path @ 000000000000',
161 161 'http://sub-base.example.com/subpaths-path')
162 162
163 163 @pytest.mark.xfail_backends("svn", reason="Depends on branch support")
164 164 def test_files_menu(self, backend):
165 165 new_branch = "temp_branch_name"
166 166 commits = [
167 167 {'message': 'a'},
168 168 {'message': 'b', 'branch': new_branch}
169 169 ]
170 170 backend.create_repo(commits)
171 171 backend.repo.landing_rev = "branch:%s" % new_branch
172 172 Session().commit()
173 173
174 174 # get response based on tip and not the new commit
175 175 response = self.app.get(
176 176 route_path('repo_files',
177 177 repo_name=backend.repo_name,
178 178 commit_id='tip', f_path='/'))
179 179
180 180 # make sure the Files menu url points at the new commit, not at tip
181 181 landing_rev = backend.repo.landing_ref_name
182 182 files_url = route_path('repo_files:default_path',
183 183 repo_name=backend.repo_name,
184 184 commit_id=landing_rev, params={'at': landing_rev})
185 185
186 186 assert landing_rev != 'tip'
187 187 response.mustcontain(
188 188 '<li class="active"><a class="menulink" href="%s">' % files_url)
189 189
190 190 def test_show_files_commit(self, backend):
191 191 commit = backend.repo.get_commit(commit_idx=32)
192 192
193 193 response = self.app.get(
194 194 route_path('repo_files',
195 195 repo_name=backend.repo_name,
196 196 commit_id=commit.raw_id, f_path='/'))
197 197
198 198 dirs = ['docs', 'tests']
199 199 files = ['README.rst']
200 200 params = {
201 201 'repo_name': backend.repo_name,
202 202 'commit_id': commit.raw_id,
203 203 }
204 204 assert_dirs_in_response(response, dirs, params)
205 205 assert_files_in_response(response, files, params)
206 206
207 207 def test_show_files_different_branch(self, backend):
208 208 branches = dict(
209 209 hg=(150, ['git']),
210 210 # TODO: Git test repository does not contain other branches
211 211 git=(633, ['master']),
212 212 # TODO: Branch support in Subversion
213 213 svn=(150, [])
214 214 )
215 215 idx, branches = branches[backend.alias]
216 216 commit = backend.repo.get_commit(commit_idx=idx)
217 217 response = self.app.get(
218 218 route_path('repo_files',
219 219 repo_name=backend.repo_name,
220 220 commit_id=commit.raw_id, f_path='/'))
221 221
222 222 assert_response = response.assert_response()
223 223 for branch in branches:
224 224 assert_response.element_contains('.tags .branchtag', branch)
225 225
226 226 def test_show_files_paging(self, backend):
227 227 repo = backend.repo
228 228 indexes = [73, 92, 109, 1, 0]
229 229 idx_map = [(rev, repo.get_commit(commit_idx=rev).raw_id)
230 230 for rev in indexes]
231 231
232 232 for idx in idx_map:
233 233 response = self.app.get(
234 234 route_path('repo_files',
235 235 repo_name=backend.repo_name,
236 236 commit_id=idx[1], f_path='/'))
237 237
238 238 response.mustcontain("""r%s:%s""" % (idx[0], idx[1][:8]))
239 239
240 240 def test_file_source(self, backend):
241 241 commit = backend.repo.get_commit(commit_idx=167)
242 242 response = self.app.get(
243 243 route_path('repo_files',
244 244 repo_name=backend.repo_name,
245 245 commit_id=commit.raw_id, f_path='vcs/nodes.py'))
246 246
247 247 msgbox = """<div class="commit">%s</div>"""
248 248 response.mustcontain(msgbox % (commit.message, ))
249 249
250 250 assert_response = response.assert_response()
251 251 if commit.branch:
252 252 assert_response.element_contains(
253 253 '.tags.tags-main .branchtag', commit.branch)
254 254 if commit.tags:
255 255 for tag in commit.tags:
256 256 assert_response.element_contains('.tags.tags-main .tagtag', tag)
257 257
258 258 def test_file_source_annotated(self, backend):
259 259 response = self.app.get(
260 260 route_path('repo_files:annotated',
261 261 repo_name=backend.repo_name,
262 262 commit_id='tip', f_path='vcs/nodes.py'))
263 263 expected_commits = {
264 264 'hg': 'r356',
265 265 'git': 'r345',
266 266 'svn': 'r208',
267 267 }
268 268 response.mustcontain(expected_commits[backend.alias])
269 269
270 270 def test_file_source_authors(self, backend):
271 271 response = self.app.get(
272 272 route_path('repo_file_authors',
273 273 repo_name=backend.repo_name,
274 274 commit_id='tip', f_path='vcs/nodes.py'))
275 275 expected_authors = {
276 276 'hg': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
277 277 'git': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
278 278 'svn': ('marcin', 'lukasz'),
279 279 }
280 280
281 281 for author in expected_authors[backend.alias]:
282 282 response.mustcontain(author)
283 283
284 284 def test_file_source_authors_with_annotation(self, backend):
285 285 response = self.app.get(
286 286 route_path('repo_file_authors',
287 287 repo_name=backend.repo_name,
288 288 commit_id='tip', f_path='vcs/nodes.py',
289 289 params=dict(annotate=1)))
290 290 expected_authors = {
291 291 'hg': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
292 292 'git': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
293 293 'svn': ('marcin', 'lukasz'),
294 294 }
295 295
296 296 for author in expected_authors[backend.alias]:
297 297 response.mustcontain(author)
298 298
299 299 def test_file_source_history(self, backend, xhr_header):
300 300 response = self.app.get(
301 301 route_path('repo_file_history',
302 302 repo_name=backend.repo_name,
303 303 commit_id='tip', f_path='vcs/nodes.py'),
304 304 extra_environ=xhr_header)
305 305 assert get_node_history(backend.alias) == json.loads(response.body)
306 306
307 307 def test_file_source_history_svn(self, backend_svn, xhr_header):
308 308 simple_repo = backend_svn['svn-simple-layout']
309 309 response = self.app.get(
310 310 route_path('repo_file_history',
311 311 repo_name=simple_repo.repo_name,
312 312 commit_id='tip', f_path='trunk/example.py'),
313 313 extra_environ=xhr_header)
314 314
315 315 expected_data = json.loads(
316 316 fixture.load_resource('svn_node_history_branches.json'))
317 317
318 318 assert expected_data == response.json
319 319
320 320 def test_file_source_history_with_annotation(self, backend, xhr_header):
321 321 response = self.app.get(
322 322 route_path('repo_file_history',
323 323 repo_name=backend.repo_name,
324 324 commit_id='tip', f_path='vcs/nodes.py',
325 325 params=dict(annotate=1)),
326 326
327 327 extra_environ=xhr_header)
328 328 assert get_node_history(backend.alias) == json.loads(response.body)
329 329
330 330 def test_tree_search_top_level(self, backend, xhr_header):
331 331 commit = backend.repo.get_commit(commit_idx=173)
332 332 response = self.app.get(
333 333 route_path('repo_files_nodelist',
334 334 repo_name=backend.repo_name,
335 335 commit_id=commit.raw_id, f_path='/'),
336 336 extra_environ=xhr_header)
337 337 assert 'nodes' in response.json
338 338 assert {'name': 'docs', 'type': 'dir'} in response.json['nodes']
339 339
340 340 def test_tree_search_missing_xhr(self, backend):
341 341 self.app.get(
342 342 route_path('repo_files_nodelist',
343 343 repo_name=backend.repo_name,
344 344 commit_id='tip', f_path='/'),
345 345 status=404)
346 346
347 347 def test_tree_search_at_path(self, backend, xhr_header):
348 348 commit = backend.repo.get_commit(commit_idx=173)
349 349 response = self.app.get(
350 350 route_path('repo_files_nodelist',
351 351 repo_name=backend.repo_name,
352 352 commit_id=commit.raw_id, f_path='/docs'),
353 353 extra_environ=xhr_header)
354 354 assert 'nodes' in response.json
355 355 nodes = response.json['nodes']
356 356 assert {'name': 'docs/api', 'type': 'dir'} in nodes
357 357 assert {'name': 'docs/index.rst', 'type': 'file'} in nodes
358 358
359 359 def test_tree_search_at_path_2nd_level(self, backend, xhr_header):
360 360 commit = backend.repo.get_commit(commit_idx=173)
361 361 response = self.app.get(
362 362 route_path('repo_files_nodelist',
363 363 repo_name=backend.repo_name,
364 364 commit_id=commit.raw_id, f_path='/docs/api'),
365 365 extra_environ=xhr_header)
366 366 assert 'nodes' in response.json
367 367 nodes = response.json['nodes']
368 368 assert {'name': 'docs/api/index.rst', 'type': 'file'} in nodes
369 369
370 370 def test_tree_search_at_path_missing_xhr(self, backend):
371 371 self.app.get(
372 372 route_path('repo_files_nodelist',
373 373 repo_name=backend.repo_name,
374 374 commit_id='tip', f_path='/docs'),
375 375 status=404)
376 376
377 377 def test_nodetree(self, backend, xhr_header):
378 378 commit = backend.repo.get_commit(commit_idx=173)
379 379 response = self.app.get(
380 380 route_path('repo_nodetree_full',
381 381 repo_name=backend.repo_name,
382 382 commit_id=commit.raw_id, f_path='/'),
383 383 extra_environ=xhr_header)
384 384
385 385 assert_response = response.assert_response()
386 386
387 387 for attr in ['data-commit-id', 'data-date', 'data-author']:
388 388 elements = assert_response.get_elements('[{}]'.format(attr))
389 389 assert len(elements) > 1
390 390
391 391 for element in elements:
392 392 assert element.get(attr)
393 393
394 394 def test_nodetree_if_file(self, backend, xhr_header):
395 395 commit = backend.repo.get_commit(commit_idx=173)
396 396 response = self.app.get(
397 397 route_path('repo_nodetree_full',
398 398 repo_name=backend.repo_name,
399 399 commit_id=commit.raw_id, f_path='README.rst'),
400 400 extra_environ=xhr_header)
401 401 assert response.body == ''
402 402
403 403 def test_nodetree_wrong_path(self, backend, xhr_header):
404 404 commit = backend.repo.get_commit(commit_idx=173)
405 405 response = self.app.get(
406 406 route_path('repo_nodetree_full',
407 407 repo_name=backend.repo_name,
408 408 commit_id=commit.raw_id, f_path='/dont-exist'),
409 409 extra_environ=xhr_header)
410 410
411 411 err = 'error: There is no file nor ' \
412 412 'directory at the given path'
413 413 assert err in response.body
414 414
415 415 def test_nodetree_missing_xhr(self, backend):
416 416 self.app.get(
417 417 route_path('repo_nodetree_full',
418 418 repo_name=backend.repo_name,
419 419 commit_id='tip', f_path='/'),
420 420 status=404)
421 421
422 422
423 423 @pytest.mark.usefixtures("app", "autologin_user")
424 424 class TestRawFileHandling(object):
425 425
426 426 def test_download_file(self, backend):
427 427 commit = backend.repo.get_commit(commit_idx=173)
428 428 response = self.app.get(
429 429 route_path('repo_file_download',
430 430 repo_name=backend.repo_name,
431 431 commit_id=commit.raw_id, f_path='vcs/nodes.py'),)
432 432
433 433 assert response.content_disposition == 'attachment; filename="nodes.py"; filename*=UTF-8\'\'nodes.py'
434 434 assert response.content_type == "text/x-python"
435 435
436 436 def test_download_file_wrong_cs(self, backend):
437 437 raw_id = u'ERRORce30c96924232dffcd24178a07ffeb5dfc'
438 438
439 439 response = self.app.get(
440 440 route_path('repo_file_download',
441 441 repo_name=backend.repo_name,
442 442 commit_id=raw_id, f_path='vcs/nodes.svg'),
443 443 status=404)
444 444
445 445 msg = """No such commit exists for this repository"""
446 446 response.mustcontain(msg)
447 447
448 448 def test_download_file_wrong_f_path(self, backend):
449 449 commit = backend.repo.get_commit(commit_idx=173)
450 450 f_path = 'vcs/ERRORnodes.py'
451 451
452 452 response = self.app.get(
453 453 route_path('repo_file_download',
454 454 repo_name=backend.repo_name,
455 455 commit_id=commit.raw_id, f_path=f_path),
456 456 status=404)
457 457
458 458 msg = (
459 459 "There is no file nor directory at the given path: "
460 460 "`%s` at commit %s" % (f_path, commit.short_id))
461 461 response.mustcontain(msg)
462 462
463 463 def test_file_raw(self, backend):
464 464 commit = backend.repo.get_commit(commit_idx=173)
465 465 response = self.app.get(
466 466 route_path('repo_file_raw',
467 467 repo_name=backend.repo_name,
468 468 commit_id=commit.raw_id, f_path='vcs/nodes.py'),)
469 469
470 470 assert response.content_type == "text/plain"
471 471
472 472 def test_file_raw_binary(self, backend):
473 473 commit = backend.repo.get_commit()
474 474 response = self.app.get(
475 475 route_path('repo_file_raw',
476 476 repo_name=backend.repo_name,
477 477 commit_id=commit.raw_id,
478 478 f_path='docs/theme/ADC/static/breadcrumb_background.png'),)
479 479
480 480 assert response.content_disposition == 'inline'
481 481
482 482 def test_raw_file_wrong_cs(self, backend):
483 483 raw_id = u'ERRORcce30c96924232dffcd24178a07ffeb5dfc'
484 484
485 485 response = self.app.get(
486 486 route_path('repo_file_raw',
487 487 repo_name=backend.repo_name,
488 488 commit_id=raw_id, f_path='vcs/nodes.svg'),
489 489 status=404)
490 490
491 491 msg = """No such commit exists for this repository"""
492 492 response.mustcontain(msg)
493 493
494 494 def test_raw_wrong_f_path(self, backend):
495 495 commit = backend.repo.get_commit(commit_idx=173)
496 496 f_path = 'vcs/ERRORnodes.py'
497 497 response = self.app.get(
498 498 route_path('repo_file_raw',
499 499 repo_name=backend.repo_name,
500 500 commit_id=commit.raw_id, f_path=f_path),
501 501 status=404)
502 502
503 503 msg = (
504 504 "There is no file nor directory at the given path: "
505 505 "`%s` at commit %s" % (f_path, commit.short_id))
506 506 response.mustcontain(msg)
507 507
508 508 def test_raw_svg_should_not_be_rendered(self, backend):
509 509 backend.create_repo()
510 510 backend.ensure_file("xss.svg")
511 511 response = self.app.get(
512 512 route_path('repo_file_raw',
513 513 repo_name=backend.repo_name,
514 514 commit_id='tip', f_path='xss.svg'),)
515 515 # If the content type were image/svg+xml, the browser would be allowed
516 516 # to render HTML and malicious SVG.
517 517 assert response.content_type == "text/plain"
518 518
519 519
520 520 @pytest.mark.usefixtures("app")
521 521 class TestRepositoryArchival(object):
522 522
523 523 def test_archival(self, backend):
524 524 backend.enable_downloads()
525 525 commit = backend.repo.get_commit(commit_idx=173)
526 526 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
527 527
528 528 short = commit.short_id + extension
529 529 fname = commit.raw_id + extension
530 530 filename = '%s-%s' % (backend.repo_name, short)
531 531 response = self.app.get(
532 532 route_path('repo_archivefile',
533 533 repo_name=backend.repo_name,
534 534 fname=fname))
535 535
536 536 assert response.status == '200 OK'
537 537 headers = [
538 538 ('Content-Disposition', 'attachment; filename=%s' % filename),
539 539 ('Content-Type', '%s' % content_type),
540 540 ]
541 541
542 542 for header in headers:
543 543 assert header in response.headers.items()
544 544
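A hypothetical helper, for illustration only, condensing how the archival tests above derive the requested `fname` and the expected download `filename` (with `with_hash=0` the tests expect a `plain` marker instead of the short commit hash):

```python
# Sketch of the naming scheme exercised by the archival tests: the client
# requests <raw_id><ext>; the served attachment is <repo_name>-<short><ext>,
# where <short> is the short commit id, or 'plain' when hashing is disabled.
def archive_names(repo_name, raw_id, short_id, extension, with_hash=True):
    short = (short_id if with_hash else 'plain') + extension
    fname = raw_id + extension               # what the client requests
    filename = '%s-%s' % (repo_name, short)  # expected Content-Disposition name
    return fname, filename

print(archive_names('repo', 'd' * 40, 'dddddddd', '.zip'))
```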
545 def test_archival_no_hash(self, backend):
546 backend.enable_downloads()
547 commit = backend.repo.get_commit(commit_idx=173)
548 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
549
550 short = 'plain' + extension
551 fname = commit.raw_id + extension
552 filename = '%s-%s' % (backend.repo_name, short)
553 response = self.app.get(
554 route_path('repo_archivefile',
555 repo_name=backend.repo_name,
556 fname=fname, params={'with_hash': 0}))
557
558 assert response.status == '200 OK'
559 headers = [
560 ('Content-Disposition', 'attachment; filename=%s' % filename),
561 ('Content-Type', '%s' % content_type),
562 ]
563
564 for header in headers:
565 assert header in response.headers.items()
566
545 567 @pytest.mark.parametrize('arch_ext', [
546 568 'tar', 'rar', 'x', '..ax', '.zipz', 'tar.gz.tar'])
547 569 def test_archival_wrong_ext(self, backend, arch_ext):
548 570 backend.enable_downloads()
549 571 commit = backend.repo.get_commit(commit_idx=173)
550 572
551 573 fname = commit.raw_id + '.' + arch_ext
552 574
553 575 response = self.app.get(
554 576 route_path('repo_archivefile',
555 577 repo_name=backend.repo_name,
556 578 fname=fname))
557 579 response.mustcontain(
558 580 'Unknown archive type for: `{}`'.format(fname))
559 581
560 582 @pytest.mark.parametrize('commit_id', [
561 583 '00x000000', 'tar', 'wrong', '@$@$42413232', '232dffcd'])
562 584 def test_archival_wrong_commit_id(self, backend, commit_id):
563 585 backend.enable_downloads()
564 586 fname = '%s.zip' % commit_id
565 587
566 588 response = self.app.get(
567 589 route_path('repo_archivefile',
568 590 repo_name=backend.repo_name,
569 591 fname=fname))
570 592 response.mustcontain('Unknown commit_id')
571 593
572 594
573 595 @pytest.mark.usefixtures("app")
574 596 class TestFilesDiff(object):
575 597
576 598 @pytest.mark.parametrize("diff", ['diff', 'download', 'raw'])
577 599 def test_file_full_diff(self, backend, diff):
578 600 commit1 = backend.repo.get_commit(commit_idx=-1)
579 601 commit2 = backend.repo.get_commit(commit_idx=-2)
580 602
581 603 response = self.app.get(
582 604 route_path('repo_files_diff',
583 605 repo_name=backend.repo_name,
584 606 f_path='README'),
585 607 params={
586 608 'diff1': commit2.raw_id,
587 609 'diff2': commit1.raw_id,
588 610 'fulldiff': '1',
589 611 'diff': diff,
590 612 })
591 613
592 614 if diff == 'diff':
593 615 # follow the redirect, since this OLD view redirects to the compare page
594 616 response = response.follow()
595 617
596 618 # It's a symlink to README.rst
597 619 response.mustcontain('README.rst')
598 620 response.mustcontain('No newline at end of file')
599 621
600 622 def test_file_binary_diff(self, backend):
601 623 commits = [
602 624 {'message': 'First commit'},
603 625 {'message': 'Commit with binary',
604 626 'added': [nodes.FileNode('file.bin', content='\0BINARY\0')]},
605 627 ]
606 628 repo = backend.create_repo(commits=commits)
607 629
608 630 response = self.app.get(
609 631 route_path('repo_files_diff',
610 632 repo_name=backend.repo_name,
611 633 f_path='file.bin'),
612 634 params={
613 635 'diff1': repo.get_commit(commit_idx=0).raw_id,
614 636 'diff2': repo.get_commit(commit_idx=1).raw_id,
615 637 'fulldiff': '1',
616 638 'diff': 'diff',
617 639 })
618 640 # follow the redirect, since this OLD view redirects to the compare page
619 641 response = response.follow()
620 642 response.mustcontain('Collapse 1 commit')
621 643 file_changes = (1, 0, 0)
622 644
623 645 compare_page = ComparePage(response)
624 646 compare_page.contains_change_summary(*file_changes)
625 647
626 648 if backend.alias == 'svn':
627 649 response.mustcontain('new file 10644')
628 650 # TODO(marcink): SVN doesn't yet detect binary changes
629 651 else:
630 652 response.mustcontain('new file 100644')
631 653 response.mustcontain('binary diff hidden')
632 654
633 655 def test_diff_2way(self, backend):
634 656 commit1 = backend.repo.get_commit(commit_idx=-1)
635 657 commit2 = backend.repo.get_commit(commit_idx=-2)
636 658 response = self.app.get(
637 659 route_path('repo_files_diff_2way_redirect',
638 660 repo_name=backend.repo_name,
639 661 f_path='README'),
640 662 params={
641 663 'diff1': commit2.raw_id,
642 664 'diff2': commit1.raw_id,
643 665 })
644 666 # follow the redirect, since this OLD view redirects to the compare page
645 667 response = response.follow()
646 668
647 669 # It's a symlink to README.rst
648 670 response.mustcontain('README.rst')
649 671 response.mustcontain('No newline at end of file')
650 672
651 673 def test_requires_one_commit_id(self, backend, autologin_user):
652 674 response = self.app.get(
653 675 route_path('repo_files_diff',
654 676 repo_name=backend.repo_name,
655 677 f_path='README.rst'),
656 678 status=400)
657 679 response.mustcontain(
658 680 'Need query parameter', 'diff1', 'diff2', 'to generate a diff.')
659 681
660 682 def test_returns_no_files_if_file_does_not_exist(self, vcsbackend):
661 683 repo = vcsbackend.repo
662 684 response = self.app.get(
663 685 route_path('repo_files_diff',
664 686 repo_name=repo.name,
665 687 f_path='does-not-exist-in-any-commit'),
666 688 params={
667 689 'diff1': repo[0].raw_id,
668 690 'diff2': repo[1].raw_id
669 691 })
670 692
671 693 response = response.follow()
672 694 response.mustcontain('No files')
673 695
674 696 def test_returns_redirect_if_file_not_changed(self, backend):
675 697 commit = backend.repo.get_commit(commit_idx=-1)
676 698 response = self.app.get(
677 699 route_path('repo_files_diff_2way_redirect',
678 700 repo_name=backend.repo_name,
679 701 f_path='README'),
680 702 params={
681 703 'diff1': commit.raw_id,
682 704 'diff2': commit.raw_id,
683 705 })
684 706
685 707 response = response.follow()
686 708 response.mustcontain('No files')
687 709 response.mustcontain('No commits in this compare')
688 710
689 711 def test_supports_diff_to_different_path_svn(self, backend_svn):
690 712 # TODO: check this case
691 713 pytest.skip('TODO: check this case')
692 714
693 715 repo = backend_svn['svn-simple-layout'].scm_instance()
694 716 commit_id_1 = '24'
695 717 commit_id_2 = '26'
696 718
697 719 response = self.app.get(
698 720 route_path('repo_files_diff',
699 721 repo_name=backend_svn.repo_name,
700 722 f_path='trunk/example.py'),
701 723 params={
702 724 'diff1': 'tags/v0.2/example.py@' + commit_id_1,
703 725 'diff2': commit_id_2,
704 726 })
705 727
706 728 response = response.follow()
707 729 response.mustcontain(
708 730 # diff contains this
709 731 "Will print out a useful message on invocation.")
710 732
711 733 # Note: expecting that we indicate to the user what's being compared
712 734 response.mustcontain("trunk/example.py")
713 735 response.mustcontain("tags/v0.2/example.py")
714 736
715 737 def test_show_rev_redirects_to_svn_path(self, backend_svn):
716 738 # TODO: check this case
717 739 pytest.skip('TODO: check this case')
718 740
719 741 repo = backend_svn['svn-simple-layout'].scm_instance()
720 742 commit_id = repo[-1].raw_id
721 743
722 744 response = self.app.get(
723 745 route_path('repo_files_diff',
724 746 repo_name=backend_svn.repo_name,
725 747 f_path='trunk/example.py'),
726 748 params={
727 749 'diff1': 'branches/argparse/example.py@' + commit_id,
728 750 'diff2': commit_id,
729 751 },
730 752 status=302)
731 753 response = response.follow()
732 754 assert response.headers['Location'].endswith(
733 755 'svn-svn-simple-layout/files/26/branches/argparse/example.py')
734 756
735 757 def test_show_rev_and_annotate_redirects_to_svn_path(self, backend_svn):
736 758 # TODO: check this case
737 759 pytest.skip('TODO: check this case')
738 760
739 761 repo = backend_svn['svn-simple-layout'].scm_instance()
740 762 commit_id = repo[-1].raw_id
741 763 response = self.app.get(
742 764 route_path('repo_files_diff',
743 765 repo_name=backend_svn.repo_name,
744 766 f_path='trunk/example.py'),
745 767 params={
746 768 'diff1': 'branches/argparse/example.py@' + commit_id,
747 769 'diff2': commit_id,
748 770 'show_rev': 'Show at Revision',
749 771 'annotate': 'true',
750 772 },
751 773 status=302)
752 774 response = response.follow()
753 775 assert response.headers['Location'].endswith(
754 776 'svn-svn-simple-layout/annotate/26/branches/argparse/example.py')
755 777
756 778
757 779 @pytest.mark.usefixtures("app", "autologin_user")
758 780 class TestModifyFilesWithWebInterface(object):
759 781
760 782 def test_add_file_view(self, backend):
761 783 self.app.get(
762 784 route_path('repo_files_add_file',
763 785 repo_name=backend.repo_name,
764 786 commit_id='tip', f_path='/')
765 787 )
766 788
767 789 @pytest.mark.xfail_backends("svn", reason="Depends on online editing")
768 790 def test_add_file_into_repo_missing_content(self, backend, csrf_token):
769 791 backend.create_repo()
770 792 filename = 'init.py'
771 793 response = self.app.post(
772 794 route_path('repo_files_create_file',
773 795 repo_name=backend.repo_name,
774 796 commit_id='tip', f_path='/'),
775 797 params={
776 798 'content': "",
777 799 'filename': filename,
778 800 'csrf_token': csrf_token,
779 801 },
780 802 status=302)
781 803 expected_msg = 'Successfully committed new file `{}`'.format(filename)
782 804 assert_session_flash(response, expected_msg)
783 805
784 806 def test_add_file_into_repo_missing_filename(self, backend, csrf_token):
785 807 commit_id = backend.repo.get_commit().raw_id
786 808 response = self.app.post(
787 809 route_path('repo_files_create_file',
788 810 repo_name=backend.repo_name,
789 811 commit_id=commit_id, f_path='/'),
790 812 params={
791 813 'content': "foo",
792 814 'csrf_token': csrf_token,
793 815 },
794 816 status=302)
795 817
796 818 assert_session_flash(response, 'No filename specified')
797 819
798 820 def test_add_file_into_repo_errors_and_no_commits(
799 821 self, backend, csrf_token):
800 822 repo = backend.create_repo()
801 823 # Create a file with no filename; it will display an error, and
802 824 # the repo still has no commits
803 825 response = self.app.post(
804 826 route_path('repo_files_create_file',
805 827 repo_name=repo.repo_name,
806 828 commit_id='tip', f_path='/'),
807 829 params={
808 830 'content': "foo",
809 831 'csrf_token': csrf_token,
810 832 },
811 833 status=302)
812 834
813 835 assert_session_flash(response, 'No filename specified')
814 836
815 837 # Not allowed, redirect to the summary
816 838 redirected = response.follow()
817 839 summary_url = h.route_path('repo_summary', repo_name=repo.repo_name)
818 840
819 841 # As there are no commits, the summary page is displayed with the
820 842 # error about creating a file with no filename
821 843
822 844 assert redirected.request.path == summary_url
823 845
824 846 @pytest.mark.parametrize("filename, clean_filename", [
825 847 ('/abs/foo', 'abs/foo'),
826 848 ('../rel/foo', 'rel/foo'),
827 849 ('file/../foo/foo', 'file/foo/foo'),
828 850 ])
829 851 def test_add_file_into_repo_bad_filenames(self, filename, clean_filename, backend, csrf_token):
830 852 repo = backend.create_repo()
831 853 commit_id = repo.get_commit().raw_id
832 854
833 855 response = self.app.post(
834 856 route_path('repo_files_create_file',
835 857 repo_name=repo.repo_name,
836 858 commit_id=commit_id, f_path='/'),
837 859 params={
838 860 'content': "foo",
839 861 'filename': filename,
840 862 'csrf_token': csrf_token,
841 863 },
842 864 status=302)
843 865
844 866 expected_msg = 'Successfully committed new file `{}`'.format(clean_filename)
845 867 assert_session_flash(response, expected_msg)
846 868
847 869 @pytest.mark.parametrize("cnt, filename, content", [
848 870 (1, 'foo.txt', "Content"),
849 871 (2, 'dir/foo.rst', "Content"),
850 872 (3, 'dir/foo-second.rst', "Content"),
851 873 (4, 'rel/dir/foo.bar', "Content"),
852 874 ])
853 875 def test_add_file_into_empty_repo(self, cnt, filename, content, backend, csrf_token):
854 876 repo = backend.create_repo()
855 877 commit_id = repo.get_commit().raw_id
856 878 response = self.app.post(
857 879 route_path('repo_files_create_file',
858 880 repo_name=repo.repo_name,
859 881 commit_id=commit_id, f_path='/'),
860 882 params={
861 883 'content': content,
862 884 'filename': filename,
863 885 'csrf_token': csrf_token,
864 886 },
865 887 status=302)
866 888
867 889 expected_msg = 'Successfully committed new file `{}`'.format(filename)
868 890 assert_session_flash(response, expected_msg)
869 891
870 892 def test_edit_file_view(self, backend):
871 893 response = self.app.get(
872 894 route_path('repo_files_edit_file',
873 895 repo_name=backend.repo_name,
874 896 commit_id=backend.default_head_id,
875 897 f_path='vcs/nodes.py'),
876 898 status=200)
877 899 response.mustcontain("Module holding everything related to vcs nodes.")
878 900
879 901 def test_edit_file_view_not_on_branch(self, backend):
880 902 repo = backend.create_repo()
881 903 backend.ensure_file("vcs/nodes.py")
882 904
883 905 response = self.app.get(
884 906 route_path('repo_files_edit_file',
885 907 repo_name=repo.repo_name,
886 908 commit_id='tip',
887 909 f_path='vcs/nodes.py'),
888 910 status=302)
889 911 assert_session_flash(
890 912 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
891 913
892 914 def test_edit_file_view_commit_changes(self, backend, csrf_token):
893 915 repo = backend.create_repo()
894 916 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
895 917
896 918 response = self.app.post(
897 919 route_path('repo_files_update_file',
898 920 repo_name=repo.repo_name,
899 921 commit_id=backend.default_head_id,
900 922 f_path='vcs/nodes.py'),
901 923 params={
902 924 'content': "print 'hello world'",
903 925 'message': 'I committed',
904 926 'filename': "vcs/nodes.py",
905 927 'csrf_token': csrf_token,
906 928 },
907 929 status=302)
908 930 assert_session_flash(
909 931 response, 'Successfully committed changes to file `vcs/nodes.py`')
910 932 tip = repo.get_commit(commit_idx=-1)
911 933 assert tip.message == 'I committed'
912 934
913 935 def test_edit_file_view_commit_changes_default_message(self, backend,
914 936 csrf_token):
915 937 repo = backend.create_repo()
916 938 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
917 939
918 940 commit_id = (
919 941 backend.default_branch_name or
920 942 backend.repo.scm_instance().commit_ids[-1])
921 943
922 944 response = self.app.post(
923 945 route_path('repo_files_update_file',
924 946 repo_name=repo.repo_name,
925 947 commit_id=commit_id,
926 948 f_path='vcs/nodes.py'),
927 949 params={
928 950 'content': "print 'hello world'",
929 951 'message': '',
930 952 'filename': "vcs/nodes.py",
931 953 'csrf_token': csrf_token,
932 954 },
933 955 status=302)
934 956 assert_session_flash(
935 957 response, 'Successfully committed changes to file `vcs/nodes.py`')
936 958 tip = repo.get_commit(commit_idx=-1)
937 959 assert tip.message == 'Edited file vcs/nodes.py via RhodeCode Enterprise'
938 960
939 961 def test_delete_file_view(self, backend):
940 962 self.app.get(
941 963 route_path('repo_files_remove_file',
942 964 repo_name=backend.repo_name,
943 965 commit_id=backend.default_head_id,
944 966 f_path='vcs/nodes.py'),
945 967 status=200)
946 968
947 969 def test_delete_file_view_not_on_branch(self, backend):
948 970 repo = backend.create_repo()
949 971 backend.ensure_file('vcs/nodes.py')
950 972
951 973 response = self.app.get(
952 974 route_path('repo_files_remove_file',
953 975 repo_name=repo.repo_name,
954 976 commit_id='tip',
955 977 f_path='vcs/nodes.py'),
956 978 status=302)
957 979 assert_session_flash(
958 980 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
959 981
960 982 def test_delete_file_view_commit_changes(self, backend, csrf_token):
961 983 repo = backend.create_repo()
962 984 backend.ensure_file("vcs/nodes.py")
963 985
964 986 response = self.app.post(
965 987 route_path('repo_files_delete_file',
966 988 repo_name=repo.repo_name,
967 989 commit_id=backend.default_head_id,
968 990 f_path='vcs/nodes.py'),
969 991 params={
970 992                 'message': 'I committed',
971 993 'csrf_token': csrf_token,
972 994 },
973 995 status=302)
974 996 assert_session_flash(
975 997 response, 'Successfully deleted file `vcs/nodes.py`')
976 998
977 999
978 1000 @pytest.mark.usefixtures("app")
979 1001 class TestFilesViewOtherCases(object):
980 1002
981 1003 def test_access_empty_repo_redirect_to_summary_with_alert_write_perms(
982 1004 self, backend_stub, autologin_regular_user, user_regular,
983 1005 user_util):
984 1006
985 1007 repo = backend_stub.create_repo()
986 1008 user_util.grant_user_permission_to_repo(
987 1009 repo, user_regular, 'repository.write')
988 1010 response = self.app.get(
989 1011 route_path('repo_files',
990 1012 repo_name=repo.repo_name,
991 1013 commit_id='tip', f_path='/'))
992 1014
993 1015 repo_file_add_url = route_path(
994 1016 'repo_files_add_file',
995 1017 repo_name=repo.repo_name,
996 1018 commit_id=0, f_path='')
997 1019
998 1020 assert_session_flash(
999 1021 response,
1000 1022 'There are no files yet. <a class="alert-link" '
1001 1023 'href="{}">Click here to add a new file.</a>'
1002 1024 .format(repo_file_add_url))
1003 1025
1004 1026 def test_access_empty_repo_redirect_to_summary_with_alert_no_write_perms(
1005 1027 self, backend_stub, autologin_regular_user):
1006 1028 repo = backend_stub.create_repo()
1007 1029 # init session for anon user
1008 1030 route_path('repo_summary', repo_name=repo.repo_name)
1009 1031
1010 1032 repo_file_add_url = route_path(
1011 1033 'repo_files_add_file',
1012 1034 repo_name=repo.repo_name,
1013 1035 commit_id=0, f_path='')
1014 1036
1015 1037 response = self.app.get(
1016 1038 route_path('repo_files',
1017 1039 repo_name=repo.repo_name,
1018 1040 commit_id='tip', f_path='/'))
1019 1041
1020 1042 assert_session_flash(response, no_=repo_file_add_url)
1021 1043
1022 1044 @pytest.mark.parametrize('file_node', [
1023 1045 'archive/file.zip',
1024 1046 'diff/my-file.txt',
1025 1047 'render.py',
1026 1048 'render',
1027 1049 'remove_file',
1028 1050 'remove_file/to-delete.txt',
1029 1051 ])
1030 1052 def test_file_names_equal_to_routes_parts(self, backend, file_node):
1031 1053 backend.create_repo()
1032 1054 backend.ensure_file(file_node)
1033 1055
1034 1056 self.app.get(
1035 1057 route_path('repo_files',
1036 1058 repo_name=backend.repo_name,
1037 1059 commit_id='tip', f_path=file_node),
1038 1060 status=200)
1039 1061
1040 1062
1041 1063 class TestAdjustFilePathForSvn(object):
1042 1064 """
1043 1065 SVN specific adjustments of node history in RepoFilesView.
1044 1066 """
1045 1067
1046 1068 def test_returns_path_relative_to_matched_reference(self):
1047 1069 repo = self._repo(branches=['trunk'])
1048 1070 self.assert_file_adjustment('trunk/file', 'file', repo)
1049 1071
1050 1072 def test_does_not_modify_file_if_no_reference_matches(self):
1051 1073 repo = self._repo(branches=['trunk'])
1052 1074 self.assert_file_adjustment('notes/file', 'notes/file', repo)
1053 1075
1054 1076 def test_does_not_adjust_partial_directory_names(self):
1055 1077 repo = self._repo(branches=['trun'])
1056 1078 self.assert_file_adjustment('trunk/file', 'trunk/file', repo)
1057 1079
1058 1080 def test_is_robust_to_patterns_which_prefix_other_patterns(self):
1059 1081 repo = self._repo(branches=['trunk', 'trunk/new', 'trunk/old'])
1060 1082 self.assert_file_adjustment('trunk/new/file', 'file', repo)
1061 1083
1062 1084 def assert_file_adjustment(self, f_path, expected, repo):
1063 1085 result = RepoFilesView.adjust_file_path_for_svn(f_path, repo)
1064 1086 assert result == expected
1065 1087
1066 1088 def _repo(self, branches=None):
1067 1089 repo = mock.Mock()
1068 1090 repo.branches = OrderedDict((name, '0') for name in branches or [])
1069 1091 repo.tags = {}
1070 1092 return repo
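The `TestAdjustFilePathForSvn` cases above pin down how `RepoFilesView.adjust_file_path_for_svn` should behave: strip the longest matching branch prefix, match whole path segments only (so `trun` must not match `trunk/file`), and leave unmatched paths untouched. A minimal standalone sketch of that matching logic (a hypothetical helper for illustration, not the RhodeCode implementation):

```python
def adjust_file_path_for_svn(f_path, branches):
    # Hypothetical sketch of the behavior the tests above describe.
    # Prefer the longest reference so 'trunk/new' wins over 'trunk'.
    for ref in sorted(branches, key=len, reverse=True):
        if f_path == ref:
            return ''
        # Match whole path segments only: 'trun' must not match 'trunk/file'.
        if f_path.startswith(ref + '/'):
            return f_path[len(ref) + 1:]
    return f_path

print(adjust_file_path_for_svn('trunk/file', ['trunk']))   # file
print(adjust_file_path_for_svn('notes/file', ['trunk']))   # notes/file
print(adjust_file_path_for_svn('trunk/file', ['trun']))    # trunk/file
print(adjust_file_path_for_svn('trunk/new/file',
                               ['trunk', 'trunk/new', 'trunk/old']))  # file
```

Sorting candidate references by length first is what makes the prefix-of-another-prefix case (`trunk` vs `trunk/new`) robust.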
@@ -1,358 +1,358 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 22 import logging
23 23
24 24 from pyramid.httpexceptions import HTTPNotFound, HTTPFound
25 25
26 26 from pyramid.renderers import render
27 27 from pyramid.response import Response
28 28
29 29 from rhodecode.apps._base import RepoAppView
30 30 import rhodecode.lib.helpers as h
31 31 from rhodecode.lib.auth import (
32 32 LoginRequired, HasRepoPermissionAnyDecorator)
33 33
34 34 from rhodecode.lib.ext_json import json
35 35 from rhodecode.lib.graphmod import _colored, _dagwalker
36 36 from rhodecode.lib.helpers import RepoPage
37 from rhodecode.lib.utils2 import safe_int, safe_str, str2bool
37 from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode
38 38 from rhodecode.lib.vcs.exceptions import (
39 39 RepositoryError, CommitDoesNotExistError,
40 40 CommitError, NodeDoesNotExistError, EmptyRepositoryError)
41 41
42 42 log = logging.getLogger(__name__)
43 43
44 44 DEFAULT_CHANGELOG_SIZE = 20
45 45
46 46
47 47 class RepoChangelogView(RepoAppView):
48 48
49 49 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
50 50 """
51 51         This is a safe way to get a commit. If an error occurs it redirects
52 52         to tip with a proper message.
53 53
54 54 :param commit_id: id of commit to fetch
55 55 :param redirect_after: toggle redirection
56 56 """
57 57 _ = self.request.translate
58 58
59 59 try:
60 60 return self.rhodecode_vcs_repo.get_commit(commit_id)
61 61 except EmptyRepositoryError:
62 62 if not redirect_after:
63 63 return None
64 64
65 65 h.flash(h.literal(
66 66 _('There are no commits yet')), category='warning')
67 67 raise HTTPFound(
68 68 h.route_path('repo_summary', repo_name=self.db_repo_name))
69 69
70 70 except (CommitDoesNotExistError, LookupError):
71 71 msg = _('No such commit exists for this repository')
72 72 h.flash(msg, category='error')
73 73 raise HTTPNotFound()
74 74 except RepositoryError as e:
75 75 h.flash(safe_str(h.escape(e)), category='error')
76 76 raise HTTPNotFound()
77 77
78 78 def _graph(self, repo, commits, prev_data=None, next_data=None):
79 79 """
80 80 Generates a DAG graph for repo
81 81
82 82 :param repo: repo instance
83 83 :param commits: list of commits
84 84 """
85 85 if not commits:
86 86 return json.dumps([]), json.dumps([])
87 87
88 88 def serialize(commit, parents=True):
89 89 data = dict(
90 90 raw_id=commit.raw_id,
91 91 idx=commit.idx,
92 92 branch=None,
93 93 )
94 94 if parents:
95 95 data['parents'] = [
96 96 serialize(x, parents=False) for x in commit.parents]
97 97 return data
98 98
99 99 prev_data = prev_data or []
100 100 next_data = next_data or []
101 101
102 102 current = [serialize(x) for x in commits]
103 103 commits = prev_data + current + next_data
104 104
105 105 dag = _dagwalker(repo, commits)
106 106
107 107 data = [[commit_id, vtx, edges, branch]
108 108 for commit_id, vtx, edges, branch in _colored(dag)]
109 109 return json.dumps(data), json.dumps(current)
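The `serialize` closure above flattens each commit to a small dict, including parents one level deep, before `_dagwalker`/`_colored` build the graph data. A toy illustration of that payload shape, using stand-in commit objects (hypothetical; real commits come from the VCS backend):

```python
import json

class FakeCommit:
    # Stand-in for the VCS commit objects _graph receives (hypothetical).
    def __init__(self, raw_id, idx, parents=None):
        self.raw_id = raw_id
        self.idx = idx
        self.parents = parents or []

def serialize(commit, parents=True):
    # Mirrors the serializer above: parents are flattened one level deep,
    # so nested parent entries carry no 'parents' key of their own.
    data = dict(raw_id=commit.raw_id, idx=commit.idx, branch=None)
    if parents:
        data['parents'] = [serialize(x, parents=False) for x in commit.parents]
    return data

root = FakeCommit('a' * 40, 0)
child = FakeCommit('b' * 40, 1, parents=[root])
payload = [serialize(c) for c in (child, root)]
print(json.dumps(payload[0]['parents']))
```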
110 110
111 111 def _check_if_valid_branch(self, branch_name, repo_name, f_path):
112 112 if branch_name not in self.rhodecode_vcs_repo.branches_all:
113 h.flash('Branch {} is not found.'.format(h.escape(branch_name)),
113 h.flash(u'Branch {} is not found.'.format(h.escape(safe_unicode(branch_name))),
114 114 category='warning')
115 115 redirect_url = h.route_path(
116 116 'repo_commits_file', repo_name=repo_name,
117 117 commit_id=branch_name, f_path=f_path or '')
118 118 raise HTTPFound(redirect_url)
119 119
120 120 def _load_changelog_data(
121 121 self, c, collection, page, chunk_size, branch_name=None,
122 122 dynamic=False, f_path=None, commit_id=None):
123 123
124 124 def url_generator(page_num):
125 125 query_params = {
126 126 'page': page_num
127 127 }
128 128
129 129 if branch_name:
130 130 query_params.update({
131 131 'branch': branch_name
132 132 })
133 133
134 134 if f_path:
135 135 # changelog for file
136 136 return h.route_path(
137 137 'repo_commits_file',
138 138 repo_name=c.rhodecode_db_repo.repo_name,
139 139 commit_id=commit_id, f_path=f_path,
140 140 _query=query_params)
141 141 else:
142 142 return h.route_path(
143 143 'repo_commits',
144 144 repo_name=c.rhodecode_db_repo.repo_name, _query=query_params)
145 145
146 146 c.total_cs = len(collection)
147 147 c.showing_commits = min(chunk_size, c.total_cs)
148 148 c.pagination = RepoPage(collection, page=page, item_count=c.total_cs,
149 149 items_per_page=chunk_size, url_maker=url_generator)
150 150
151 151 c.next_page = c.pagination.next_page
152 152 c.prev_page = c.pagination.previous_page
153 153
154 154 if dynamic:
155 155 if self.request.GET.get('chunk') != 'next':
156 156 c.next_page = None
157 157 if self.request.GET.get('chunk') != 'prev':
158 158 c.prev_page = None
159 159
160 160 page_commit_ids = [x.raw_id for x in c.pagination]
161 161 c.comments = c.rhodecode_db_repo.get_comments(page_commit_ids)
162 162 c.statuses = c.rhodecode_db_repo.statuses(page_commit_ids)
163 163
164 164 def load_default_context(self):
165 165 c = self._get_local_tmpl_context(include_app_defaults=True)
166 166
167 167 c.rhodecode_repo = self.rhodecode_vcs_repo
168 168
169 169 return c
170 170
171 171 def _get_preload_attrs(self):
172 172 pre_load = ['author', 'branch', 'date', 'message', 'parents',
173 173 'obsolete', 'phase', 'hidden']
174 174 return pre_load
175 175
176 176 @LoginRequired()
177 177 @HasRepoPermissionAnyDecorator(
178 178 'repository.read', 'repository.write', 'repository.admin')
179 179 def repo_changelog(self):
180 180 c = self.load_default_context()
181 181
182 182 commit_id = self.request.matchdict.get('commit_id')
183 183 f_path = self._get_f_path(self.request.matchdict)
184 184 show_hidden = str2bool(self.request.GET.get('evolve'))
185 185
186 186 chunk_size = 20
187 187
188 188 c.branch_name = branch_name = self.request.GET.get('branch') or ''
189 189 c.book_name = book_name = self.request.GET.get('bookmark') or ''
190 190 c.f_path = f_path
191 191 c.commit_id = commit_id
192 192 c.show_hidden = show_hidden
193 193
194 194 hist_limit = safe_int(self.request.GET.get('limit')) or None
195 195
196 196 p = safe_int(self.request.GET.get('page', 1), 1)
197 197
198 198 c.selected_name = branch_name or book_name
199 199 if not commit_id and branch_name:
200 200 self._check_if_valid_branch(branch_name, self.db_repo_name, f_path)
201 201
202 202 c.changelog_for_path = f_path
203 203 pre_load = self._get_preload_attrs()
204 204
205 205 partial_xhr = self.request.environ.get('HTTP_X_PARTIAL_XHR')
206 206
207 207 try:
208 208 if f_path:
209 209 log.debug('generating changelog for path %s', f_path)
210 210 # get the history for the file !
211 211 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
212 212
213 213 try:
214 214 collection = base_commit.get_path_history(
215 215 f_path, limit=hist_limit, pre_load=pre_load)
216 216 if collection and partial_xhr:
217 217                     # for ajax calls we remove the first one since we're looking
218 218                     # at it right now in the context of a file commit
219 219 collection.pop(0)
220 220 except (NodeDoesNotExistError, CommitError):
221 221 # this node is not present at tip!
222 222 try:
223 223 commit = self._get_commit_or_redirect(commit_id)
224 224 collection = commit.get_path_history(f_path)
225 225 except RepositoryError as e:
226 226 h.flash(safe_str(e), category='warning')
227 227 redirect_url = h.route_path(
228 228 'repo_commits', repo_name=self.db_repo_name)
229 229 raise HTTPFound(redirect_url)
230 230 collection = list(reversed(collection))
231 231 else:
232 232 collection = self.rhodecode_vcs_repo.get_commits(
233 233 branch_name=branch_name, show_hidden=show_hidden,
234 234 pre_load=pre_load, translate_tags=False)
235 235
236 236 self._load_changelog_data(
237 237 c, collection, p, chunk_size, c.branch_name,
238 238 f_path=f_path, commit_id=commit_id)
239 239
240 240 except EmptyRepositoryError as e:
241 241 h.flash(safe_str(h.escape(e)), category='warning')
242 242 raise HTTPFound(
243 243 h.route_path('repo_summary', repo_name=self.db_repo_name))
244 244 except HTTPFound:
245 245 raise
246 246 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
247 247 log.exception(safe_str(e))
248 248 h.flash(safe_str(h.escape(e)), category='error')
249 249
250 250 if commit_id:
251 251 # from single commit page, we redirect to main commits
252 252 raise HTTPFound(
253 253 h.route_path('repo_commits', repo_name=self.db_repo_name))
254 254 else:
255 255 # otherwise we redirect to summary
256 256 raise HTTPFound(
257 257 h.route_path('repo_summary', repo_name=self.db_repo_name))
258 258
259 259 if partial_xhr or self.request.environ.get('HTTP_X_PJAX'):
260 260 # case when loading dynamic file history in file view
261 261 # loading from ajax, we don't want the first result, it's popped
262 262 # in the code above
263 263 html = render(
264 264 'rhodecode:templates/commits/changelog_file_history.mako',
265 265 self._get_template_context(c), self.request)
266 266 return Response(html)
267 267
268 268 commit_ids = []
269 269 if not f_path:
270 270 # only load graph data when not in file history mode
271 271 commit_ids = c.pagination
272 272
273 273 c.graph_data, c.graph_commits = self._graph(
274 274 self.rhodecode_vcs_repo, commit_ids)
275 275
276 276 return self._get_template_context(c)
277 277
278 278 @LoginRequired()
279 279 @HasRepoPermissionAnyDecorator(
280 280 'repository.read', 'repository.write', 'repository.admin')
281 281 def repo_commits_elements(self):
282 282 c = self.load_default_context()
283 283 commit_id = self.request.matchdict.get('commit_id')
284 284 f_path = self._get_f_path(self.request.matchdict)
285 285 show_hidden = str2bool(self.request.GET.get('evolve'))
286 286
287 287 chunk_size = 20
288 288 hist_limit = safe_int(self.request.GET.get('limit')) or None
289 289
290 290 def wrap_for_error(err):
291 291 html = '<tr>' \
292 292 '<td colspan="9" class="alert alert-error">ERROR: {}</td>' \
293 293 '</tr>'.format(err)
294 294 return Response(html)
295 295
296 296 c.branch_name = branch_name = self.request.GET.get('branch') or ''
297 297 c.book_name = book_name = self.request.GET.get('bookmark') or ''
298 298 c.f_path = f_path
299 299 c.commit_id = commit_id
300 300 c.show_hidden = show_hidden
301 301
302 302 c.selected_name = branch_name or book_name
303 303 if branch_name and branch_name not in self.rhodecode_vcs_repo.branches_all:
304 304 return wrap_for_error(
305 305 safe_str('Branch: {} is not valid'.format(branch_name)))
306 306
307 307 pre_load = self._get_preload_attrs()
308 308
309 309 if f_path:
310 310 try:
311 311 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
312 312 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
313 313 log.exception(safe_str(e))
314 314 raise HTTPFound(
315 315 h.route_path('repo_commits', repo_name=self.db_repo_name))
316 316
317 317 collection = base_commit.get_path_history(
318 318 f_path, limit=hist_limit, pre_load=pre_load)
319 319 collection = list(reversed(collection))
320 320 else:
321 321 collection = self.rhodecode_vcs_repo.get_commits(
322 322 branch_name=branch_name, show_hidden=show_hidden, pre_load=pre_load,
323 323 translate_tags=False)
324 324
325 325 p = safe_int(self.request.GET.get('page', 1), 1)
326 326 try:
327 327 self._load_changelog_data(
328 328 c, collection, p, chunk_size, dynamic=True,
329 329 f_path=f_path, commit_id=commit_id)
330 330 except EmptyRepositoryError as e:
331 331 return wrap_for_error(safe_str(e))
332 332 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
333 333 log.exception('Failed to fetch commits')
334 334 return wrap_for_error(safe_str(e))
335 335
336 336 prev_data = None
337 337 next_data = None
338 338
339 339 try:
340 340 prev_graph = json.loads(self.request.POST.get('graph') or '{}')
341 341 except json.JSONDecodeError:
342 342 prev_graph = {}
343 343
344 344 if self.request.GET.get('chunk') == 'prev':
345 345 next_data = prev_graph
346 346 elif self.request.GET.get('chunk') == 'next':
347 347 prev_data = prev_graph
348 348
349 349 commit_ids = []
350 350 if not f_path:
351 351 # only load graph data when not in file history mode
352 352 commit_ids = c.pagination
353 353
354 354 c.graph_data, c.graph_commits = self._graph(
355 355 self.rhodecode_vcs_repo, commit_ids,
356 356 prev_data=prev_data, next_data=next_data)
357 357
358 358 return self._get_template_context(c)
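`repo_commits_elements` prepends or appends the previously rendered graph chunk (`prev_graph`, parsed from the POST body) depending on which direction the client is paging, so `_graph` sees one contiguous commit window. A toy model of that ordering, assuming chunks are plain lists rather than serialized commit dicts:

```python
def stitch(current, prev_graph, chunk):
    # Toy model of the chunk stitching in repo_commits_elements above
    # (hypothetical data; real chunks are serialized commit dicts).
    prev_data, next_data = None, None
    if chunk == 'prev':
        # paging up: previously shown commits come *after* the new page
        next_data = prev_graph
    elif chunk == 'next':
        # paging down: previously shown commits come *before* it
        prev_data = prev_graph
    return (prev_data or []) + current + (next_data or [])

print(stitch([3, 4], [5, 6], 'next'))  # [5, 6, 3, 4]
print(stitch([1, 2], [3, 4], 'prev'))  # [1, 2, 3, 4]
```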
@@ -1,809 +1,813 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import collections
23 23
24 24 from pyramid.httpexceptions import (
25 25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
26 26 from pyramid.renderers import render
27 27 from pyramid.response import Response
28 28
29 29 from rhodecode.apps._base import RepoAppView
30 30 from rhodecode.apps.file_store import utils as store_utils
31 31 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
32 32
33 33 from rhodecode.lib import diffs, codeblocks, channelstream
34 34 from rhodecode.lib.auth import (
35 35 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
36 36 from rhodecode.lib.ext_json import json
37 37 from rhodecode.lib.compat import OrderedDict
38 38 from rhodecode.lib.diffs import (
39 39 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
40 40 get_diff_whitespace_flag)
41 41 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
42 42 import rhodecode.lib.helpers as h
43 43 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict, safe_str
44 44 from rhodecode.lib.vcs.backends.base import EmptyCommit
45 45 from rhodecode.lib.vcs.exceptions import (
46 46 RepositoryError, CommitDoesNotExistError)
47 47 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
48 48 ChangesetCommentHistory
49 49 from rhodecode.model.changeset_status import ChangesetStatusModel
50 50 from rhodecode.model.comment import CommentsModel
51 51 from rhodecode.model.meta import Session
52 52 from rhodecode.model.settings import VcsSettingsModel
53 53
54 54 log = logging.getLogger(__name__)
55 55
56 56
57 57 def _update_with_GET(params, request):
58 58 for k in ['diff1', 'diff2', 'diff']:
59 59 params[k] += request.GET.getall(k)
60 60
61 61
62 62 class RepoCommitsView(RepoAppView):
63 63 def load_default_context(self):
64 64 c = self._get_local_tmpl_context(include_app_defaults=True)
65 65 c.rhodecode_repo = self.rhodecode_vcs_repo
66 66
67 67 return c
68 68
69 69 def _is_diff_cache_enabled(self, target_repo):
70 70 caching_enabled = self._get_general_setting(
71 71 target_repo, 'rhodecode_diff_cache')
72 72 log.debug('Diff caching enabled: %s', caching_enabled)
73 73 return caching_enabled
74 74
75 75 def _commit(self, commit_id_range, method):
76 76 _ = self.request.translate
77 77 c = self.load_default_context()
78 78 c.fulldiff = self.request.GET.get('fulldiff')
79 79 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
80 80
81 81 # fetch global flags of ignore ws or context lines
82 82 diff_context = get_diff_context(self.request)
83 83 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
84 84
85 85 # diff_limit will cut off the whole diff if the limit is applied
86 86 # otherwise it will just hide the big files from the front-end
87 87 diff_limit = c.visual.cut_off_limit_diff
88 88 file_limit = c.visual.cut_off_limit_file
89 89
90 90         # get ranges of commit ids if present
91 91 commit_range = commit_id_range.split('...')[:2]
92 92
93 93 try:
94 94 pre_load = ['affected_files', 'author', 'branch', 'date',
95 95 'message', 'parents']
96 96 if self.rhodecode_vcs_repo.alias == 'hg':
97 97 pre_load += ['hidden', 'obsolete', 'phase']
98 98
99 99 if len(commit_range) == 2:
100 100 commits = self.rhodecode_vcs_repo.get_commits(
101 101 start_id=commit_range[0], end_id=commit_range[1],
102 102 pre_load=pre_load, translate_tags=False)
103 103 commits = list(commits)
104 104 else:
105 105 commits = [self.rhodecode_vcs_repo.get_commit(
106 106 commit_id=commit_id_range, pre_load=pre_load)]
107 107
108 108 c.commit_ranges = commits
109 109 if not c.commit_ranges:
110 110 raise RepositoryError('The commit range returned an empty result')
111 111 except CommitDoesNotExistError as e:
112 112             msg = _('No such commit exists. Original exception: `{}`').format(safe_str(e))
113 113 h.flash(msg, category='error')
114 114 raise HTTPNotFound()
115 115 except Exception:
116 116 log.exception("General failure")
117 117 raise HTTPNotFound()
118 118 single_commit = len(c.commit_ranges) == 1
119 119
120 120 if redirect_to_combined and not single_commit:
121 121 source_ref = getattr(c.commit_ranges[0].parents[0]
122 122 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
123 123 target_ref = c.commit_ranges[-1].raw_id
124 124 next_url = h.route_path(
125 125 'repo_compare',
126 126 repo_name=c.repo_name,
127 127 source_ref_type='rev',
128 128 source_ref=source_ref,
129 129 target_ref_type='rev',
130 130 target_ref=target_ref)
131 131 raise HTTPFound(next_url)
132 132
133 133 c.changes = OrderedDict()
134 134 c.lines_added = 0
135 135 c.lines_deleted = 0
136 136
137 137 # auto collapse if we have more than limit
138 138 collapse_limit = diffs.DiffProcessor._collapse_commits_over
139 139 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
140 140
141 141 c.commit_statuses = ChangesetStatus.STATUSES
142 142 c.inline_comments = []
143 143 c.files = []
144 144
145 145 c.comments = []
146 146 c.unresolved_comments = []
147 147 c.resolved_comments = []
148 148
149 149 # Single commit
150 150 if single_commit:
151 151 commit = c.commit_ranges[0]
152 152 c.comments = CommentsModel().get_comments(
153 153 self.db_repo.repo_id,
154 154 revision=commit.raw_id)
155 155
156 156 # comments from PR
157 157 statuses = ChangesetStatusModel().get_statuses(
158 158 self.db_repo.repo_id, commit.raw_id,
159 159 with_revisions=True)
160 160
161 161 prs = set()
162 162 reviewers = list()
163 163 reviewers_duplicates = set() # to not have duplicates from multiple votes
164 164 for c_status in statuses:
165 165
166 166 # extract associated pull-requests from votes
167 167 if c_status.pull_request:
168 168 prs.add(c_status.pull_request)
169 169
170 170 # extract reviewers
171 171 _user_id = c_status.author.user_id
172 172 if _user_id not in reviewers_duplicates:
173 173 reviewers.append(
174 174 StrictAttributeDict({
175 175 'user': c_status.author,
176 176
177 177                         # fake attributes for the commit page, which we don't have,
178 178                         # but we share the display with the PR page
179 179 'mandatory': False,
180 180 'reasons': [],
181 181 'rule_user_group_data': lambda: None
182 182 })
183 183 )
184 184 reviewers_duplicates.add(_user_id)
185 185
186 186 c.reviewers_count = len(reviewers)
187 187 c.observers_count = 0
188 188
189 189 # from associated statuses, check the pull requests, and
190 190 # show comments from them
191 191 for pr in prs:
192 192 c.comments.extend(pr.comments)
193 193
194 194 c.unresolved_comments = CommentsModel()\
195 195 .get_commit_unresolved_todos(commit.raw_id)
196 196 c.resolved_comments = CommentsModel()\
197 197 .get_commit_resolved_todos(commit.raw_id)
198 198
199 199 c.inline_comments_flat = CommentsModel()\
200 200 .get_commit_inline_comments(commit.raw_id)
201 201
202 202 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
203 203 statuses, reviewers)
204 204
205 205 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
206 206
207 207 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
208 208
209 209 for review_obj, member, reasons, mandatory, status in review_statuses:
210 210 member_reviewer = h.reviewer_as_json(
211 211 member, reasons=reasons, mandatory=mandatory, role=None,
212 212 user_group=None
213 213 )
214 214
215 215 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
216 216 member_reviewer['review_status'] = current_review_status
217 217 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
218 218 member_reviewer['allowed_to_update'] = False
219 219 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
220 220
221 221 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
222 222
223 223 # NOTE(marcink): this uses the same voting logic as in pull-requests
224 224 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
225 225 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
226 226
227 227 diff = None
228 228 # Iterate over ranges (default commit view is always one commit)
229 229 for commit in c.commit_ranges:
230 230 c.changes[commit.raw_id] = []
231 231
232 232 commit2 = commit
233 233 commit1 = commit.first_parent
234 234
235 235 if method == 'show':
236 236 inline_comments = CommentsModel().get_inline_comments(
237 237 self.db_repo.repo_id, revision=commit.raw_id)
238 238 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
239 239 inline_comments))
240 240 c.inline_comments = inline_comments
241 241
242 242 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
243 243 self.db_repo)
244 244 cache_file_path = diff_cache_exist(
245 245 cache_path, 'diff', commit.raw_id,
246 246 hide_whitespace_changes, diff_context, c.fulldiff)
247 247
248 248 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
249 249 force_recache = str2bool(self.request.GET.get('force_recache'))
250 250
251 251 cached_diff = None
252 252 if caching_enabled:
253 253 cached_diff = load_cached_diff(cache_file_path)
254 254
255 255 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
256 256 if not force_recache and has_proper_diff_cache:
257 257 diffset = cached_diff['diff']
258 258 else:
259 259 vcs_diff = self.rhodecode_vcs_repo.get_diff(
260 260 commit1, commit2,
261 261 ignore_whitespace=hide_whitespace_changes,
262 262 context=diff_context)
263 263
264 264 diff_processor = diffs.DiffProcessor(
265 265 vcs_diff, format='newdiff', diff_limit=diff_limit,
266 266 file_limit=file_limit, show_full_diff=c.fulldiff)
267 267
268 268 _parsed = diff_processor.prepare()
269 269
270 270 diffset = codeblocks.DiffSet(
271 271 repo_name=self.db_repo_name,
272 272 source_node_getter=codeblocks.diffset_node_getter(commit1),
273 273 target_node_getter=codeblocks.diffset_node_getter(commit2))
274 274
275 275 diffset = self.path_filter.render_patchset_filtered(
276 276 diffset, _parsed, commit1.raw_id, commit2.raw_id)
277 277
278 278 # save cached diff
279 279 if caching_enabled:
280 280 cache_diff(cache_file_path, diffset, None)
281 281
282 282 c.limited_diff = diffset.limited_diff
283 283 c.changes[commit.raw_id] = diffset
284 284 else:
285 285 # TODO(marcink): no cache usage here...
286 286 _diff = self.rhodecode_vcs_repo.get_diff(
287 287 commit1, commit2,
288 288 ignore_whitespace=hide_whitespace_changes, context=diff_context)
289 289 diff_processor = diffs.DiffProcessor(
290 290 _diff, format='newdiff', diff_limit=diff_limit,
291 291 file_limit=file_limit, show_full_diff=c.fulldiff)
292 292 # downloads/raw we only need RAW diff nothing else
293 293 diff = self.path_filter.get_raw_patch(diff_processor)
294 294 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
295 295
296 296 # sort comments by how they were generated
297 297 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
298 298 c.at_version_num = None
299 299
300 300 if len(c.commit_ranges) == 1:
301 301 c.commit = c.commit_ranges[0]
302 302 c.parent_tmpl = ''.join(
303 303 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
304 304
305 305 if method == 'download':
306 306 response = Response(diff)
307 307 response.content_type = 'text/plain'
308 308 response.content_disposition = (
309 309 'attachment; filename=%s.diff' % commit_id_range[:12])
310 310 return response
311 311 elif method == 'patch':
312 312 c.diff = safe_unicode(diff)
313 313 patch = render(
314 314 'rhodecode:templates/changeset/patch_changeset.mako',
315 315 self._get_template_context(c), self.request)
316 316 response = Response(patch)
317 317 response.content_type = 'text/plain'
318 318 return response
319 319 elif method == 'raw':
320 320 response = Response(diff)
321 321 response.content_type = 'text/plain'
322 322 return response
323 323 elif method == 'show':
324 324 if len(c.commit_ranges) == 1:
325 325 html = render(
326 326 'rhodecode:templates/changeset/changeset.mako',
327 327 self._get_template_context(c), self.request)
328 328 return Response(html)
329 329 else:
330 330 c.ancestor = None
331 331 c.target_repo = self.db_repo
332 332 html = render(
333 333 'rhodecode:templates/changeset/changeset_range.mako',
334 334 self._get_template_context(c), self.request)
335 335 return Response(html)
336 336
337 337 raise HTTPBadRequest()
338 338
339 339 @LoginRequired()
340 340 @HasRepoPermissionAnyDecorator(
341 341 'repository.read', 'repository.write', 'repository.admin')
342 342 def repo_commit_show(self):
343 343 commit_id = self.request.matchdict['commit_id']
344 344 return self._commit(commit_id, method='show')
345 345
346 346 @LoginRequired()
347 347 @HasRepoPermissionAnyDecorator(
348 348 'repository.read', 'repository.write', 'repository.admin')
349 349 def repo_commit_raw(self):
350 350 commit_id = self.request.matchdict['commit_id']
351 351 return self._commit(commit_id, method='raw')
352 352
353 353 @LoginRequired()
354 354 @HasRepoPermissionAnyDecorator(
355 355 'repository.read', 'repository.write', 'repository.admin')
356 356 def repo_commit_patch(self):
357 357 commit_id = self.request.matchdict['commit_id']
358 358 return self._commit(commit_id, method='patch')
359 359
360 360 @LoginRequired()
361 361 @HasRepoPermissionAnyDecorator(
362 362 'repository.read', 'repository.write', 'repository.admin')
363 363 def repo_commit_download(self):
364 364 commit_id = self.request.matchdict['commit_id']
365 365 return self._commit(commit_id, method='download')
366 366
367 367 def _commit_comments_create(self, commit_id, comments):
368 368 _ = self.request.translate
369 369 data = {}
370 370 if not comments:
371 371 return
372 372
373 373 commit = self.db_repo.get_commit(commit_id)
374 374
375 375 all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
376 376 for entry in comments:
377 377 c = self.load_default_context()
378 378 comment_type = entry['comment_type']
379 379 text = entry['text']
380 380 status = entry['status']
381 381 is_draft = str2bool(entry['is_draft'])
382 382 resolves_comment_id = entry['resolves_comment_id']
383 383 f_path = entry['f_path']
384 384 line_no = entry['line']
385 385 target_elem_id = 'file-{}'.format(h.safeid(h.safe_unicode(f_path)))
386 386
387 387 if status:
388 388 text = text or (_('Status change %(transition_icon)s %(status)s')
389 389 % {'transition_icon': '>',
390 390 'status': ChangesetStatus.get_status_lbl(status)})
391 391
392 392 comment = CommentsModel().create(
393 393 text=text,
394 394 repo=self.db_repo.repo_id,
395 395 user=self._rhodecode_db_user.user_id,
396 396 commit_id=commit_id,
397 397 f_path=f_path,
398 398 line_no=line_no,
399 399 status_change=(ChangesetStatus.get_status_lbl(status)
400 400 if status else None),
401 401 status_change_type=status,
402 402 comment_type=comment_type,
403 403 is_draft=is_draft,
404 404 resolves_comment_id=resolves_comment_id,
405 405 auth_user=self._rhodecode_user,
406 406 send_email=not is_draft, # skip notification for draft comments
407 407 )
408 408 is_inline = comment.is_inline
409 409
410 410 # apply the status change if one was set
411 411 if status:
412 412 # `dont_allow_on_closed_pull_request=True` means that if the
413 413 # latest status came from a pull request that is now closed,
414 414 # changing the status is disallowed
415 415
416 416 try:
417 417 ChangesetStatusModel().set_status(
418 418 self.db_repo.repo_id,
419 419 status,
420 420 self._rhodecode_db_user.user_id,
421 421 comment,
422 422 revision=commit_id,
423 423 dont_allow_on_closed_pull_request=True
424 424 )
425 425 except StatusChangeOnClosedPullRequestError:
426 426 msg = _('Changing the status of a commit associated with '
427 427 'a closed pull request is not allowed')
428 428 log.exception(msg)
429 429 h.flash(msg, category='warning')
430 430 raise HTTPFound(h.route_path(
431 431 'repo_commit', repo_name=self.db_repo_name,
432 432 commit_id=commit_id))
433 433
434 434 Session().flush()
435 435 # refresh is required to get access to relationships
436 436 # loaded on the comment
437 437 Session().refresh(comment)
438 438
439 439 # skip notifications for drafts
440 440 if not is_draft:
441 441 CommentsModel().trigger_commit_comment_hook(
442 442 self.db_repo, self._rhodecode_user, 'create',
443 443 data={'comment': comment, 'commit': commit})
444 444
445 445 comment_id = comment.comment_id
446 446 data[comment_id] = {
447 447 'target_id': target_elem_id
448 448 }
449 449 Session().flush()
450 450
451 451 c.co = comment
452 452 c.at_version_num = 0
453 453 c.is_new = True
454 454 rendered_comment = render(
455 455 'rhodecode:templates/changeset/changeset_comment_block.mako',
456 456 self._get_template_context(c), self.request)
457 457
458 458 data[comment_id].update(comment.get_dict())
459 459 data[comment_id].update({'rendered_text': rendered_comment})
460 460
461 461 # finalize, commit and redirect
462 462 Session().commit()
463 463
464 464 # skip channelstream for draft comments
465 465 if not all_drafts:
466 466 comment_broadcast_channel = channelstream.comment_channel(
467 467 self.db_repo_name, commit_obj=commit)
468 468
469 469 comment_data = data
470 470 posted_comment_type = 'inline' if is_inline else 'general'
471 471 if len(data) == 1:
472 472 msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
473 473 else:
474 474 msg = _('posted {} new {} comments').format(len(data), posted_comment_type)
475 475
476 476 channelstream.comment_channelstream_push(
477 477 self.request, comment_broadcast_channel, self._rhodecode_user, msg,
478 478 comment_data=comment_data)
479 479
480 480 return data
481 481
482 482 @LoginRequired()
483 483 @NotAnonymous()
484 484 @HasRepoPermissionAnyDecorator(
485 485 'repository.read', 'repository.write', 'repository.admin')
486 486 @CSRFRequired()
487 487 def repo_commit_comment_create(self):
488 488 _ = self.request.translate
489 489 commit_id = self.request.matchdict['commit_id']
490 490
491 491 multi_commit_ids = []
492 492 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
493 493 if _commit_id not in ['', None, EmptyCommit.raw_id]:
494 494 if _commit_id not in multi_commit_ids:
495 495 multi_commit_ids.append(_commit_id)
496 496
497 497 commit_ids = multi_commit_ids or [commit_id]
498 498
499 499 data = []
500 500 # post the comment on each passed commit id
501 501 for current_id in filter(None, commit_ids):
502 502 comment_data = {
503 503 'comment_type': self.request.POST.get('comment_type'),
504 504 'text': self.request.POST.get('text'),
505 505 'status': self.request.POST.get('changeset_status', None),
506 506 'is_draft': self.request.POST.get('draft'),
507 507 'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
508 508 'close_pull_request': self.request.POST.get('close_pull_request'),
509 509 'f_path': self.request.POST.get('f_path'),
510 510 'line': self.request.POST.get('line'),
511 511 }
512 512 comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
513 513 data.append(comment)
514 514
515 515 return data if len(data) > 1 else data[0]
516 516
517 517 @LoginRequired()
518 518 @NotAnonymous()
519 519 @HasRepoPermissionAnyDecorator(
520 520 'repository.read', 'repository.write', 'repository.admin')
521 521 @CSRFRequired()
522 522 def repo_commit_comment_preview(self):
523 523 # Technically a CSRF token is not needed as no state changes with this
524 524 # call. However, as this is a POST, it is better to have one so
525 525 # automated tools don't flag it as a potential CSRF issue.
526 526 # POST is required because the payload could exceed the maximum
527 527 # size allowed by GET.
528 528
529 529 text = self.request.POST.get('text')
530 530 renderer = self.request.POST.get('renderer') or 'rst'
531 531 if text:
532 532 return h.render(text, renderer=renderer, mentions=True,
533 533 repo_name=self.db_repo_name)
534 534 return ''
535 535
536 536 @LoginRequired()
537 537 @HasRepoPermissionAnyDecorator(
538 538 'repository.read', 'repository.write', 'repository.admin')
539 539 @CSRFRequired()
540 540 def repo_commit_comment_history_view(self):
541 541 c = self.load_default_context()
542 542
543 543 comment_history_id = self.request.matchdict['comment_history_id']
544 544 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
545 545 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
546 546
547 547 if is_repo_comment:
548 548 c.comment_history = comment_history
549 549
550 550 rendered_comment = render(
551 551 'rhodecode:templates/changeset/comment_history.mako',
552 552 self._get_template_context(c),
553 553 self.request)
554 554 return rendered_comment
555 555 else:
556 556 log.warning('No permissions for user %s to show comment_history_id: %s',
557 557 self._rhodecode_db_user, comment_history_id)
558 558 raise HTTPNotFound()
559 559
560 560 @LoginRequired()
561 561 @NotAnonymous()
562 562 @HasRepoPermissionAnyDecorator(
563 563 'repository.read', 'repository.write', 'repository.admin')
564 564 @CSRFRequired()
565 565 def repo_commit_comment_attachment_upload(self):
566 566 c = self.load_default_context()
567 567 upload_key = 'attachment'
568 568
569 569 file_obj = self.request.POST.get(upload_key)
570 570
571 571 if file_obj is None:
572 572 self.request.response.status = 400
573 573 return {'store_fid': None,
574 574 'access_path': None,
575 575 'error': '{} data field is missing'.format(upload_key)}
576 576
577 577 if not hasattr(file_obj, 'filename'):
578 578 self.request.response.status = 400
579 579 return {'store_fid': None,
580 580 'access_path': None,
581 581 'error': 'filename cannot be read from the data field'}
582 582
583 583 filename = file_obj.filename
584 584 file_display_name = filename
585 585
586 586 metadata = {
587 587 'user_uploaded': {'username': self._rhodecode_user.username,
588 588 'user_id': self._rhodecode_user.user_id,
589 589 'ip': self._rhodecode_user.ip_addr}}
590 590
591 591 # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
592 592 allowed_extensions = [
593 593 '.gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
594 594 '.pptx', '.txt', '.xlsx', '.zip']
595 595 max_file_size = 10 * 1024 * 1024 # 10MB, also validated via dropzone.js
596 596
597 597 try:
598 598 storage = store_utils.get_file_storage(self.request.registry.settings)
599 599 store_uid, metadata = storage.save_file(
600 600 file_obj.file, filename, extra_metadata=metadata,
601 601 extensions=allowed_extensions, max_filesize=max_file_size)
602 602 except FileNotAllowedException:
603 603 self.request.response.status = 400
604 604 permitted_extensions = ', '.join(allowed_extensions)
605 605 error_msg = 'File `{}` is not allowed. ' \
606 606 'Only the following extensions are permitted: {}'.format(
607 607 filename, permitted_extensions)
608 608 return {'store_fid': None,
609 609 'access_path': None,
610 610 'error': error_msg}
611 611 except FileOverSizeException:
612 612 self.request.response.status = 400
613 613 limit_mb = h.format_byte_size_binary(max_file_size)
614 614 return {'store_fid': None,
615 615 'access_path': None,
616 616 'error': 'File {} exceeds the allowed limit of {}.'.format(
617 617 filename, limit_mb)}
618 618
619 619 try:
620 620 entry = FileStore.create(
621 621 file_uid=store_uid, filename=metadata["filename"],
622 622 file_hash=metadata["sha256"], file_size=metadata["size"],
623 623 file_display_name=file_display_name,
624 624 file_description=u'comment attachment `{}`'.format(safe_unicode(filename)),
625 625 hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
626 626 scope_repo_id=self.db_repo.repo_id
627 627 )
628 628 Session().add(entry)
629 629 Session().commit()
630 630 log.debug('Stored upload in DB as %s', entry)
631 631 except Exception:
632 632 log.exception('Failed to store file %s', filename)
633 633 self.request.response.status = 400
634 634 return {'store_fid': None,
635 635 'access_path': None,
636 636 'error': 'File {} could not be stored in the DB.'.format(filename)}
637 637
638 638 Session().commit()
639 639
640 640 return {
641 641 'store_fid': store_uid,
642 642 'access_path': h.route_path(
643 643 'download_file', fid=store_uid),
644 644 'fqn_access_path': h.route_url(
645 645 'download_file', fid=store_uid),
646 646 'repo_access_path': h.route_path(
647 647 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
648 648 'repo_fqn_access_path': h.route_url(
649 649 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
650 650 }
651 651
652 652 @LoginRequired()
653 653 @NotAnonymous()
654 654 @HasRepoPermissionAnyDecorator(
655 655 'repository.read', 'repository.write', 'repository.admin')
656 656 @CSRFRequired()
657 657 def repo_commit_comment_delete(self):
658 658 commit_id = self.request.matchdict['commit_id']
659 659 comment_id = self.request.matchdict['comment_id']
660 660
661 661 comment = ChangesetComment.get_or_404(comment_id)
662 662 if not comment:
663 663 log.debug('Comment with id:%s not found, skipping', comment_id)
664 664 # the comment was probably already deleted in another call
665 665 return True
666 666
667 667 if comment.immutable:
668 668 # don't allow deleting comments that are immutable
669 669 raise HTTPForbidden()
670 670
671 671 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
672 672 super_admin = h.HasPermissionAny('hg.admin')()
673 673 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
674 674 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
675 675 comment_repo_admin = is_repo_admin and is_repo_comment
676 676
677 if comment.draft and not comment_owner:
678 # draft comments may only ever be deleted by their owners
679 raise HTTPNotFound()
680
677 681 if super_admin or comment_owner or comment_repo_admin:
678 682 CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
679 683 Session().commit()
680 684 return True
681 685 else:
682 686 log.warning('No permissions for user %s to delete comment_id: %s',
683 687 self._rhodecode_db_user, comment_id)
684 688 raise HTTPNotFound()
685 689
686 690 @LoginRequired()
687 691 @NotAnonymous()
688 692 @HasRepoPermissionAnyDecorator(
689 693 'repository.read', 'repository.write', 'repository.admin')
690 694 @CSRFRequired()
691 695 def repo_commit_comment_edit(self):
692 696 self.load_default_context()
693 697
694 698 commit_id = self.request.matchdict['commit_id']
695 699 comment_id = self.request.matchdict['comment_id']
696 700 comment = ChangesetComment.get_or_404(comment_id)
697 701
698 702 if comment.immutable:
699 703 # don't allow editing comments that are immutable
700 704 raise HTTPForbidden()
701 705
702 706 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
703 707 super_admin = h.HasPermissionAny('hg.admin')()
704 708 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
705 709 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
706 710 comment_repo_admin = is_repo_admin and is_repo_comment
707 711
708 712 if super_admin or comment_owner or comment_repo_admin:
709 713 text = self.request.POST.get('text')
710 714 version = self.request.POST.get('version')
711 715 if text == comment.text:
712 716 log.warning(
713 717 'Comment(repo): '
714 718 'Trying to create a new version of comment {} '
715 719 'with the same comment body'.format(
716 720 comment_id,
717 721 )
718 722 )
719 723 raise HTTPNotFound()
720 724
721 725 if version.isdigit():
722 726 version = int(version)
723 727 else:
724 728 log.warning(
725 729 'Comment(repo): Wrong version type {} {} '
726 730 'for comment {}'.format(
727 731 version,
728 732 type(version),
729 733 comment_id,
730 734 )
731 735 )
732 736 raise HTTPNotFound()
733 737
734 738 try:
735 739 comment_history = CommentsModel().edit(
736 740 comment_id=comment_id,
737 741 text=text,
738 742 auth_user=self._rhodecode_user,
739 743 version=version,
740 744 )
741 745 except CommentVersionMismatch:
742 746 raise HTTPConflict()
743 747
744 748 if not comment_history:
745 749 raise HTTPNotFound()
746 750
747 751 if not comment.draft:
748 752 commit = self.db_repo.get_commit(commit_id)
749 753 CommentsModel().trigger_commit_comment_hook(
750 754 self.db_repo, self._rhodecode_user, 'edit',
751 755 data={'comment': comment, 'commit': commit})
752 756
753 757 Session().commit()
754 758 return {
755 759 'comment_history_id': comment_history.comment_history_id,
756 760 'comment_id': comment.comment_id,
757 761 'comment_version': comment_history.version,
758 762 'comment_author_username': comment_history.author.username,
759 763 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16),
760 764 'comment_created_on': h.age_component(comment_history.created_on,
761 765 time_is_local=True),
762 766 }
763 767 else:
764 768 log.warning('No permissions for user %s to edit comment_id: %s',
765 769 self._rhodecode_db_user, comment_id)
766 770 raise HTTPNotFound()
767 771
768 772 @LoginRequired()
769 773 @HasRepoPermissionAnyDecorator(
770 774 'repository.read', 'repository.write', 'repository.admin')
771 775 def repo_commit_data(self):
772 776 commit_id = self.request.matchdict['commit_id']
773 777 self.load_default_context()
774 778
775 779 try:
776 780 return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
777 781 except CommitDoesNotExistError as e:
778 782 return EmptyCommit(message=str(e))
779 783
780 784 @LoginRequired()
781 785 @HasRepoPermissionAnyDecorator(
782 786 'repository.read', 'repository.write', 'repository.admin')
783 787 def repo_commit_children(self):
784 788 commit_id = self.request.matchdict['commit_id']
785 789 self.load_default_context()
786 790
787 791 try:
788 792 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
789 793 children = commit.children
790 794 except CommitDoesNotExistError:
791 795 children = []
792 796
793 797 result = {"results": children}
794 798 return result
795 799
796 800 @LoginRequired()
797 801 @HasRepoPermissionAnyDecorator(
798 802 'repository.read', 'repository.write', 'repository.admin')
799 803 def repo_commit_parents(self):
800 804 commit_id = self.request.matchdict['commit_id']
801 805 self.load_default_context()
802 806
803 807 try:
804 808 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
805 809 parents = commit.parents
806 810 except CommitDoesNotExistError:
807 811 parents = []
808 812 result = {"results": parents}
809 813 return result
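
The `commit_ids` handling in `repo_commit_comment_create` above (split a comma-separated POST field, drop blanks and the empty-commit sentinel, then deduplicate while preserving first-seen order) can be sketched standalone. The function name and the 40-zero sentinel standing in for `EmptyCommit.raw_id` are illustrative assumptions, not RhodeCode API:

```python
# Sentinel for "no commit"; Git/Mercurial use a 40-hex-zero null id,
# which stands in here for EmptyCommit.raw_id.
EMPTY_COMMIT_ID = '0' * 40


def parse_commit_ids(raw, fallback_id):
    """Split ``raw`` on commas, drop blanks and the empty-commit sentinel,
    and deduplicate while preserving first-seen order. Falls back to
    ``fallback_id`` when nothing usable remains."""
    seen = []
    for commit_id in raw.split(','):
        if commit_id in ('', None, EMPTY_COMMIT_ID):
            continue  # skip blanks and the null-commit sentinel
        if commit_id not in seen:
            seen.append(commit_id)  # keep first occurrence only
    return seen or [fallback_id]
```

A list (rather than a set) is used so the order in which commit ids were submitted is preserved, matching the view's behavior.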
@@ -1,206 +1,210 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2017-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20 import pytz
21 21 import logging
22 22
23 23 from pyramid.response import Response
24 24
25 25 from rhodecode.apps._base import RepoAppView
26 26 from rhodecode.lib.feedgenerator import Rss201rev2Feed, Atom1Feed
27 27 from rhodecode.lib import audit_logger
28 28 from rhodecode.lib import rc_cache
29 29 from rhodecode.lib import helpers as h
30 30 from rhodecode.lib.auth import (
31 31 LoginRequired, HasRepoPermissionAnyDecorator)
32 32 from rhodecode.lib.diffs import DiffProcessor, LimitedDiffContainer
33 33 from rhodecode.lib.utils2 import str2bool, safe_int, md5_safe
34 34 from rhodecode.model.db import UserApiKeys, CacheKey
35 35
36 36 log = logging.getLogger(__name__)
37 37
38 38
39 39 class RepoFeedView(RepoAppView):
40 40 def load_default_context(self):
41 41 c = self._get_local_tmpl_context()
42 42 self._load_defaults()
43 43 return c
44 44
45 45 def _get_config(self):
46 46 import rhodecode
47 47 config = rhodecode.CONFIG
48 48
49 49 return {
50 50 'language': 'en-us',
51 51 'feed_ttl': '5', # TTL of feed,
52 52 'feed_include_diff':
53 53 str2bool(config.get('rss_include_diff', False)),
54 54 'feed_items_per_page':
55 55 safe_int(config.get('rss_items_per_page', 20)),
56 56 'feed_diff_limit':
57 57 # we need to protect from parsing huge diffs here, otherwise
58 58 # we can kill the server
59 59 safe_int(config.get('rss_cut_off_limit', 32 * 1024)),
60 60 }
61 61
62 62 def _load_defaults(self):
63 63 _ = self.request.translate
64 64 config = self._get_config()
65 65 # common values for feeds
66 66 self.description = _('Changes on %s repository')
67 67 self.title = _('%s %s feed') % (self.db_repo_name, '%s')
68 68 self.language = config["language"]
69 69 self.ttl = config["feed_ttl"]
70 70 self.feed_include_diff = config['feed_include_diff']
71 71 self.feed_diff_limit = config['feed_diff_limit']
72 72 self.feed_items_per_page = config['feed_items_per_page']
73 73
74 74 def _changes(self, commit):
75 75 diff_processor = DiffProcessor(
76 76 commit.diff(), diff_limit=self.feed_diff_limit)
77 77 _parsed = diff_processor.prepare(inline_diff=False)
78 78 limited_diff = isinstance(_parsed, LimitedDiffContainer)
79 79
80 80 return diff_processor, _parsed, limited_diff
81 81
82 82 def _get_title(self, commit):
83 83 return h.chop_at_smart(commit.message, '\n', suffix_if_chopped='...')
84 84
85 85 def _get_description(self, commit):
86 86 _renderer = self.request.get_partial_renderer(
87 87 'rhodecode:templates/feed/atom_feed_entry.mako')
88 88 diff_processor, parsed_diff, limited_diff = self._changes(commit)
89 89 filtered_parsed_diff, has_hidden_changes = self.path_filter.filter_patchset(parsed_diff)
90 90 return _renderer(
91 91 'body',
92 92 commit=commit,
93 93 parsed_diff=filtered_parsed_diff,
94 94 limited_diff=limited_diff,
95 95 feed_include_diff=self.feed_include_diff,
96 96 diff_processor=diff_processor,
97 97 has_hidden_changes=has_hidden_changes
98 98 )
99 99
100 100 def _set_timezone(self, date, tzinfo=pytz.utc):
101 101 if not getattr(date, "tzinfo", None):
102 102 date = date.replace(tzinfo=tzinfo)
103 103 return date
104 104
105 105 def _get_commits(self):
106 106 pre_load = ['author', 'branch', 'date', 'message', 'parents']
107 if self.rhodecode_vcs_repo.is_empty():
108 return []
109
107 110 collection = self.rhodecode_vcs_repo.get_commits(
108 111 branch_name=None, show_hidden=False, pre_load=pre_load,
109 112 translate_tags=False)
110 113
111 114 return list(collection[-self.feed_items_per_page:])
112 115
113 116 def uid(self, repo_id, commit_id):
114 117 return '{}:{}'.format(md5_safe(repo_id), md5_safe(commit_id))
115 118
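
The `uid` helper above builds feed item ids by joining two md5 digests. A minimal sketch using the stdlib `hashlib` in place of RhodeCode's `md5_safe` (which additionally handles unicode-safe encoding); `feed_uid` is an illustrative name:

```python
import hashlib


def feed_uid(repo_id, commit_id):
    """Build a stable feed item uid from a repo id and a commit id."""
    def md5(value):
        # md5_safe in rhodecode also coerces unicode safely; str() + utf-8
        # encoding is a simplified stand-in for that behavior.
        return hashlib.md5(str(value).encode('utf-8')).hexdigest()

    return '{}:{}'.format(md5(repo_id), md5(commit_id))
```

The digests keep the uid opaque and of fixed length regardless of how long the repo name or commit hash is.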
116 119 @LoginRequired(auth_token_access=[UserApiKeys.ROLE_FEED])
117 120 @HasRepoPermissionAnyDecorator(
118 121 'repository.read', 'repository.write', 'repository.admin')
119 122 def atom(self):
120 123 """
121 124 Produce an Atom 1.0 feed via the feedgenerator module
122 125 """
123 126 self.load_default_context()
124 127 force_recache = self.get_recache_flag()
125 128
126 129 cache_namespace_uid = 'cache_repo_feed.{}'.format(self.db_repo.repo_id)
127 130 condition = not (self.path_filter.is_enabled or force_recache)
128 131 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
129 132
130 133 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid,
131 134 condition=condition)
132 135 def generate_atom_feed(repo_id, _repo_name, _commit_id, _feed_type):
133 136 feed = Atom1Feed(
134 137 title=self.title % 'atom',
135 138 link=h.route_url('repo_summary', repo_name=_repo_name),
136 139 description=self.description % _repo_name,
137 140 language=self.language,
138 141 ttl=self.ttl
139 142 )
143
140 144 for commit in reversed(self._get_commits()):
141 145 date = self._set_timezone(commit.date)
142 146 feed.add_item(
143 147 unique_id=self.uid(repo_id, commit.raw_id),
144 148 title=self._get_title(commit),
145 149 author_name=commit.author,
146 150 description=self._get_description(commit),
147 151 link=h.route_url(
148 152 'repo_commit', repo_name=_repo_name,
149 153 commit_id=commit.raw_id),
150 154 pubdate=date,)
151 155
152 156 return feed.content_type, feed.writeString('utf-8')
153 157
154 158 commit_id = self.db_repo.changeset_cache.get('raw_id')
155 159 content_type, feed = generate_atom_feed(
156 160 self.db_repo.repo_id, self.db_repo.repo_name, commit_id, 'atom')
157 161
158 162 response = Response(feed)
159 163 response.content_type = content_type
160 164 return response
161 165
162 166 @LoginRequired(auth_token_access=[UserApiKeys.ROLE_FEED])
163 167 @HasRepoPermissionAnyDecorator(
164 168 'repository.read', 'repository.write', 'repository.admin')
165 169 def rss(self):
166 170 """
167 171 Produce an RSS 2.0 feed via the feedgenerator module
168 172 """
169 173 self.load_default_context()
170 174 force_recache = self.get_recache_flag()
171 175
172 176 cache_namespace_uid = 'cache_repo_feed.{}'.format(self.db_repo.repo_id)
173 177 condition = not (self.path_filter.is_enabled or force_recache)
174 178 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
175 179
176 180 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid,
177 181 condition=condition)
178 182 def generate_rss_feed(repo_id, _repo_name, _commit_id, _feed_type):
179 183 feed = Rss201rev2Feed(
180 184 title=self.title % 'rss',
181 185 link=h.route_url('repo_summary', repo_name=_repo_name),
182 186 description=self.description % _repo_name,
183 187 language=self.language,
184 188 ttl=self.ttl
185 189 )
186 190
187 191 for commit in reversed(self._get_commits()):
188 192 date = self._set_timezone(commit.date)
189 193 feed.add_item(
190 194 unique_id=self.uid(repo_id, commit.raw_id),
191 195 title=self._get_title(commit),
192 196 author_name=commit.author,
193 197 description=self._get_description(commit),
194 198 link=h.route_url(
195 199 'repo_commit', repo_name=_repo_name,
196 200 commit_id=commit.raw_id),
197 201 pubdate=date,)
198 202 return feed.content_type, feed.writeString('utf-8')
199 203
200 204 commit_id = self.db_repo.changeset_cache.get('raw_id')
201 205 content_type, feed = generate_rss_feed(
202 206 self.db_repo.repo_id, self.db_repo.repo_name, commit_id, 'rss')
203 207
204 208 response = Response(feed)
205 209 response.content_type = content_type
206 210 return response
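
The `_set_timezone` helper above depends on the fact that `datetime.replace` returns a new object rather than mutating in place, so its result must be reassigned. A minimal sketch of the intended normalization, using the stdlib `timezone.utc` instead of `pytz.utc` to stay self-contained:

```python
from datetime import datetime, timezone


def set_timezone(date, tzinfo=timezone.utc):
    """Attach ``tzinfo`` to a naive datetime; aware datetimes pass through.

    datetime objects are immutable, so ``replace`` returns a new instance
    and the result must be assigned back.
    """
    if date.tzinfo is None:
        date = date.replace(tzinfo=tzinfo)
    return date
```

Feed generators need aware datetimes so `pubdate` serializes with an explicit offset; commits from the VCS layer may come through naive, hence the normalization.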
@@ -1,1574 +1,1581 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import itertools
22 22 import logging
23 23 import os
24 24 import shutil
25 25 import tempfile
26 26 import collections
27 27 import urllib
28 28 import pathlib2
29 29
30 30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
31 31
32 32 from pyramid.renderers import render
33 33 from pyramid.response import Response
34 34
35 35 import rhodecode
36 36 from rhodecode.apps._base import RepoAppView
37 37
38 38
39 39 from rhodecode.lib import diffs, helpers as h, rc_cache
40 40 from rhodecode.lib import audit_logger
41 41 from rhodecode.lib.view_utils import parse_path_ref
42 42 from rhodecode.lib.exceptions import NonRelativePathError
43 43 from rhodecode.lib.codeblocks import (
44 44 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
45 45 from rhodecode.lib.utils2 import (
46 46 convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode)
47 47 from rhodecode.lib.auth import (
48 48 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
49 49 from rhodecode.lib.vcs import path as vcspath
50 50 from rhodecode.lib.vcs.backends.base import EmptyCommit
51 51 from rhodecode.lib.vcs.conf import settings
52 52 from rhodecode.lib.vcs.nodes import FileNode
53 53 from rhodecode.lib.vcs.exceptions import (
54 54 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
55 55 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
56 56 NodeDoesNotExistError, CommitError, NodeError)
57 57
58 58 from rhodecode.model.scm import ScmModel
59 59 from rhodecode.model.db import Repository
60 60
61 61 log = logging.getLogger(__name__)
62 62
63 63
64 64 class RepoFilesView(RepoAppView):
65 65
66 66 @staticmethod
67 67 def adjust_file_path_for_svn(f_path, repo):
68 68 """
69 69 Computes the relative path of `f_path`.
70 70
71 71 This is mainly based on prefix matching of the recognized tags and
72 72 branches in the underlying repository.
73 73 """
74 74 tags_and_branches = itertools.chain(
75 75 repo.branches.iterkeys(),
76 76 repo.tags.iterkeys())
77 77 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
78 78
79 79 for name in tags_and_branches:
80 80 if f_path.startswith('{}/'.format(name)):
81 81 f_path = vcspath.relpath(f_path, name)
82 82 break
83 83 return f_path
84 84
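
The longest-prefix matching in `adjust_file_path_for_svn` above can be sketched standalone: candidate names are tried longest-first so a nested name such as `branches/stable` wins over the shorter `branches`. Here `posixpath.relpath` stands in for the `rhodecode.lib.vcs.path` helper, and the function name is illustrative:

```python
import posixpath


def strip_ref_prefix(f_path, ref_names):
    """Strip the longest matching branch/tag prefix from ``f_path``."""
    # sort longest-first so nested ref names take priority over their parents
    for name in sorted(ref_names, key=len, reverse=True):
        if f_path.startswith(name + '/'):
            return posixpath.relpath(f_path, name)
    return f_path  # no recognized prefix; path is already relative
```

The trailing `'/'` in the prefix check prevents a ref named `branch` from matching a path like `branchlist/file.txt`.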
85 85 def load_default_context(self):
86 86 c = self._get_local_tmpl_context(include_app_defaults=True)
87 87 c.rhodecode_repo = self.rhodecode_vcs_repo
88 88 c.enable_downloads = self.db_repo.enable_downloads
89 89 return c
90 90
91 91 def _ensure_not_locked(self, commit_id='tip'):
92 92 _ = self.request.translate
93 93
94 94 repo = self.db_repo
95 95 if repo.enable_locking and repo.locked[0]:
96 96 h.flash(_('This repository has been locked by %s on %s')
97 97 % (h.person_by_id(repo.locked[0]),
98 98 h.format_date(h.time_to_datetime(repo.locked[1]))),
99 99 'warning')
100 100 files_url = h.route_path(
101 101 'repo_files:default_path',
102 102 repo_name=self.db_repo_name, commit_id=commit_id)
103 103 raise HTTPFound(files_url)
104 104
105 105 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
106 106 _ = self.request.translate
107 107
108 108 if not is_head:
109 109 message = _('Cannot modify file. '
110 110 'Given commit `{}` is not the head of a branch.').format(commit_id)
111 111 h.flash(message, category='warning')
112 112
113 113 if json_mode:
114 114 return message
115 115
116 116 files_url = h.route_path(
117 117 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
118 118 f_path=f_path)
119 119 raise HTTPFound(files_url)
120 120
121 121 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
122 122 _ = self.request.translate
123 123
124 124 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
125 125 self.db_repo_name, branch_name)
126 126 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
127 127 message = _('Branch `{}` changes forbidden by rule {}.').format(
128 128 h.escape(branch_name), h.escape(rule))
129 129 h.flash(message, 'warning')
130 130
131 131 if json_mode:
132 132 return message
133 133
134 134 files_url = h.route_path(
135 135 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
136 136
137 137 raise HTTPFound(files_url)
138 138
139 139 def _get_commit_and_path(self):
140 140 default_commit_id = self.db_repo.landing_ref_name
141 141 default_f_path = '/'
142 142
143 143 commit_id = self.request.matchdict.get(
144 144 'commit_id', default_commit_id)
145 145 f_path = self._get_f_path(self.request.matchdict, default_f_path)
146 146 return commit_id, f_path
147 147
148 148 def _get_default_encoding(self, c):
149 149 enc_list = getattr(c, 'default_encodings', [])
150 150 return enc_list[0] if enc_list else 'UTF-8'
151 151
152 152 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
153 153 """
154 154 This is a safe way to get a commit. If an error occurs, it redirects
155 155 to tip with a proper message.
156 156
157 157 :param commit_id: id of commit to fetch
158 158 :param redirect_after: toggle redirection
159 159 """
160 160 _ = self.request.translate
161 161
162 162 try:
163 163 return self.rhodecode_vcs_repo.get_commit(commit_id)
164 164 except EmptyRepositoryError:
165 165 if not redirect_after:
166 166 return None
167 167
168 168 _url = h.route_path(
169 169 'repo_files_add_file',
170 170 repo_name=self.db_repo_name, commit_id=0, f_path='')
171 171
172 172 if h.HasRepoPermissionAny(
173 173 'repository.write', 'repository.admin')(self.db_repo_name):
174 174 add_new = h.link_to(
175 175 _('Click here to add a new file.'), _url, class_="alert-link")
176 176 else:
177 177 add_new = ""
178 178
179 179 h.flash(h.literal(
180 180 _('There are no files yet. %s') % add_new), category='warning')
181 181 raise HTTPFound(
182 182 h.route_path('repo_summary', repo_name=self.db_repo_name))
183 183
184 184 except (CommitDoesNotExistError, LookupError) as e:
185 185 msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
186 186 h.flash(msg, category='error')
187 187 raise HTTPNotFound()
188 188 except RepositoryError as e:
189 189 h.flash(safe_str(h.escape(e)), category='error')
190 190 raise HTTPNotFound()
191 191
192 192 def _get_filenode_or_redirect(self, commit_obj, path):
193 193 """
194 194         Returns file_node. If an error occurs or the given path is a
195 195         directory, it redirects to the top-level path.
196 196 """
197 197 _ = self.request.translate
198 198
199 199 try:
200 200 file_node = commit_obj.get_node(path)
201 201 if file_node.is_dir():
202 202 raise RepositoryError('The given path is a directory')
203 203 except CommitDoesNotExistError:
204 204 log.exception('No such commit exists for this repository')
205 205 h.flash(_('No such commit exists for this repository'), category='error')
206 206 raise HTTPNotFound()
207 207 except RepositoryError as e:
208 208 log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
209 209 h.flash(safe_str(h.escape(e)), category='error')
210 210 raise HTTPNotFound()
211 211
212 212 return file_node
213 213
214 214 def _is_valid_head(self, commit_id, repo, landing_ref):
215 215 branch_name = sha_commit_id = ''
216 216 is_head = False
217 217 log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)
218 218
219 219 for _branch_name, branch_commit_id in repo.branches.items():
220 220 # simple case we pass in branch name, it's a HEAD
221 221 if commit_id == _branch_name:
222 222 is_head = True
223 223 branch_name = _branch_name
224 224 sha_commit_id = branch_commit_id
225 225 break
226 226 # case when we pass in full sha commit_id, which is a head
227 227 elif commit_id == branch_commit_id:
228 228 is_head = True
229 229 branch_name = _branch_name
230 230 sha_commit_id = branch_commit_id
231 231 break
232 232
233 233 if h.is_svn(repo) and not repo.is_empty():
234 234 # Note: Subversion only has one head.
235 235 if commit_id == repo.get_commit(commit_idx=-1).raw_id:
236 236 is_head = True
237 237 return branch_name, sha_commit_id, is_head
238 238
239 239         # branches checked; we only need to resolve the branch/commit_sha
240 240 if repo.is_empty():
241 241 is_head = True
242 242 branch_name = landing_ref
243 243 sha_commit_id = EmptyCommit().raw_id
244 244 else:
245 245 commit = repo.get_commit(commit_id=commit_id)
246 246 if commit:
247 247 branch_name = commit.branch
248 248 sha_commit_id = commit.raw_id
249 249
250 250 return branch_name, sha_commit_id, is_head
251 251
252 252 def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):
253 253
254 254 repo_id = self.db_repo.repo_id
255 255 force_recache = self.get_recache_flag()
256 256
257 257 cache_seconds = safe_int(
258 258 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
259 259 cache_on = not force_recache and cache_seconds > 0
260 260 log.debug(
261 261 'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s`'
262 262             ' with caching: %s [TTL: %ss]' % (
263 263 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
264 264
265 265 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
266 266 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
267 267
268 268 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
269 269 def compute_file_tree(ver, _name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
270 270 log.debug('Generating cached file tree at ver:%s for repo_id: %s, %s, %s',
271 271 ver, _repo_id, _commit_id, _f_path)
272 272
273 273 c.full_load = _full_load
274 274 return render(
275 275 'rhodecode:templates/files/files_browser_tree.mako',
276 276 self._get_template_context(c), self.request, _at_rev)
277 277
278 278 return compute_file_tree(
279 279 rc_cache.FILE_TREE_CACHE_VER, self.db_repo.repo_name_hash,
280 280 self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)
281 281
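The `_get_tree_at_commit` helper above wraps a conditional, versioned cache: the rendered tree is memoized only when caching is enabled, and a version argument (`rc_cache.FILE_TREE_CACHE_VER`) is part of the cache key so bumping it invalidates every cached tree at once. A minimal self-contained sketch of that pattern (the decorator and names here are illustrative, not the actual `rc_cache` API):

```python
import functools

def conditional_cache_on_arguments(condition):
    """Toy stand-in for region.conditional_cache_on_arguments."""
    def wrap(fn):
        cache = {}
        @functools.wraps(fn)
        def inner(*args):
            if not condition:
                return fn(*args)  # caching disabled: always recompute
            if args not in cache:
                cache[args] = fn(*args)
            return cache[args]
        return inner
    return wrap

FILE_TREE_CACHE_VER = 1
calls = []  # track real computations

@conditional_cache_on_arguments(condition=True)
def compute_file_tree(ver, repo_id, commit_id, f_path):
    calls.append(1)
    return 'tree:{}:{}'.format(commit_id, f_path)

compute_file_tree(FILE_TREE_CACHE_VER, 5, 'abc123', '/')
compute_file_tree(FILE_TREE_CACHE_VER, 5, 'abc123', '/')      # served from cache
compute_file_tree(FILE_TREE_CACHE_VER + 1, 5, 'abc123', '/')  # version bump recomputes
```

Because the version is an ordinary key argument, invalidation never has to walk the cache; stale entries simply stop being addressed.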
282 282 def _get_archive_spec(self, fname):
283 283 log.debug('Detecting archive spec for: `%s`', fname)
284 284
285 285 fileformat = None
286 286 ext = None
287 287 content_type = None
288 288 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
289 289
290 290 if fname.endswith(extension):
291 291 fileformat = a_type
292 292 log.debug('archive is of type: %s', fileformat)
293 293 ext = extension
294 294 break
295 295
296 296 if not fileformat:
297 297 raise ValueError()
298 298
299 299         # the leftover part of fname is the commit id
300 300 commit_id = fname[:-len(ext)]
301 301
302 302 return commit_id, ext, fileformat, content_type
303 303
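The extension matching in `_get_archive_spec` can be exercised standalone. The spec tuples below only mirror the assumed `(type, content_type, extension)` shape of `settings.ARCHIVE_SPECS`:

```python
# Sketch of _get_archive_spec: match the filename suffix against known
# archive extensions and treat the remaining prefix as the commit id.
ARCHIVE_SPECS = [
    ('tbz2', 'application/x-bzip2', '.tar.bz2'),
    ('tgz', 'application/x-gzip', '.tar.gz'),
    ('zip', 'application/zip', '.zip'),
]

def get_archive_spec(fname):
    for a_type, content_type, extension in ARCHIVE_SPECS:
        if fname.endswith(extension):
            # everything before the extension is the commit id
            return fname[:-len(extension)], extension, a_type, content_type
    raise ValueError('unknown archive type: {}'.format(fname))

print(get_archive_spec('abcdef12.tar.gz'))
```

Multi-part extensions like `.tar.gz` work because the match is a suffix test, not a `splitext` call.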
304 304 def create_pure_path(self, *parts):
305 305 # Split paths and sanitize them, removing any ../ etc
306 306 sanitized_path = [
307 307 x for x in pathlib2.PurePath(*parts).parts
308 308 if x not in ['.', '..']]
309 309
310 310 pure_path = pathlib2.PurePath(*sanitized_path)
311 311 return pure_path
312 312
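The sanitization in `create_pure_path` is what blocks directory traversal: any `.` or `..` component is dropped before the parts are rejoined. A standalone sketch (plain `pathlib` stands in for the `pathlib2` backport used by the view):

```python
import pathlib

def create_pure_path(*parts):
    # Split the incoming parts into components and drop '.' and '..'
    # so a crafted path like '../../etc/passwd' cannot escape the root.
    sanitized = [
        p for p in pathlib.PurePath(*parts).parts
        if p not in ('.', '..')]
    return pathlib.PurePath(*sanitized)

print(create_pure_path('docs', '..', '..', 'etc', 'passwd').parts)
```

Note this silently drops the traversal components rather than rejecting the request, so `../../etc/passwd` degrades to `docs/etc/passwd`-style lookups inside the repository.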
313 313 def _is_lf_enabled(self, target_repo):
314 314 lf_enabled = False
315 315
316 316 lf_key_for_vcs_map = {
317 317 'hg': 'extensions_largefiles',
318 318 'git': 'vcs_git_lfs_enabled'
319 319 }
320 320
321 321 lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)
322 322
323 323 if lf_key_for_vcs:
324 324 lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)
325 325
326 326 return lf_enabled
327 327
328 def _get_archive_name(self, db_repo_name, commit_sha, ext, subrepos=False, path_sha=''):
328 def _get_archive_name(self, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
329 329 # original backward compat name of archive
330 330 clean_name = safe_str(db_repo_name.replace('/', '_'))
331 331
332 332 # e.g vcsserver.zip
333 333 # e.g vcsserver-abcdefgh.zip
334 334 # e.g vcsserver-abcdefgh-defghijk.zip
335 archive_name = '{}{}{}{}{}'.format(
335 archive_name = '{}{}{}{}{}{}'.format(
336 336 clean_name,
337 337 '-sub' if subrepos else '',
338 338 commit_sha,
339 '-{}'.format('plain') if not with_hash else '',
339 340 '-{}'.format(path_sha) if path_sha else '',
340 341 ext)
341 342 return archive_name
342 343
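The naming scheme the new `with_hash` flag feeds into can be shown in isolation. This is a sketch mirroring `_get_archive_name`, not the method itself; note the caller passes `commit_sha` with its leading dash already attached:

```python
def get_archive_name(repo_name, commit_sha, ext, subrepos=False,
                     path_sha='', with_hash=True):
    # pattern: <repo>(-sub)?<sha>(-plain)?(-<path_sha>)?<ext>
    clean_name = repo_name.replace('/', '_')
    return '{}{}{}{}{}{}'.format(
        clean_name,
        '-sub' if subrepos else '',
        commit_sha,
        '-plain' if not with_hash else '',
        '-{}'.format(path_sha) if path_sha else '',
        ext)

print(get_archive_name('grp/vcsserver', '-abcdefgh', '.zip'))
print(get_archive_name('vcsserver', '', '.zip', with_hash=False))
```

The `-plain` marker is what gives non-hashed archives a distinct cache entry, fixing the caching issue mentioned in the 4.25.0 release notes.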
343 344 @LoginRequired()
344 345 @HasRepoPermissionAnyDecorator(
345 346 'repository.read', 'repository.write', 'repository.admin')
346 347 def repo_archivefile(self):
347 348 # archive cache config
348 349 from rhodecode import CONFIG
349 350 _ = self.request.translate
350 351 self.load_default_context()
351 352 default_at_path = '/'
352 353 fname = self.request.matchdict['fname']
353 354 subrepos = self.request.GET.get('subrepos') == 'true'
354 355 with_hash = str2bool(self.request.GET.get('with_hash', '1'))
355 356 at_path = self.request.GET.get('at_path') or default_at_path
356 357
357 358 if not self.db_repo.enable_downloads:
358 359 return Response(_('Downloads disabled'))
359 360
360 361 try:
361 362 commit_id, ext, fileformat, content_type = \
362 363 self._get_archive_spec(fname)
363 364 except ValueError:
364 365 return Response(_('Unknown archive type for: `{}`').format(
365 366 h.escape(fname)))
366 367
367 368 try:
368 369 commit = self.rhodecode_vcs_repo.get_commit(commit_id)
369 370 except CommitDoesNotExistError:
370 371 return Response(_('Unknown commit_id {}').format(
371 372 h.escape(commit_id)))
372 373 except EmptyRepositoryError:
373 374 return Response(_('Empty repository'))
374 375
376 # we used a ref, or a shorter version; redirect the client to the explicit hash
377 if commit_id != commit.raw_id:
378 fname = '{}{}'.format(commit.raw_id, ext)
379 raise HTTPFound(self.request.current_route_path(fname=fname))
380
375 381 try:
376 382 at_path = commit.get_node(at_path).path or default_at_path
377 383 except Exception:
378 384 return Response(_('No node at path {} for this repository').format(at_path))
379 385
380 386 # path sha is part of subdir
381 387 path_sha = ''
382 388 if at_path != default_at_path:
383 389 path_sha = sha1(at_path)[:8]
384 390 short_sha = '-{}'.format(safe_str(commit.short_id))
385 391 # used for cache etc
386 392 archive_name = self._get_archive_name(
387 393 self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
388 path_sha=path_sha)
394 path_sha=path_sha, with_hash=with_hash)
389 395
390 396 if not with_hash:
391 397 short_sha = ''
392 398 path_sha = ''
393 399
394 400 # what end client gets served
395 401 response_archive_name = self._get_archive_name(
396 402 self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
397 path_sha=path_sha)
403 path_sha=path_sha, with_hash=with_hash)
398 404 # remove extension from our archive directory name
399 405 archive_dir_name = response_archive_name[:-len(ext)]
400 406
401 407 use_cached_archive = False
402 408 archive_cache_dir = CONFIG.get('archive_cache_dir')
403 409 archive_cache_enabled = archive_cache_dir and not self.request.GET.get('no_cache')
404 410 cached_archive_path = None
405 411
406 412 if archive_cache_enabled:
407 # check if we it's ok to write
413 # check if it's ok to write, and re-create the archive cache
408 414 if not os.path.isdir(CONFIG['archive_cache_dir']):
409 415 os.makedirs(CONFIG['archive_cache_dir'])
416
410 417 cached_archive_path = os.path.join(
411 418 CONFIG['archive_cache_dir'], archive_name)
412 419 if os.path.isfile(cached_archive_path):
413 420 log.debug('Found cached archive in %s', cached_archive_path)
414 421 fd, archive = None, cached_archive_path
415 422 use_cached_archive = True
416 423 else:
417 424 log.debug('Archive %s is not yet cached', archive_name)
418 425
419 426 # generate new archive, as previous was not found in the cache
420 427 if not use_cached_archive:
421 428 _dir = os.path.abspath(archive_cache_dir) if archive_cache_dir else None
422 429 fd, archive = tempfile.mkstemp(dir=_dir)
423 430 log.debug('Creating new temp archive in %s', archive)
424 431 try:
425 432 commit.archive_repo(archive, archive_dir_name=archive_dir_name,
426 433 kind=fileformat, subrepos=subrepos,
427 434 archive_at_path=at_path)
428 435 except ImproperArchiveTypeError:
429 436 return _('Unknown archive type')
430 437 if archive_cache_enabled:
431 438 # if we generated the archive and we have cache enabled
432 439 # let's use this for future
433 440 log.debug('Storing new archive in %s', cached_archive_path)
434 441 shutil.move(archive, cached_archive_path)
435 442 archive = cached_archive_path
436 443
437 444 # store download action
438 445 audit_logger.store_web(
439 446 'repo.archive.download', action_data={
440 447 'user_agent': self.request.user_agent,
441 448 'archive_name': archive_name,
442 449 'archive_spec': fname,
443 450 'archive_cached': use_cached_archive},
444 451 user=self._rhodecode_user,
445 452 repo=self.db_repo,
446 453 commit=True
447 454 )
448 455
449 456 def get_chunked_archive(archive_path):
450 457 with open(archive_path, 'rb') as stream:
451 458 while True:
452 459 data = stream.read(16 * 1024)
453 460 if not data:
454 461 if fd: # fd means we used temporary file
455 462 os.close(fd)
456 463 if not archive_cache_enabled:
457 464 log.debug('Destroying temp archive %s', archive_path)
458 465 os.remove(archive_path)
459 466 break
460 467 yield data
461 468
462 469 response = Response(app_iter=get_chunked_archive(archive))
463 470 response.content_disposition = str('attachment; filename=%s' % response_archive_name)
464 471 response.content_type = str(content_type)
465 472
466 473 return response
467 474
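The streaming body built above reduces to a small generator: read the archive in 16 KiB chunks so large downloads never load fully into memory, and clean up afterwards when the file was a temporary one. A simplified, self-contained sketch (cleanup is a plain flag here instead of the view's `fd`/cache checks):

```python
import os

def get_chunked_archive(archive_path, chunk_size=16 * 1024, cleanup=False):
    # Yield the file in fixed-size chunks; optionally remove it afterwards,
    # as the view does for non-cached temporary archives.
    with open(archive_path, 'rb') as stream:
        while True:
            data = stream.read(chunk_size)
            if not data:
                break
            yield data
    if cleanup:
        os.remove(archive_path)
```

Passing such a generator as `app_iter` lets the WSGI server stream the response chunk by chunk instead of buffering the whole archive.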
468 475 def _get_file_node(self, commit_id, f_path):
469 476 if commit_id not in ['', None, 'None', '0' * 12, '0' * 40]:
470 477 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
471 478 try:
472 479 node = commit.get_node(f_path)
473 480 if node.is_dir():
474 481                     raise NodeError('%s path is a %s, not a file'
475 482 % (node, type(node)))
476 483 except NodeDoesNotExistError:
477 484 commit = EmptyCommit(
478 485 commit_id=commit_id,
479 486 idx=commit.idx,
480 487 repo=commit.repository,
481 488 alias=commit.repository.alias,
482 489 message=commit.message,
483 490 author=commit.author,
484 491 date=commit.date)
485 492 node = FileNode(f_path, '', commit=commit)
486 493 else:
487 494 commit = EmptyCommit(
488 495 repo=self.rhodecode_vcs_repo,
489 496 alias=self.rhodecode_vcs_repo.alias)
490 497 node = FileNode(f_path, '', commit=commit)
491 498 return node
492 499
493 500 @LoginRequired()
494 501 @HasRepoPermissionAnyDecorator(
495 502 'repository.read', 'repository.write', 'repository.admin')
496 503 def repo_files_diff(self):
497 504 c = self.load_default_context()
498 505 f_path = self._get_f_path(self.request.matchdict)
499 506 diff1 = self.request.GET.get('diff1', '')
500 507 diff2 = self.request.GET.get('diff2', '')
501 508
502 509 path1, diff1 = parse_path_ref(diff1, default_path=f_path)
503 510
504 511 ignore_whitespace = str2bool(self.request.GET.get('ignorews'))
505 512 line_context = self.request.GET.get('context', 3)
506 513
507 514 if not any((diff1, diff2)):
508 515 h.flash(
509 516 'Need query parameter "diff1" or "diff2" to generate a diff.',
510 517 category='error')
511 518 raise HTTPBadRequest()
512 519
513 520 c.action = self.request.GET.get('diff')
514 521 if c.action not in ['download', 'raw']:
515 522 compare_url = h.route_path(
516 523 'repo_compare',
517 524 repo_name=self.db_repo_name,
518 525 source_ref_type='rev',
519 526 source_ref=diff1,
520 527 target_repo=self.db_repo_name,
521 528 target_ref_type='rev',
522 529 target_ref=diff2,
523 530 _query=dict(f_path=f_path))
524 531 # redirect to new view if we render diff
525 532 raise HTTPFound(compare_url)
526 533
527 534 try:
528 535 node1 = self._get_file_node(diff1, path1)
529 536 node2 = self._get_file_node(diff2, f_path)
530 537 except (RepositoryError, NodeError):
531 538 log.exception("Exception while trying to get node from repository")
532 539 raise HTTPFound(
533 540 h.route_path('repo_files', repo_name=self.db_repo_name,
534 541 commit_id='tip', f_path=f_path))
535 542
536 543 if all(isinstance(node.commit, EmptyCommit)
537 544 for node in (node1, node2)):
538 545 raise HTTPNotFound()
539 546
540 547 c.commit_1 = node1.commit
541 548 c.commit_2 = node2.commit
542 549
543 550 if c.action == 'download':
544 551 _diff = diffs.get_gitdiff(node1, node2,
545 552 ignore_whitespace=ignore_whitespace,
546 553 context=line_context)
547 554 diff = diffs.DiffProcessor(_diff, format='gitdiff')
548 555
549 556 response = Response(self.path_filter.get_raw_patch(diff))
550 557 response.content_type = 'text/plain'
551 558 response.content_disposition = (
552 559 'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2)
553 560 )
554 561 charset = self._get_default_encoding(c)
555 562 if charset:
556 563 response.charset = charset
557 564 return response
558 565
559 566 elif c.action == 'raw':
560 567 _diff = diffs.get_gitdiff(node1, node2,
561 568 ignore_whitespace=ignore_whitespace,
562 569 context=line_context)
563 570 diff = diffs.DiffProcessor(_diff, format='gitdiff')
564 571
565 572 response = Response(self.path_filter.get_raw_patch(diff))
566 573 response.content_type = 'text/plain'
567 574 charset = self._get_default_encoding(c)
568 575 if charset:
569 576 response.charset = charset
570 577 return response
571 578
572 579 # in case we ever end up here
573 580 raise HTTPNotFound()
574 581
575 582 @LoginRequired()
576 583 @HasRepoPermissionAnyDecorator(
577 584 'repository.read', 'repository.write', 'repository.admin')
578 585 def repo_files_diff_2way_redirect(self):
579 586 """
580 587 Kept only to make OLD links work
581 588 """
582 589 f_path = self._get_f_path_unchecked(self.request.matchdict)
583 590 diff1 = self.request.GET.get('diff1', '')
584 591 diff2 = self.request.GET.get('diff2', '')
585 592
586 593 if not any((diff1, diff2)):
587 594 h.flash(
588 595 'Need query parameter "diff1" or "diff2" to generate a diff.',
589 596 category='error')
590 597 raise HTTPBadRequest()
591 598
592 599 compare_url = h.route_path(
593 600 'repo_compare',
594 601 repo_name=self.db_repo_name,
595 602 source_ref_type='rev',
596 603 source_ref=diff1,
597 604 target_ref_type='rev',
598 605 target_ref=diff2,
599 606 _query=dict(f_path=f_path, diffmode='sideside',
600 607 target_repo=self.db_repo_name,))
601 608 raise HTTPFound(compare_url)
602 609
603 610 @LoginRequired()
604 611 def repo_files_default_commit_redirect(self):
605 612 """
606 613 Special page that redirects to the landing page of files based on the default
607 614 commit for repository
608 615 """
609 616 c = self.load_default_context()
610 617 ref_name = c.rhodecode_db_repo.landing_ref_name
611 618 landing_url = h.repo_files_by_ref_url(
612 619 c.rhodecode_db_repo.repo_name,
613 620 c.rhodecode_db_repo.repo_type,
614 621 f_path='',
615 622 ref_name=ref_name,
616 623 commit_id='tip',
617 624 query=dict(at=ref_name)
618 625 )
619 626
620 627 raise HTTPFound(landing_url)
621 628
622 629 @LoginRequired()
623 630 @HasRepoPermissionAnyDecorator(
624 631 'repository.read', 'repository.write', 'repository.admin')
625 632 def repo_files(self):
626 633 c = self.load_default_context()
627 634
628 635 view_name = getattr(self.request.matched_route, 'name', None)
629 636
630 637 c.annotate = view_name == 'repo_files:annotated'
631 638         # default is false, but .rst/.md files are auto-rendered later; the
632 639         # auto rendering can be disabled by setting this GET flag
633 640 c.renderer = view_name == 'repo_files:rendered' or \
634 641 not self.request.GET.get('no-render', False)
635 642
636 643 commit_id, f_path = self._get_commit_and_path()
637 644
638 645 c.commit = self._get_commit_or_redirect(commit_id)
639 646 c.branch = self.request.GET.get('branch', None)
640 647 c.f_path = f_path
641 648 at_rev = self.request.GET.get('at')
642 649
643 650 # prev link
644 651 try:
645 652 prev_commit = c.commit.prev(c.branch)
646 653 c.prev_commit = prev_commit
647 654 c.url_prev = h.route_path(
648 655 'repo_files', repo_name=self.db_repo_name,
649 656 commit_id=prev_commit.raw_id, f_path=f_path)
650 657 if c.branch:
651 658 c.url_prev += '?branch=%s' % c.branch
652 659 except (CommitDoesNotExistError, VCSError):
653 660 c.url_prev = '#'
654 661 c.prev_commit = EmptyCommit()
655 662
656 663 # next link
657 664 try:
658 665 next_commit = c.commit.next(c.branch)
659 666 c.next_commit = next_commit
660 667 c.url_next = h.route_path(
661 668 'repo_files', repo_name=self.db_repo_name,
662 669 commit_id=next_commit.raw_id, f_path=f_path)
663 670 if c.branch:
664 671 c.url_next += '?branch=%s' % c.branch
665 672 except (CommitDoesNotExistError, VCSError):
666 673 c.url_next = '#'
667 674 c.next_commit = EmptyCommit()
668 675
669 676 # files or dirs
670 677 try:
671 678 c.file = c.commit.get_node(f_path)
672 679 c.file_author = True
673 680 c.file_tree = ''
674 681
675 682 # load file content
676 683 if c.file.is_file():
677 684 c.lf_node = {}
678 685
679 686 has_lf_enabled = self._is_lf_enabled(self.db_repo)
680 687 if has_lf_enabled:
681 688 c.lf_node = c.file.get_largefile_node()
682 689
683 690 c.file_source_page = 'true'
684 691 c.file_last_commit = c.file.last_commit
685 692
686 693 c.file_size_too_big = c.file.size > c.visual.cut_off_limit_file
687 694
688 695 if not (c.file_size_too_big or c.file.is_binary):
689 696 if c.annotate: # annotation has precedence over renderer
690 697 c.annotated_lines = filenode_as_annotated_lines_tokens(
691 698 c.file
692 699 )
693 700 else:
694 701 c.renderer = (
695 702 c.renderer and h.renderer_from_filename(c.file.path)
696 703 )
697 704 if not c.renderer:
698 705 c.lines = filenode_as_lines_tokens(c.file)
699 706
700 707 _branch_name, _sha_commit_id, is_head = \
701 708 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
702 709 landing_ref=self.db_repo.landing_ref_name)
703 710 c.on_branch_head = is_head
704 711
705 712 branch = c.commit.branch if (
706 713 c.commit.branch and '/' not in c.commit.branch) else None
707 714 c.branch_or_raw_id = branch or c.commit.raw_id
708 715 c.branch_name = c.commit.branch or h.short_id(c.commit.raw_id)
709 716
710 717 author = c.file_last_commit.author
711 718 c.authors = [[
712 719 h.email(author),
713 720 h.person(author, 'username_or_name_or_email'),
714 721 1
715 722 ]]
716 723
717 724 else: # load tree content at path
718 725 c.file_source_page = 'false'
719 726 c.authors = []
720 727 # this loads a simple tree without metadata to speed things up
721 728                 # later via ajax we call repo_nodetree_full and fetch the full tree
722 729 c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path, at_rev=at_rev)
723 730
724 731 c.readme_data, c.readme_file = \
725 732 self._get_readme_data(self.db_repo, c.visual.default_renderer,
726 733 c.commit.raw_id, f_path)
727 734
728 735 except RepositoryError as e:
729 736 h.flash(safe_str(h.escape(e)), category='error')
730 737 raise HTTPNotFound()
731 738
732 739 if self.request.environ.get('HTTP_X_PJAX'):
733 740 html = render('rhodecode:templates/files/files_pjax.mako',
734 741 self._get_template_context(c), self.request)
735 742 else:
736 743 html = render('rhodecode:templates/files/files.mako',
737 744 self._get_template_context(c), self.request)
738 745 return Response(html)
739 746
740 747 @HasRepoPermissionAnyDecorator(
741 748 'repository.read', 'repository.write', 'repository.admin')
742 749 def repo_files_annotated_previous(self):
743 750 self.load_default_context()
744 751
745 752 commit_id, f_path = self._get_commit_and_path()
746 753 commit = self._get_commit_or_redirect(commit_id)
747 754 prev_commit_id = commit.raw_id
748 755 line_anchor = self.request.GET.get('line_anchor')
749 756 is_file = False
750 757 try:
751 758 _file = commit.get_node(f_path)
752 759 is_file = _file.is_file()
753 760 except (NodeDoesNotExistError, CommitDoesNotExistError, VCSError):
754 761 pass
755 762
756 763 if is_file:
757 764 history = commit.get_path_history(f_path)
758 765 prev_commit_id = history[1].raw_id \
759 766 if len(history) > 1 else prev_commit_id
760 767 prev_url = h.route_path(
761 768 'repo_files:annotated', repo_name=self.db_repo_name,
762 769 commit_id=prev_commit_id, f_path=f_path,
763 770 _anchor='L{}'.format(line_anchor))
764 771
765 772 raise HTTPFound(prev_url)
766 773
767 774 @LoginRequired()
768 775 @HasRepoPermissionAnyDecorator(
769 776 'repository.read', 'repository.write', 'repository.admin')
770 777 def repo_nodetree_full(self):
771 778 """
772 779 Returns rendered html of file tree that contains commit date,
773 780 author, commit_id for the specified combination of
774 781 repo, commit_id and file path
775 782 """
776 783 c = self.load_default_context()
777 784
778 785 commit_id, f_path = self._get_commit_and_path()
779 786 commit = self._get_commit_or_redirect(commit_id)
780 787 try:
781 788 dir_node = commit.get_node(f_path)
782 789 except RepositoryError as e:
783 790 return Response('error: {}'.format(h.escape(safe_str(e))))
784 791
785 792 if dir_node.is_file():
786 793 return Response('')
787 794
788 795 c.file = dir_node
789 796 c.commit = commit
790 797 at_rev = self.request.GET.get('at')
791 798
792 799 html = self._get_tree_at_commit(
793 800 c, commit.raw_id, dir_node.path, full_load=True, at_rev=at_rev)
794 801
795 802 return Response(html)
796 803
797 804 def _get_attachement_headers(self, f_path):
798 805 f_name = safe_str(f_path.split(Repository.NAME_SEP)[-1])
799 806 safe_path = f_name.replace('"', '\\"')
800 807 encoded_path = urllib.quote(f_name)
801 808
802 809 return "attachment; " \
803 810 "filename=\"{}\"; " \
804 811 "filename*=UTF-8\'\'{}".format(safe_path, encoded_path)
805 812
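A py3 sketch of the header built by `_get_attachement_headers` (the view itself runs on py2 and uses `urllib.quote`; `'/'` stands in for `Repository.NAME_SEP`). The plain `filename` keeps old browsers working while the RFC 5987 `filename*` carries non-ASCII names safely:

```python
from urllib.parse import quote

def attachment_headers(f_path):
    f_name = f_path.split('/')[-1]
    safe_path = f_name.replace('"', '\\"')  # keep the quoted-string valid
    encoded_path = quote(f_name)            # percent-encode for filename*
    return 'attachment; filename="{}"; filename*=UTF-8\'\'{}'.format(
        safe_path, encoded_path)

print(attachment_headers('docs/report.txt'))
```

Browsers that understand RFC 5987 prefer the `filename*` parameter; older ones fall back to the escaped `filename`.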
806 813 @LoginRequired()
807 814 @HasRepoPermissionAnyDecorator(
808 815 'repository.read', 'repository.write', 'repository.admin')
809 816 def repo_file_raw(self):
810 817 """
811 818         Action for "show as raw"; some mimetypes, such as images and
812 819         icons, are rendered inline.
813 820 """
814 821 c = self.load_default_context()
815 822
816 823 commit_id, f_path = self._get_commit_and_path()
817 824 commit = self._get_commit_or_redirect(commit_id)
818 825 file_node = self._get_filenode_or_redirect(commit, f_path)
819 826
820 827 raw_mimetype_mapping = {
821 828 # map original mimetype to a mimetype used for "show as raw"
822 829 # you can also provide a content-disposition to override the
823 830 # default "attachment" disposition.
824 831 # orig_type: (new_type, new_dispo)
825 832
826 833 # show images inline:
827 834 # Do not re-add SVG: it is unsafe and permits XSS attacks. One can
828 835 # for example render an SVG with javascript inside or even render
829 836 # HTML.
830 837 'image/x-icon': ('image/x-icon', 'inline'),
831 838 'image/png': ('image/png', 'inline'),
832 839 'image/gif': ('image/gif', 'inline'),
833 840 'image/jpeg': ('image/jpeg', 'inline'),
834 841 'application/pdf': ('application/pdf', 'inline'),
835 842 }
836 843
837 844 mimetype = file_node.mimetype
838 845 try:
839 846 mimetype, disposition = raw_mimetype_mapping[mimetype]
840 847 except KeyError:
841 848 # we don't know anything special about this, handle it safely
842 849 if file_node.is_binary:
843 850 # do same as download raw for binary files
844 851 mimetype, disposition = 'application/octet-stream', 'attachment'
845 852 else:
846 853 # do not just use the original mimetype, but force text/plain,
847 854 # otherwise it would serve text/html and that might be unsafe.
848 855 # Note: underlying vcs library fakes text/plain mimetype if the
849 856             # mimetype cannot be determined and it thinks it is not
850 857             # binary. This might lead to erroneous text display in some
851 858 # cases, but helps in other cases, like with text files
852 859 # without extension.
853 860 mimetype, disposition = 'text/plain', 'inline'
854 861
855 862 if disposition == 'attachment':
856 863 disposition = self._get_attachement_headers(f_path)
857 864
858 865 stream_content = file_node.stream_bytes()
859 866
860 867 response = Response(app_iter=stream_content)
861 868 response.content_disposition = disposition
862 869 response.content_type = mimetype
863 870
864 871 charset = self._get_default_encoding(c)
865 872 if charset:
866 873 response.charset = charset
867 874
868 875 return response
869 876
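The mimetype decision in `repo_file_raw` follows a strict whitelist, which is the XSS defense the comments describe: only known-safe types render inline, unknown binaries download as `application/octet-stream`, and everything else is forced to `text/plain` so the browser never interprets repository content as HTML or SVG. A condensed sketch of that decision (mapping trimmed for brevity):

```python
RAW_MIMETYPE_MAPPING = {
    # orig_type: (served_type, disposition); SVG deliberately absent
    'image/png': ('image/png', 'inline'),
    'image/jpeg': ('image/jpeg', 'inline'),
    'application/pdf': ('application/pdf', 'inline'),
}

def raw_serving_headers(mimetype, is_binary):
    try:
        return RAW_MIMETYPE_MAPPING[mimetype]
    except KeyError:
        if is_binary:
            # unknown binary: force a download
            return 'application/octet-stream', 'attachment'
        # unknown text: never serve as text/html
        return 'text/plain', 'inline'

print(raw_serving_headers('text/html', False))  # ('text/plain', 'inline')
```

Denying by default means a new, unvetted mimetype can at worst be mis-displayed as plain text, never executed.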
870 877 @LoginRequired()
871 878 @HasRepoPermissionAnyDecorator(
872 879 'repository.read', 'repository.write', 'repository.admin')
873 880 def repo_file_download(self):
874 881 c = self.load_default_context()
875 882
876 883 commit_id, f_path = self._get_commit_and_path()
877 884 commit = self._get_commit_or_redirect(commit_id)
878 885 file_node = self._get_filenode_or_redirect(commit, f_path)
879 886
880 887 if self.request.GET.get('lf'):
881 888 # only if lf get flag is passed, we download this file
882 889 # as LFS/Largefile
883 890 lf_node = file_node.get_largefile_node()
884 891 if lf_node:
885 892 # overwrite our pointer with the REAL large-file
886 893 file_node = lf_node
887 894
888 895 disposition = self._get_attachement_headers(f_path)
889 896
890 897 stream_content = file_node.stream_bytes()
891 898
892 899 response = Response(app_iter=stream_content)
893 900 response.content_disposition = disposition
894 901 response.content_type = file_node.mimetype
895 902
896 903 charset = self._get_default_encoding(c)
897 904 if charset:
898 905 response.charset = charset
899 906
900 907 return response
901 908
902 909 def _get_nodelist_at_commit(self, repo_name, repo_id, commit_id, f_path):
903 910
904 911 cache_seconds = safe_int(
905 912 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
906 913 cache_on = cache_seconds > 0
907 914 log.debug(
908 915 'Computing FILE SEARCH for repo_id %s commit_id `%s` and path `%s`'
909 916             ' with caching: %s [TTL: %ss]' % (
910 917 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
911 918
912 919 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
913 920 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
914 921
915 922 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
916 923 def compute_file_search(_name_hash, _repo_id, _commit_id, _f_path):
917 924 log.debug('Generating cached nodelist for repo_id:%s, %s, %s',
918 925                       _repo_id, _commit_id, _f_path)
919 926 try:
920 927 _d, _f = ScmModel().get_quick_filter_nodes(repo_name, _commit_id, _f_path)
921 928 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
922 929 log.exception(safe_str(e))
923 930 h.flash(safe_str(h.escape(e)), category='error')
924 931 raise HTTPFound(h.route_path(
925 932 'repo_files', repo_name=self.db_repo_name,
926 933 commit_id='tip', f_path='/'))
927 934
928 935 return _d + _f
929 936
930 937 result = compute_file_search(self.db_repo.repo_name_hash, self.db_repo.repo_id,
931 938 commit_id, f_path)
932 939 return filter(lambda n: self.path_filter.path_access_allowed(n['name']), result)
933 940
934 941 @LoginRequired()
935 942 @HasRepoPermissionAnyDecorator(
936 943 'repository.read', 'repository.write', 'repository.admin')
937 944 def repo_nodelist(self):
938 945 self.load_default_context()
939 946
940 947 commit_id, f_path = self._get_commit_and_path()
941 948 commit = self._get_commit_or_redirect(commit_id)
942 949
943 950 metadata = self._get_nodelist_at_commit(
944 951 self.db_repo_name, self.db_repo.repo_id, commit.raw_id, f_path)
945 952 return {'nodes': metadata}
946 953
947 954 def _create_references(self, branches_or_tags, symbolic_reference, f_path, ref_type):
948 955 items = []
949 956 for name, commit_id in branches_or_tags.items():
950 957 sym_ref = symbolic_reference(commit_id, name, f_path, ref_type)
951 958 items.append((sym_ref, name, ref_type))
952 959 return items
953 960
954 961 def _symbolic_reference(self, commit_id, name, f_path, ref_type):
955 962 return commit_id
956 963
957 964 def _symbolic_reference_svn(self, commit_id, name, f_path, ref_type):
958 965 return commit_id
959 966
960 967 # NOTE(dan): old code we used in "diff" mode compare
961 968 new_f_path = vcspath.join(name, f_path)
962 969 return u'%s@%s' % (new_f_path, commit_id)
963 970
964 971 def _get_node_history(self, commit_obj, f_path, commits=None):
965 972 """
966 973 get commit history for given node
967 974
968 975 :param commit_obj: commit to calculate history
969 976 :param f_path: path for node to calculate history for
970 977 :param commits: if passed don't calculate history and take
971 978 commits defined in this list
972 979 """
973 980 _ = self.request.translate
974 981
975 982 # calculate history based on tip
976 983 tip = self.rhodecode_vcs_repo.get_commit()
977 984 if commits is None:
978 985 pre_load = ["author", "branch"]
979 986 try:
980 987 commits = tip.get_path_history(f_path, pre_load=pre_load)
981 988 except (NodeDoesNotExistError, CommitError):
982 989 # this node is not present at tip!
983 990 commits = commit_obj.get_path_history(f_path, pre_load=pre_load)
984 991
985 992 history = []
986 993 commits_group = ([], _("Changesets"))
987 994 for commit in commits:
988 995 branch = ' (%s)' % commit.branch if commit.branch else ''
989 996 n_desc = 'r%s:%s%s' % (commit.idx, commit.short_id, branch)
990 997 commits_group[0].append((commit.raw_id, n_desc, 'sha'))
991 998 history.append(commits_group)
992 999
993 1000 symbolic_reference = self._symbolic_reference
994 1001
995 1002 if self.rhodecode_vcs_repo.alias == 'svn':
996 1003 adjusted_f_path = RepoFilesView.adjust_file_path_for_svn(
997 1004 f_path, self.rhodecode_vcs_repo)
998 1005 if adjusted_f_path != f_path:
999 1006 log.debug(
1000 1007 'Recognized svn tag or branch in file "%s", using svn '
1001 1008 'specific symbolic references', f_path)
1002 1009 f_path = adjusted_f_path
1003 1010 symbolic_reference = self._symbolic_reference_svn
1004 1011
1005 1012 branches = self._create_references(
1006 1013 self.rhodecode_vcs_repo.branches, symbolic_reference, f_path, 'branch')
1007 1014 branches_group = (branches, _("Branches"))
1008 1015
1009 1016 tags = self._create_references(
1010 1017 self.rhodecode_vcs_repo.tags, symbolic_reference, f_path, 'tag')
1011 1018 tags_group = (tags, _("Tags"))
1012 1019
1013 1020 history.append(branches_group)
1014 1021 history.append(tags_group)
1015 1022
1016 1023 return history, commits
1017 1024
1018 1025 @LoginRequired()
1019 1026 @HasRepoPermissionAnyDecorator(
1020 1027 'repository.read', 'repository.write', 'repository.admin')
1021 1028 def repo_file_history(self):
1022 1029 self.load_default_context()
1023 1030
1024 1031 commit_id, f_path = self._get_commit_and_path()
1025 1032 commit = self._get_commit_or_redirect(commit_id)
1026 1033 file_node = self._get_filenode_or_redirect(commit, f_path)
1027 1034
1028 1035 if file_node.is_file():
1029 1036 file_history, _hist = self._get_node_history(commit, f_path)
1030 1037
1031 1038 res = []
1032 1039 for section_items, section in file_history:
1033 1040 items = []
1034 1041 for obj_id, obj_text, obj_type in section_items:
1035 1042 at_rev = ''
1036 1043 if obj_type in ['branch', 'bookmark', 'tag']:
1037 1044 at_rev = obj_text
1038 1045 entry = {
1039 1046 'id': obj_id,
1040 1047 'text': obj_text,
1041 1048 'type': obj_type,
1042 1049 'at_rev': at_rev
1043 1050 }
1044 1051
1045 1052 items.append(entry)
1046 1053
1047 1054 res.append({
1048 1055 'text': section,
1049 1056 'children': items
1050 1057 })
1051 1058
1052 1059 data = {
1053 1060 'more': False,
1054 1061 'results': res
1055 1062 }
1056 1063 return data
1057 1064
1058 1065 log.warning('Cannot fetch history for directory')
1059 1066 raise HTTPBadRequest()
1060 1067
1061 1068 @LoginRequired()
1062 1069 @HasRepoPermissionAnyDecorator(
1063 1070 'repository.read', 'repository.write', 'repository.admin')
1064 1071 def repo_file_authors(self):
1065 1072 c = self.load_default_context()
1066 1073
1067 1074 commit_id, f_path = self._get_commit_and_path()
1068 1075 commit = self._get_commit_or_redirect(commit_id)
1069 1076 file_node = self._get_filenode_or_redirect(commit, f_path)
1070 1077
1071 1078 if not file_node.is_file():
1072 1079 raise HTTPBadRequest()
1073 1080
1074 1081 c.file_last_commit = file_node.last_commit
1075 1082 if self.request.GET.get('annotate') == '1':
1076 1083 # use _hist from annotation if annotation mode is on
1077 1084 commit_ids = set(x[1] for x in file_node.annotate)
1078 1085 _hist = (
1079 1086 self.rhodecode_vcs_repo.get_commit(commit_id)
1080 1087 for commit_id in commit_ids)
1081 1088 else:
1082 1089 _f_history, _hist = self._get_node_history(commit, f_path)
1083 1090 c.file_author = False
1084 1091
1085 1092 unique = collections.OrderedDict()
1086 1093 for commit in _hist:
1087 1094 author = commit.author
1088 1095 if author not in unique:
1089 1096 unique[commit.author] = [
1090 1097 h.email(author),
1091 1098 h.person(author, 'username_or_name_or_email'),
1092 1099 1 # counter
1093 1100 ]
1094 1101
1095 1102 else:
1096 1103 # increase counter
1097 1104 unique[commit.author][2] += 1
1098 1105
1099 1106 c.authors = [val for val in unique.values()]
1100 1107
1101 1108 return self._get_template_context(c)
1102 1109
1103 1110 @LoginRequired()
1104 1111 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1105 1112 def repo_files_check_head(self):
1106 1113 self.load_default_context()
1107 1114
1108 1115 commit_id, f_path = self._get_commit_and_path()
1109 1116 _branch_name, _sha_commit_id, is_head = \
1110 1117 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1111 1118 landing_ref=self.db_repo.landing_ref_name)
1112 1119
1113 1120 new_path = self.request.POST.get('path')
1114 1121 operation = self.request.POST.get('operation')
1115 1122 path_exist = ''
1116 1123
1117 1124 if new_path and operation in ['create', 'upload']:
1118 1125 new_f_path = os.path.join(f_path.lstrip('/'), new_path)
1119 1126 try:
1120 1127 commit_obj = self.rhodecode_vcs_repo.get_commit(commit_id)
1121 1128 # NOTE(dan): construct whole path without leading /
1122 1129 file_node = commit_obj.get_node(new_f_path)
1123 1130 if file_node is not None:
1124 1131 path_exist = new_f_path
1125 1132 except EmptyRepositoryError:
1126 1133 pass
1127 1134 except Exception:
1128 1135 pass
1129 1136
1130 1137 return {
1131 1138 'branch': _branch_name,
1132 1139 'sha': _sha_commit_id,
1133 1140 'is_head': is_head,
1134 1141 'path_exists': path_exist
1135 1142 }
1136 1143
1137 1144 @LoginRequired()
1138 1145 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1139 1146 def repo_files_remove_file(self):
1140 1147 _ = self.request.translate
1141 1148 c = self.load_default_context()
1142 1149 commit_id, f_path = self._get_commit_and_path()
1143 1150
1144 1151 self._ensure_not_locked()
1145 1152 _branch_name, _sha_commit_id, is_head = \
1146 1153 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1147 1154 landing_ref=self.db_repo.landing_ref_name)
1148 1155
1149 1156 self.forbid_non_head(is_head, f_path)
1150 1157 self.check_branch_permission(_branch_name)
1151 1158
1152 1159 c.commit = self._get_commit_or_redirect(commit_id)
1153 1160 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1154 1161
1155 1162 c.default_message = _(
1156 1163 'Deleted file {} via RhodeCode Enterprise').format(f_path)
1157 1164 c.f_path = f_path
1158 1165
1159 1166 return self._get_template_context(c)
1160 1167
1161 1168 @LoginRequired()
1162 1169 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1163 1170 @CSRFRequired()
1164 1171 def repo_files_delete_file(self):
1165 1172 _ = self.request.translate
1166 1173
1167 1174 c = self.load_default_context()
1168 1175 commit_id, f_path = self._get_commit_and_path()
1169 1176
1170 1177 self._ensure_not_locked()
1171 1178 _branch_name, _sha_commit_id, is_head = \
1172 1179 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1173 1180 landing_ref=self.db_repo.landing_ref_name)
1174 1181
1175 1182 self.forbid_non_head(is_head, f_path)
1176 1183 self.check_branch_permission(_branch_name)
1177 1184
1178 1185 c.commit = self._get_commit_or_redirect(commit_id)
1179 1186 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1180 1187
1181 1188 c.default_message = _(
1182 1189 'Deleted file {} via RhodeCode Enterprise').format(f_path)
1183 1190 c.f_path = f_path
1184 1191 node_path = f_path
1185 1192 author = self._rhodecode_db_user.full_contact
1186 1193 message = self.request.POST.get('message') or c.default_message
1187 1194 try:
1188 1195 nodes = {
1189 1196 node_path: {
1190 1197 'content': ''
1191 1198 }
1192 1199 }
1193 1200 ScmModel().delete_nodes(
1194 1201 user=self._rhodecode_db_user.user_id, repo=self.db_repo,
1195 1202 message=message,
1196 1203 nodes=nodes,
1197 1204 parent_commit=c.commit,
1198 1205 author=author,
1199 1206 )
1200 1207
1201 1208 h.flash(
1202 1209 _('Successfully deleted file `{}`').format(
1203 1210 h.escape(f_path)), category='success')
1204 1211 except Exception:
1205 1212 log.exception('Error during commit operation')
1206 1213 h.flash(_('Error occurred during commit'), category='error')
1207 1214 raise HTTPFound(
1208 1215 h.route_path('repo_commit', repo_name=self.db_repo_name,
1209 1216 commit_id='tip'))
1210 1217
1211 1218 @LoginRequired()
1212 1219 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1213 1220 def repo_files_edit_file(self):
1214 1221 _ = self.request.translate
1215 1222 c = self.load_default_context()
1216 1223 commit_id, f_path = self._get_commit_and_path()
1217 1224
1218 1225 self._ensure_not_locked()
1219 1226 _branch_name, _sha_commit_id, is_head = \
1220 1227 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1221 1228 landing_ref=self.db_repo.landing_ref_name)
1222 1229
1223 1230 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1224 1231 self.check_branch_permission(_branch_name, commit_id=commit_id)
1225 1232
1226 1233 c.commit = self._get_commit_or_redirect(commit_id)
1227 1234 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1228 1235
1229 1236 if c.file.is_binary:
1230 1237 files_url = h.route_path(
1231 1238 'repo_files',
1232 1239 repo_name=self.db_repo_name,
1233 1240 commit_id=c.commit.raw_id, f_path=f_path)
1234 1241 raise HTTPFound(files_url)
1235 1242
1236 1243 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1237 1244 c.f_path = f_path
1238 1245
1239 1246 return self._get_template_context(c)
1240 1247
1241 1248 @LoginRequired()
1242 1249 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1243 1250 @CSRFRequired()
1244 1251 def repo_files_update_file(self):
1245 1252 _ = self.request.translate
1246 1253 c = self.load_default_context()
1247 1254 commit_id, f_path = self._get_commit_and_path()
1248 1255
1249 1256 self._ensure_not_locked()
1250 1257
1251 1258 c.commit = self._get_commit_or_redirect(commit_id)
1252 1259 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1253 1260
1254 1261 if c.file.is_binary:
1255 1262 raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name,
1256 1263 commit_id=c.commit.raw_id, f_path=f_path))
1257 1264
1258 1265 _branch_name, _sha_commit_id, is_head = \
1259 1266 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1260 1267 landing_ref=self.db_repo.landing_ref_name)
1261 1268
1262 1269 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1263 1270 self.check_branch_permission(_branch_name, commit_id=commit_id)
1264 1271
1265 1272 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1266 1273 c.f_path = f_path
1267 1274
1268 1275 old_content = c.file.content
1269 1276 sl = old_content.splitlines(1)
1270 1277 first_line = sl[0] if sl else ''
1271 1278
1272 1279 r_post = self.request.POST
1273 1280 # line endings: 0 - Unix, 1 - Mac, 2 - DOS
1274 1281 line_ending_mode = detect_mode(first_line, 0)
1275 1282 content = convert_line_endings(r_post.get('content', ''), line_ending_mode)
1276 1283
1277 1284 message = r_post.get('message') or c.default_message
1278 1285 org_node_path = c.file.unicode_path
1279 1286 filename = r_post['filename']
1280 1287
1281 1288 root_path = c.file.dir_path
1282 1289 pure_path = self.create_pure_path(root_path, filename)
1283 1290 node_path = safe_unicode(bytes(pure_path))
1284 1291
1285 1292 default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name,
1286 1293 commit_id=commit_id)
1287 1294 if content == old_content and node_path == org_node_path:
1288 1295 h.flash(_('No changes detected on {}').format(h.escape(org_node_path)),
1289 1296 category='warning')
1290 1297 raise HTTPFound(default_redirect_url)
1291 1298
1292 1299 try:
1293 1300 mapping = {
1294 1301 org_node_path: {
1295 1302 'org_filename': org_node_path,
1296 1303 'filename': node_path,
1297 1304 'content': content,
1298 1305 'lexer': '',
1299 1306 'op': 'mod',
1300 1307 'mode': c.file.mode
1301 1308 }
1302 1309 }
1303 1310
1304 1311 commit = ScmModel().update_nodes(
1305 1312 user=self._rhodecode_db_user.user_id,
1306 1313 repo=self.db_repo,
1307 1314 message=message,
1308 1315 nodes=mapping,
1309 1316 parent_commit=c.commit,
1310 1317 )
1311 1318
1312 1319 h.flash(_('Successfully committed changes to file `{}`').format(
1313 1320 h.escape(f_path)), category='success')
1314 1321 default_redirect_url = h.route_path(
1315 1322 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1316 1323
1317 1324 except Exception:
1318 1325 log.exception('Error occurred during commit')
1319 1326 h.flash(_('Error occurred during commit'), category='error')
1320 1327
1321 1328 raise HTTPFound(default_redirect_url)
1322 1329
1323 1330 @LoginRequired()
1324 1331 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1325 1332 def repo_files_add_file(self):
1326 1333 _ = self.request.translate
1327 1334 c = self.load_default_context()
1328 1335 commit_id, f_path = self._get_commit_and_path()
1329 1336
1330 1337 self._ensure_not_locked()
1331 1338
1332 1339 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1333 1340 if c.commit is None:
1334 1341 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1335 1342
1336 1343 if self.rhodecode_vcs_repo.is_empty():
1337 1344 # for empty repository we cannot check for current branch, we rely on
1338 1345 # c.commit.branch instead
1339 1346 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1340 1347 else:
1341 1348 _branch_name, _sha_commit_id, is_head = \
1342 1349 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1343 1350 landing_ref=self.db_repo.landing_ref_name)
1344 1351
1345 1352 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1346 1353 self.check_branch_permission(_branch_name, commit_id=commit_id)
1347 1354
1348 1355 c.default_message = (_('Added file via RhodeCode Enterprise'))
1349 1356 c.f_path = f_path.lstrip('/') # ensure not relative path
1350 1357
1351 1358 return self._get_template_context(c)
1352 1359
1353 1360 @LoginRequired()
1354 1361 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1355 1362 @CSRFRequired()
1356 1363 def repo_files_create_file(self):
1357 1364 _ = self.request.translate
1358 1365 c = self.load_default_context()
1359 1366 commit_id, f_path = self._get_commit_and_path()
1360 1367
1361 1368 self._ensure_not_locked()
1362 1369
1363 1370 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1364 1371 if c.commit is None:
1365 1372 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1366 1373
1367 1374 # calculate redirect URL
1368 1375 if self.rhodecode_vcs_repo.is_empty():
1369 1376 default_redirect_url = h.route_path(
1370 1377 'repo_summary', repo_name=self.db_repo_name)
1371 1378 else:
1372 1379 default_redirect_url = h.route_path(
1373 1380 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1374 1381
1375 1382 if self.rhodecode_vcs_repo.is_empty():
1376 1383 # for empty repository we cannot check for current branch, we rely on
1377 1384 # c.commit.branch instead
1378 1385 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1379 1386 else:
1380 1387 _branch_name, _sha_commit_id, is_head = \
1381 1388 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1382 1389 landing_ref=self.db_repo.landing_ref_name)
1383 1390
1384 1391 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1385 1392 self.check_branch_permission(_branch_name, commit_id=commit_id)
1386 1393
1387 1394 c.default_message = (_('Added file via RhodeCode Enterprise'))
1388 1395 c.f_path = f_path
1389 1396
1390 1397 r_post = self.request.POST
1391 1398 message = r_post.get('message') or c.default_message
1392 1399 filename = r_post.get('filename')
1393 1400 unix_mode = 0
1394 1401 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1395 1402
1396 1403 if not filename:
1397 1404 # If there's no commit, redirect to repo summary
1398 1405 if type(c.commit) is EmptyCommit:
1399 1406 redirect_url = h.route_path(
1400 1407 'repo_summary', repo_name=self.db_repo_name)
1401 1408 else:
1402 1409 redirect_url = default_redirect_url
1403 1410 h.flash(_('No filename specified'), category='warning')
1404 1411 raise HTTPFound(redirect_url)
1405 1412
1406 1413 root_path = f_path
1407 1414 pure_path = self.create_pure_path(root_path, filename)
1408 1415 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1409 1416
1410 1417 author = self._rhodecode_db_user.full_contact
1411 1418 nodes = {
1412 1419 node_path: {
1413 1420 'content': content
1414 1421 }
1415 1422 }
1416 1423
1417 1424 try:
1418 1425
1419 1426 commit = ScmModel().create_nodes(
1420 1427 user=self._rhodecode_db_user.user_id,
1421 1428 repo=self.db_repo,
1422 1429 message=message,
1423 1430 nodes=nodes,
1424 1431 parent_commit=c.commit,
1425 1432 author=author,
1426 1433 )
1427 1434
1428 1435 h.flash(_('Successfully committed new file `{}`').format(
1429 1436 h.escape(node_path)), category='success')
1430 1437
1431 1438 default_redirect_url = h.route_path(
1432 1439 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1433 1440
1434 1441 except NonRelativePathError:
1435 1442 log.exception('Non Relative path found')
1436 1443 h.flash(_('The location specified must be a relative path and must not '
1437 1444 'contain .. in the path'), category='warning')
1438 1445 raise HTTPFound(default_redirect_url)
1439 1446 except (NodeError, NodeAlreadyExistsError) as e:
1441 1448 h.flash(h.escape(e), category='error')
1441 1448 except Exception:
1442 1449 log.exception('Error occurred during commit')
1443 1450 h.flash(_('Error occurred during commit'), category='error')
1444 1451
1445 1452 raise HTTPFound(default_redirect_url)
1446 1453
1447 1454 @LoginRequired()
1448 1455 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1449 1456 @CSRFRequired()
1450 1457 def repo_files_upload_file(self):
1451 1458 _ = self.request.translate
1452 1459 c = self.load_default_context()
1453 1460 commit_id, f_path = self._get_commit_and_path()
1454 1461
1455 1462 self._ensure_not_locked()
1456 1463
1457 1464 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1458 1465 if c.commit is None:
1459 1466 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1460 1467
1461 1468 # calculate redirect URL
1462 1469 if self.rhodecode_vcs_repo.is_empty():
1463 1470 default_redirect_url = h.route_path(
1464 1471 'repo_summary', repo_name=self.db_repo_name)
1465 1472 else:
1466 1473 default_redirect_url = h.route_path(
1467 1474 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1468 1475
1469 1476 if self.rhodecode_vcs_repo.is_empty():
1470 1477 # for empty repository we cannot check for current branch, we rely on
1471 1478 # c.commit.branch instead
1472 1479 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1473 1480 else:
1474 1481 _branch_name, _sha_commit_id, is_head = \
1475 1482 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1476 1483 landing_ref=self.db_repo.landing_ref_name)
1477 1484
1478 1485 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1479 1486 if error:
1480 1487 return {
1481 1488 'error': error,
1482 1489 'redirect_url': default_redirect_url
1483 1490 }
1484 1491 error = self.check_branch_permission(_branch_name, json_mode=True)
1485 1492 if error:
1486 1493 return {
1487 1494 'error': error,
1488 1495 'redirect_url': default_redirect_url
1489 1496 }
1490 1497
1491 1498 c.default_message = (_('Uploaded file via RhodeCode Enterprise'))
1492 1499 c.f_path = f_path
1493 1500
1494 1501 r_post = self.request.POST
1495 1502
1496 1503 message = c.default_message
1497 1504 user_message = r_post.getall('message')
1498 1505 if isinstance(user_message, list) and user_message:
1499 1506 # we take the first from duplicated results if it's not empty
1500 1507 message = user_message[0] if user_message[0] else message
1501 1508
1502 1509 nodes = {}
1503 1510
1504 1511 for file_obj in r_post.getall('files_upload') or []:
1505 1512 content = file_obj.file
1506 1513 filename = file_obj.filename
1507 1514
1508 1515 root_path = f_path
1509 1516 pure_path = self.create_pure_path(root_path, filename)
1510 1517 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1511 1518
1512 1519 nodes[node_path] = {
1513 1520 'content': content
1514 1521 }
1515 1522
1516 1523 if not nodes:
1517 1524 error = 'missing files'
1518 1525 return {
1519 1526 'error': error,
1520 1527 'redirect_url': default_redirect_url
1521 1528 }
1522 1529
1523 1530 author = self._rhodecode_db_user.full_contact
1524 1531
1525 1532 try:
1526 1533 commit = ScmModel().create_nodes(
1527 1534 user=self._rhodecode_db_user.user_id,
1528 1535 repo=self.db_repo,
1529 1536 message=message,
1530 1537 nodes=nodes,
1531 1538 parent_commit=c.commit,
1532 1539 author=author,
1533 1540 )
1534 1541 if len(nodes) == 1:
1535 1542 flash_message = _('Successfully committed 1 new file')
1536 1543 else:
1537 1544 flash_message = _('Successfully committed {} new files').format(len(nodes))
1538 1545
1539 1546 h.flash(flash_message, category='success')
1540 1547
1541 1548 default_redirect_url = h.route_path(
1542 1549 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1543 1550
1544 1551 except NonRelativePathError:
1545 1552 log.exception('Non Relative path found')
1546 1553 error = _('The location specified must be a relative path and must not '
1547 1554 'contain .. in the path')
1548 1555 h.flash(error, category='warning')
1549 1556
1550 1557 return {
1551 1558 'error': error,
1552 1559 'redirect_url': default_redirect_url
1553 1560 }
1554 1561 except (NodeError, NodeAlreadyExistsError) as e:
1555 1562 error = h.escape(e)
1556 1563 h.flash(error, category='error')
1557 1564
1558 1565 return {
1559 1566 'error': error,
1560 1567 'redirect_url': default_redirect_url
1561 1568 }
1562 1569 except Exception:
1563 1570 log.exception('Error occurred during commit')
1564 1571 error = _('Error occurred during commit')
1565 1572 h.flash(error, category='error')
1566 1573 return {
1567 1574 'error': error,
1568 1575 'redirect_url': default_redirect_url
1569 1576 }
1570 1577
1571 1578 return {
1572 1579 'error': None,
1573 1580 'redirect_url': default_redirect_url
1574 1581 }
@@ -1,254 +1,254 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import datetime
23 23 import formencode
24 24 import formencode.htmlfill
25 25
26 26 from pyramid.httpexceptions import HTTPFound
27 27
28 28 from pyramid.renderers import render
29 29 from pyramid.response import Response
30 30
31 31 from rhodecode import events
32 32 from rhodecode.apps._base import RepoAppView, DataGridAppView
33 33 from rhodecode.lib.auth import (
34 34 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous,
35 35 HasRepoPermissionAny, HasPermissionAnyDecorator, CSRFRequired)
36 36 import rhodecode.lib.helpers as h
37 37 from rhodecode.lib.celerylib.utils import get_task_id
38 38 from rhodecode.model.db import coalesce, or_, Repository, RepoGroup
39 39 from rhodecode.model.permission import PermissionModel
40 40 from rhodecode.model.repo import RepoModel
41 41 from rhodecode.model.forms import RepoForkForm
42 42 from rhodecode.model.scm import ScmModel, RepoGroupList
43 43 from rhodecode.lib.utils2 import safe_int, safe_unicode
44 44
45 45 log = logging.getLogger(__name__)
46 46
47 47
48 48 class RepoForksView(RepoAppView, DataGridAppView):
49 49
50 50 def load_default_context(self):
51 51 c = self._get_local_tmpl_context(include_app_defaults=True)
52 52 c.rhodecode_repo = self.rhodecode_vcs_repo
53 53
54 54 acl_groups = RepoGroupList(
55 55 RepoGroup.query().all(),
56 56 perm_set=['group.write', 'group.admin'])
57 57 c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
58 58 c.repo_groups_choices = map(lambda k: safe_unicode(k[0]), c.repo_groups)
59 59
60 60 c.personal_repo_group = c.rhodecode_user.personal_repo_group
61 61
62 62 return c
63 63
64 64 @LoginRequired()
65 65 @HasRepoPermissionAnyDecorator(
66 66 'repository.read', 'repository.write', 'repository.admin')
67 67 def repo_forks_show_all(self):
68 68 c = self.load_default_context()
69 69 return self._get_template_context(c)
70 70
71 71 @LoginRequired()
72 72 @HasRepoPermissionAnyDecorator(
73 73 'repository.read', 'repository.write', 'repository.admin')
74 74 def repo_forks_data(self):
75 75 _ = self.request.translate
76 76 self.load_default_context()
77 77 column_map = {
78 78 'fork_name': 'repo_name',
79 79 'fork_date': 'created_on',
80 80 'last_activity': 'updated_on'
81 81 }
82 82 draw, start, limit = self._extract_chunk(self.request)
83 83 search_q, order_by, order_dir = self._extract_ordering(
84 84 self.request, column_map=column_map)
85 85
86 86 acl_check = HasRepoPermissionAny(
87 87 'repository.read', 'repository.write', 'repository.admin')
88 88 repo_id = self.db_repo.repo_id
89 89 allowed_ids = [-1]
90 90 for f in Repository.query().filter(Repository.fork_id == repo_id):
91 91 if acl_check(f.repo_name, 'get forks check'):
92 92 allowed_ids.append(f.repo_id)
93 93
94 94 forks_data_total_count = Repository.query()\
95 95 .filter(Repository.fork_id == repo_id)\
96 96 .filter(Repository.repo_id.in_(allowed_ids))\
97 97 .count()
98 98
99 99 # json generate
100 100 base_q = Repository.query()\
101 101 .filter(Repository.fork_id == repo_id)\
102 102 .filter(Repository.repo_id.in_(allowed_ids))\
103 103
104 104 if search_q:
105 105 like_expression = u'%{}%'.format(safe_unicode(search_q))
106 106 base_q = base_q.filter(or_(
107 107 Repository.repo_name.ilike(like_expression),
108 108 Repository.description.ilike(like_expression),
109 109 ))
110 110
111 111 forks_data_total_filtered_count = base_q.count()
112 112
113 113 sort_col = getattr(Repository, order_by, None)
114 114 if sort_col:
115 115 if order_dir == 'asc':
116 116 # handle null values properly to order by NULL last
117 117 if order_by in ['last_activity']:
118 118 sort_col = coalesce(sort_col, datetime.date.max)
119 119 sort_col = sort_col.asc()
120 120 else:
121 121 # handle null values properly to order by NULL last
122 122 if order_by in ['last_activity']:
123 123 sort_col = coalesce(sort_col, datetime.date.min)
124 124 sort_col = sort_col.desc()
125 125
126 126 base_q = base_q.order_by(sort_col)
127 127 base_q = base_q.offset(start).limit(limit)
128 128
129 129 fork_list = base_q.all()
130 130
131 131 def fork_actions(fork):
132 132 url_link = h.route_path(
133 133 'repo_compare',
134 134 repo_name=fork.repo_name,
135 135 source_ref_type=self.db_repo.landing_ref_type,
136 136 source_ref=self.db_repo.landing_ref_name,
137 137 target_ref_type=self.db_repo.landing_ref_type,
138 138 target_ref=self.db_repo.landing_ref_name,
139 139 _query=dict(merge=1, target_repo=fork.repo_name))
140 140 return h.link_to(_('Compare fork'), url_link, class_='btn-link')
141 141
142 142 def fork_name(fork):
143 143 return h.link_to(fork.repo_name,
144 144 h.route_path('repo_summary', repo_name=fork.repo_name))
145 145
146 146 forks_data = []
147 147 for fork in fork_list:
148 148 forks_data.append({
149 149 "username": h.gravatar_with_user(self.request, fork.user.username),
150 150 "fork_name": fork_name(fork),
151 151 "description": fork.description_safe,
152 152 "fork_date": h.age_component(fork.created_on, time_is_local=True),
153 153 "last_activity": h.format_date(fork.updated_on),
154 154 "action": fork_actions(fork),
155 155 })
156 156
157 157 data = ({
158 158 'draw': draw,
159 159 'data': forks_data,
160 160 'recordsTotal': forks_data_total_count,
161 161 'recordsFiltered': forks_data_total_filtered_count,
162 162 })
163 163
164 164 return data
165 165
166 166 @LoginRequired()
167 167 @NotAnonymous()
168 @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
168 @HasPermissionAnyDecorator('hg.admin', PermissionModel.FORKING_ENABLED)
169 169 @HasRepoPermissionAnyDecorator(
170 170 'repository.read', 'repository.write', 'repository.admin')
171 171 def repo_fork_new(self):
172 172 c = self.load_default_context()
173 173
174 174 defaults = RepoModel()._get_defaults(self.db_repo_name)
175 175 # alter the description to indicate a fork
176 176 defaults['description'] = (
177 177 'fork of repository: %s \n%s' % (
178 178 defaults['repo_name'], defaults['description']))
179 179 # add suffix to fork
180 180 defaults['repo_name'] = '%s-fork' % defaults['repo_name']
181 181
182 182 data = render('rhodecode:templates/forks/fork.mako',
183 183 self._get_template_context(c), self.request)
184 184 html = formencode.htmlfill.render(
185 185 data,
186 186 defaults=defaults,
187 187 encoding="UTF-8",
188 188 force_defaults=False
189 189 )
190 190 return Response(html)
191 191
192 192 @LoginRequired()
193 193 @NotAnonymous()
194 @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
194 @HasPermissionAnyDecorator('hg.admin', PermissionModel.FORKING_ENABLED)
195 195 @HasRepoPermissionAnyDecorator(
196 196 'repository.read', 'repository.write', 'repository.admin')
197 197 @CSRFRequired()
198 198 def repo_fork_create(self):
199 199 _ = self.request.translate
200 200 c = self.load_default_context()
201 201
202 202 _form = RepoForkForm(self.request.translate,
203 203 old_data={'repo_type': self.db_repo.repo_type},
204 204 repo_groups=c.repo_groups_choices)()
205 205 post_data = dict(self.request.POST)
206 206
207 207 # forbid injecting other repo by forging a request
208 208 post_data['fork_parent_id'] = self.db_repo.repo_id
209 209 post_data['landing_rev'] = self.db_repo._landing_revision
210 210
211 211 form_result = {}
212 212 task_id = None
213 213 try:
214 214 form_result = _form.to_python(post_data)
215 215 copy_permissions = form_result.get('copy_permissions')
216 216 # fork creation is sometimes done async via celery; db transaction
217 217 # management is handled there.
218 218 task = RepoModel().create_fork(
219 219 form_result, c.rhodecode_user.user_id)
220 220
221 221 task_id = get_task_id(task)
222 222 except formencode.Invalid as errors:
223 223 c.rhodecode_db_repo = self.db_repo
224 224
225 225 data = render('rhodecode:templates/forks/fork.mako',
226 226 self._get_template_context(c), self.request)
227 227 html = formencode.htmlfill.render(
228 228 data,
229 229 defaults=errors.value,
230 230 errors=errors.error_dict or {},
231 231 prefix_error=False,
232 232 encoding="UTF-8",
233 233 force_defaults=False
234 234 )
235 235 return Response(html)
236 236 except Exception:
237 237 log.exception(
238 238 u'Exception while trying to fork the repository %s', self.db_repo_name)
239 239 msg = _('An error occurred during repository forking %s') % (self.db_repo_name, )
240 240 h.flash(msg, category='error')
241 241 raise HTTPFound(h.route_path('home'))
242 242
243 243 repo_name = form_result.get('repo_name_full', self.db_repo_name)
244 244
245 245 affected_user_ids = [self._rhodecode_user.user_id]
246 246 if copy_permissions:
247 247 # permission flush is done during repo creation
248 248 pass
249 249
250 250 PermissionModel().trigger_permission_flush(affected_user_ids)
251 251
252 252 raise HTTPFound(
253 253 h.route_path('repo_creating', repo_name=repo_name,
254 254 _query=dict(task_id=task_id)))
1 NO CONTENT: modified file
The requested commit or file is too big and content was truncated. Show full diff
General Comments 1
Under Review
author

Auto status change to "Under Review"
