release: Merge default into stable for release preparation
milka
r4671:d6153155 merge stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,76 b''
|RCE| 4.25.0 |RNS|
------------------

Release Date
^^^^^^^^^^^^

- 2021-04-02


New Features
^^^^^^^^^^^^

- SSH: allow clone by ID via SSH operations.
- Artifacts: added an admin panel to manage artifacts.
- Redmine: added an option to add a note to a ticket without changing its status in the Redmine integration.


General
^^^^^^^

- Git: changed lookup logic. Reference names are prioritized over numerical ids,
  with numerical ids supported as a fallback if ref matching is unsuccessful.
- Permissions: changed the fork permission help text to reflect how it actually works.
- Permissions: flush permissions on owner changes for repos and repo groups. This
  fixes problems where, after a repository's owner changed, the new owner lacked
  permissions until the cache expired.
- Artifacts: added an API function to remove artifacts.
- Archives: use a special name for non-hashed archives to fix caching issues.
- Packaging: fixed a few package requirements for proper builds.
- Packaging: fixed rhodecode-tools for docker builds.
- Packaging: fixed problems after the latest setuptools-scm release.
- Packaging: added setuptools-scm to the build packages.
- Packaging: fixed the jira package for reproducible builds.
- Packaging: fixed zipp package patches.


Security
^^^^^^^^

- Comments: forbid removal of comments by anyone except the owners.
  Previously, repository admins could remove them by constructing a special URL with data.
- Pull requests: fixed some XSS problems when a deleted file with special characters was commented on.


Performance
^^^^^^^^^^^

- License: skip the channelstream connect in license-check logic to reduce call handling times.
- Core: optimized some calls to skip license/SCM detection. Each license check is
  expensive and not needed on every call.


Fixes
^^^^^

- Branch-permissions: fixed the CE view. Fixes #5656.
- Feed: fixed errors on feed access of empty repositories.
- Archives: if an implicit ref name (e.g. master) was used to obtain an archive, we now
  redirect to the explicit commit sha so reference names get proper caching.
- rcextensions: fixed pre-files extractor return code support.
- Svn: fixed subprocess problems in some of the file-checking calls.
- Pull requests: fixed multiple repetitions of referenced tickets in the pull request summary sidebar.
- Maintenance: fixed a bad route definition.
- clone-uri: fixed key-mismatch problems that caused errors on the summary page.
- Largefiles: added a fix for downloading largefiles that had no file extension.
- Compare: fixed a referenced-commits bug.
- Git: fixed unicode branches.


Upgrade notes
^^^^^^^^^^^^^

- Scheduled release 4.25.0.

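The archive fix in the notes above can be sketched as follows. This is a hypothetical illustration, not RhodeCode's actual code: an implicit ref name such as ``master`` is resolved to an explicit commit sha, and the request is redirected to a sha-based URL so caches key on an immutable id. The ref table and URL layout here are assumptions made for the example.

```python
# Illustrative only: map of ref names to commit shas (values are made up).
REFS = {
    "master": "d6153155",
    "stable": "a1b2c3d4",
}


def archive_redirect_url(repo_name, ref, fmt="zip"):
    """Return the explicit, cacheable archive URL for a ref name or sha.

    If `ref` is a known reference name, redirect to its commit sha;
    otherwise treat `ref` as an explicit sha already.
    """
    sha = REFS.get(ref, ref)  # implicit ref name -> explicit commit sha
    return "/{0}/archive/{1}.{2}".format(repo_name, sha, fmt)
```

With this scheme, repeated requests for ``master`` keep hitting the same sha-keyed cache entry until the branch moves.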
@@ -0,0 +1,13 b''
diff -rup channelstream-0.6.14-orig/setup.py channelstream-0.6.14/setup.py

--- channelstream-0.6.14/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
+++ channelstream-0.6.14/setup.py 2021-03-11 12:34:56.000000000 +0100
@@ -52,7 +52,7 @@ setup(
 include_package_data=True,
 install_requires=requires,
 python_requires=">=2.7",
- setup_requires=["pytest-runner"],
+ setup_requires=["pytest-runner==5.1.0"],
 extras_require={
 "dev": ["coverage", "pytest", "pyramid", "tox", "mock", "webtest"],
 "lint": ["black"],
@@ -0,0 +1,10 b''
diff -rup configparser-4.0.2-orig/pyproject.toml configparser-4.0.2/pyproject.toml
--- configparser-4.0.2-orig/pyproject.toml 2021-03-22 21:28:11.000000000 +0100
+++ configparser-4.0.2/pyproject.toml 2021-03-22 21:28:11.000000000 +0100
@@ -1,5 +1,5 @@
 [build-system]
-requires = ["setuptools>=40.7", "wheel", "setuptools_scm>=1.15"]
+requires = ["setuptools<=42.0", "wheel", "setuptools_scm<6.0.0"]
 build-backend = "setuptools.build_meta"

 [tool.black]
@@ -0,0 +1,7 b''
diff -rup importlib-metadata-1.6.0-orig/yproject.toml importlib-metadata-1.6.0/pyproject.toml
--- importlib-metadata-1.6.0-orig/yproject.toml 2021-03-22 22:10:33.000000000 +0100
+++ importlib-metadata-1.6.0/pyproject.toml 2021-03-22 22:11:09.000000000 +0100
@@ -1,3 +1,3 @@
 [build-system]
-requires = ["setuptools>=30.3", "wheel", "setuptools_scm"]
+requires = ["setuptools<42.0", "wheel", "setuptools_scm<6.0.0"]
@@ -0,0 +1,12 b''
diff -rup pyramid-apispec-0.3.2-orig/setup.py pyramid-apispec-0.3.2/setup.py
--- pyramid-apispec-0.3.2-orig/setup.py 2021-03-11 11:19:26.000000000 +0100
+++ pyramid-apispec-0.3.2/setup.py 2021-03-11 11:19:51.000000000 +0100
@@ -44,7 +44,7 @@ setup(
 packages=find_packages(exclude=["contrib", "docs", "tests"]),
 package_data={"pyramid_apispec": ["static/*.*"], "": ["LICENSE"]},
 install_requires=["apispec[yaml]==1.0.0"],
- setup_requires=["pytest-runner"],
+ setup_requires=["pytest-runner==5.1"],
 extras_require={
 "dev": ["coverage", "pytest", "pyramid", "tox", "webtest"],
 "demo": ["marshmallow==2.15.3", "pyramid", "apispec", "webtest"],
\ No newline at end of file
@@ -0,0 +1,12 b''
diff -rup rhodecode-tools-1.4.0-orig/setup.py rhodecode-tools-1.4.0/setup.py
--- rhodecode-tools-1.4.0/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
+++ rhodecode-tools-1.4.0/setup.py 2021-03-11 12:34:56.000000000 +0100
@@ -69,7 +69,7 @@ def _get_requirements(req_filename, excl


 # requirements extract
-setup_requirements = ['pytest-runner']
+setup_requirements = ['pytest-runner==5.1.0']
 install_requirements = _get_requirements(
 'requirements.txt', exclude=['setuptools'])
 test_requirements = _get_requirements('requirements_test.txt')
\ No newline at end of file
@@ -0,0 +1,10 b''
diff -rup zip-1.2.0-orig/pyproject.toml zip-1.2.0/pyproject.toml
--- zip-1.2.0-orig/pyproject.toml 2021-03-23 10:55:37.000000000 +0100
+++ zip-1.2.0/pyproject.toml 2021-03-23 10:56:05.000000000 +0100
@@ -1,5 +1,5 @@
 [build-system]
-requires = ["setuptools>=34.4", "wheel", "setuptools_scm>=1.15"]
+requires = ["setuptools<42.0", "wheel", "setuptools_scm<6.0.0"]
 build-backend = "setuptools.build_meta"

 [tool.black]
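The patches above all pin build-time requirements (``pytest-runner``, ``setuptools``, ``setuptools_scm``) with upper bounds, since newer releases of those tools dropped Python 2.7 support and broke reproducible builds. The effect of an exclusive upper-bound pin like ``setuptools_scm<6.0.0`` can be sketched with a toy checker (illustrative only, not how pip actually parses specifiers):

```python
def version_tuple(version):
    """Turn a dotted version string like '5.0.2' into (5, 0, 2)."""
    return tuple(int(part) for part in version.split('.'))


def satisfies_upper_bound(installed, bound):
    """Check an exclusive upper bound, like the '<6.0.0' pin in the patches."""
    return version_tuple(installed) < version_tuple(bound)


# setuptools_scm 5.x is allowed by the pin; 6.x is rejected.
print(satisfies_upper_bound('5.0.2', '6.0.0'))  # True
print(satisfies_upper_bound('6.0.1', '6.0.0'))  # False
```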
@@ -0,0 +1,74 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import mock
import pytest

from rhodecode.lib.utils2 import str2bool
from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
from rhodecode.model.db import Repository, UserRepoToPerm, Permission, User
from rhodecode.model.meta import Session
from rhodecode.tests import (
    TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN, assert_session_flash)
from rhodecode.tests.fixture import Fixture

fixture = Fixture()


def route_path(name, params=None, **kwargs):
    import urllib

    base_url = {
        'edit_repo_maintenance': '/{repo_name}/settings/maintenance',
        'edit_repo_maintenance_execute': '/{repo_name}/settings/maintenance/execute',

    }[name].format(**kwargs)

    if params:
        base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
    return base_url


def _get_permission_for_user(user, repo):
    perm = UserRepoToPerm.query()\
        .filter(UserRepoToPerm.repository ==
                Repository.get_by_repo_name(repo))\
        .filter(UserRepoToPerm.user == User.get_by_username(user))\
        .all()
    return perm


@pytest.mark.usefixtures('autologin_user', 'app')
class TestAdminRepoMaintenance(object):
    @pytest.mark.parametrize('urlname', [
        'edit_repo_maintenance',
    ])
    def test_show_page(self, urlname, app, backend):
        app.get(route_path(urlname, repo_name=backend.repo_name), status=200)

    def test_execute_maintenance_for_repo_hg(self, app, backend_hg, autologin_user, xhr_header):
        repo_name = backend_hg.repo_name

        response = app.get(
            route_path('edit_repo_maintenance_execute',
                       repo_name=repo_name,),
            extra_environ=xhr_header)

        assert "HG Verify repo" in ''.join(response.json)
@@ -1,6 +1,5 b''
 [bumpversion]
-current_version = 4.24.1
+current_version = 4.25.0
 message = release: Bump version {current_version} to {new_version}
 
 [bumpversion:file:rhodecode/VERSION]
-
@@ -1,33 +1,28 b''
 [DEFAULT]
 done = false
 
 [task:bump_version]
 done = true
 
 [task:rc_tools_pinned]
-done = true
 
 [task:fixes_on_stable]
-done = true
 
 [task:pip2nix_generated]
-done = true
 
 [task:changelog_updated]
-done = true
 
 [task:generate_api_docs]
-done = true
 
+[task:updated_translation]
+
 [release]
-state = prepared
-version = 4.24.1
+state = in_progress
+version = 4.25.0
 
-[task:updated_translation]
-
 [task:generate_js_routes]
 
 [task:updated_trial_license]
 
 [task:generate_oss_licenses]
 
@@ -1,184 +1,202 b''
.. _store-methods-ref:

store methods
=============

file_store_add (EE only)
------------------------

.. py:function:: file_store_add(apiuser, filename, content, description=<Optional:''>)

   Upload API for the file_store

   Example usage from CLI::

       rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg"}"

   This command takes the following options:

   :param apiuser: This is filled automatically from the |authtoken|.
   :type apiuser: AuthUser
   :param filename: name of the file uploaded
   :type filename: str
   :param description: Optional description for added file
   :type description: str
   :param content: base64 encoded content of the uploaded file
   :type content: str

   Example output:

   .. code-block:: bash

      id : <id_given_in_input>
      result: {
          "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
          "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
          "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg"
      }
      error : null


file_store_add_with_acl (EE only)
---------------------------------

.. py:function:: file_store_add_with_acl(apiuser, filename, content, description=<Optional:''>, scope_user_id=<Optional:None>, scope_repo_id=<Optional:None>, scope_repo_group_id=<Optional:None>)

   Upload API for the file_store

   Example usage from CLI::

       rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg", "scope_repo_id":101}"

   This command takes the following options:

   :param apiuser: This is filled automatically from the |authtoken|.
   :type apiuser: AuthUser
   :param filename: name of the file uploaded
   :type filename: str
   :param description: Optional description for added file
   :type description: str
   :param content: base64 encoded content of the uploaded file
   :type content: str

   :param scope_user_id: Optionally bind this file to user.
       This will check ACL in such way only this user can access the file.
   :type scope_user_id: int
   :param scope_repo_id: Optionally bind this file to repository.
       This will check ACL in such way only user with proper access to such
       repository can access the file.
   :type scope_repo_id: int
   :param scope_repo_group_id: Optionally bind this file to repository group.
       This will check ACL in such way only user with proper access to such
       repository group can access the file.
   :type scope_repo_group_id: int

   Example output:

   .. code-block:: bash

      id : <id_given_in_input>
      result: {
          "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
          "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg",
          "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg"
      }
      error : null


file_store_get_info (EE only)
-----------------------------

.. py:function:: file_store_get_info(apiuser, store_fid)

   Get artifact data.

   Example output:

   .. code-block:: bash

      id : <id_given_in_input>
      result: {
          "artifact": {
              "access_path_fqn": "https://rhodecode.example.com/_file_store/download/0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
              "created_on": "2019-10-15T16:25:35.491",
              "description": "my upload",
              "downloaded_times": 1,
              "file_uid": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
              "filename": "example.jpg",
              "filename_org": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
              "hidden": false,
              "metadata": [
                  {
                      "artifact": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg",
                      "key": "yellow",
                      "section": "tags",
                      "value": "bar"
                  }
              ],
              "sha256": "818dff0f44574dfb6814d38e6bf3c60c5943d1d13653398ecddaedf2f6a5b04d",
              "size": 18599,
              "uploaded_by": {
                  "email": "admin@rhodecode.com",
                  "emails": [
                      "admin@rhodecode.com"
                  ],
                  "firstname": "Admin",
                  "lastname": "LastName",
                  "user_id": 2,
                  "username": "admin"
              }
          }
      }
      error : null


file_store_delete (EE only)
---------------------------

.. py:function:: file_store_delete(apiuser, store_fid)

   Delete an artifact based on the secret uuid.

   Example output:

   .. code-block:: bash

      id : <id_given_in_input>
      result: {
          "artifact" : {"uid": "some uid", "removed": true}
      }
      error : null


file_store_add_metadata (EE only)
---------------------------------

.. py:function:: file_store_add_metadata(apiuser, store_fid, section, key, value, value_type=<Optional:'unicode'>)

   Add metadata into artifact. The metadata consist of section, key, value. eg.
   section='tags', 'key'='tag_name', value='1'

   :param apiuser: This is filled automatically from the |authtoken|.
   :type apiuser: AuthUser

   :param store_fid: file uid, e.g 0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4
   :type store_fid: str

   :param section: Section name to add metadata
   :type section: str

   :param key: Key to add as metadata
   :type key: str

   :param value: Value to add as metadata
   :type value: str

   :param value_type: Optional type, default is 'unicode' other types are:
       int, list, bool, unicode, str

   :type value_type: str

   Example output:

   .. code-block:: bash

      id : <id_given_in_input>
      result: {
          "metadata": [
              {
                  "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4",
                  "key": "secret",
                  "section": "tags",
                  "value": "1"
              },
              {
                  "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4",
                  "key": "video",
                  "section": "tags",
                  "value": "1"
              }
          ]
      }
      error : null

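The store methods above are ordinary JSON-RPC calls, which the ``rhodecode-api`` CLI wraps. As a hedged sketch (the token is a placeholder, and the request shape follows the standard RhodeCode API convention of ``id``/``auth_token``/``method``/``args``), building the body for a ``file_store_add`` call might look like:

```python
import base64
import json


def build_file_store_add_payload(auth_token, filename, raw_bytes, description=''):
    """Build the JSON-RPC request body for a file_store_add call.

    `auth_token` is a placeholder; use a real API token from your instance.
    """
    return json.dumps({
        'id': 1,
        'auth_token': auth_token,
        'method': 'file_store_add',
        'args': {
            'filename': filename,
            'description': description,
            # content must be base64 encoded, as the docs above state
            'content': base64.b64encode(raw_bytes).decode('ascii'),
        },
    })


payload = build_file_store_add_payload('secret-token', 'image.jpg', b'...')
# The payload would then be POSTed to the instance's API endpoint.
```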
@@ -1,153 +1,154 b''
.. _rhodecode-release-notes-ref:

Release Notes
=============

|RCE| 4.x Versions
------------------

.. toctree::
   :maxdepth: 1

   release-notes-4.25.0.rst
   release-notes-4.24.1.rst
   release-notes-4.24.0.rst
   release-notes-4.23.2.rst
   release-notes-4.23.1.rst
   release-notes-4.23.0.rst
   release-notes-4.22.0.rst
   release-notes-4.21.0.rst
   release-notes-4.20.1.rst
   release-notes-4.20.0.rst
   release-notes-4.19.3.rst
   release-notes-4.19.2.rst
   release-notes-4.19.1.rst
   release-notes-4.19.0.rst
   release-notes-4.18.3.rst
   release-notes-4.18.2.rst
   release-notes-4.18.1.rst
   release-notes-4.18.0.rst
   release-notes-4.17.4.rst
   release-notes-4.17.3.rst
   release-notes-4.17.2.rst
   release-notes-4.17.1.rst
   release-notes-4.17.0.rst
   release-notes-4.16.2.rst
   release-notes-4.16.1.rst
   release-notes-4.16.0.rst
   release-notes-4.15.2.rst
   release-notes-4.15.1.rst
   release-notes-4.15.0.rst
   release-notes-4.14.1.rst
   release-notes-4.14.0.rst
   release-notes-4.13.3.rst
   release-notes-4.13.2.rst
   release-notes-4.13.1.rst
   release-notes-4.13.0.rst
   release-notes-4.12.4.rst
   release-notes-4.12.3.rst
   release-notes-4.12.2.rst
   release-notes-4.12.1.rst
   release-notes-4.12.0.rst
   release-notes-4.11.6.rst
   release-notes-4.11.5.rst
   release-notes-4.11.4.rst
   release-notes-4.11.3.rst
   release-notes-4.11.2.rst
   release-notes-4.11.1.rst
   release-notes-4.11.0.rst
   release-notes-4.10.6.rst
   release-notes-4.10.5.rst
   release-notes-4.10.4.rst
   release-notes-4.10.3.rst
   release-notes-4.10.2.rst
   release-notes-4.10.1.rst
   release-notes-4.10.0.rst
   release-notes-4.9.1.rst
   release-notes-4.9.0.rst
   release-notes-4.8.0.rst
   release-notes-4.7.2.rst
   release-notes-4.7.1.rst
   release-notes-4.7.0.rst
   release-notes-4.6.1.rst
   release-notes-4.6.0.rst
   release-notes-4.5.2.rst
   release-notes-4.5.1.rst
   release-notes-4.5.0.rst
   release-notes-4.4.2.rst
   release-notes-4.4.1.rst
   release-notes-4.4.0.rst
   release-notes-4.3.1.rst
   release-notes-4.3.0.rst
   release-notes-4.2.1.rst
   release-notes-4.2.0.rst
   release-notes-4.1.2.rst
   release-notes-4.1.1.rst
   release-notes-4.1.0.rst
   release-notes-4.0.1.rst
   release-notes-4.0.0.rst

|RCE| 3.x Versions
------------------

.. toctree::
   :maxdepth: 1

   release-notes-3.8.4.rst
   release-notes-3.8.3.rst
   release-notes-3.8.2.rst
   release-notes-3.8.1.rst
   release-notes-3.8.0.rst
   release-notes-3.7.1.rst
   release-notes-3.7.0.rst
   release-notes-3.6.1.rst
   release-notes-3.6.0.rst
   release-notes-3.5.2.rst
   release-notes-3.5.1.rst
   release-notes-3.5.0.rst
   release-notes-3.4.1.rst
   release-notes-3.4.0.rst
   release-notes-3.3.4.rst
   release-notes-3.3.3.rst
   release-notes-3.3.2.rst
   release-notes-3.3.1.rst
   release-notes-3.3.0.rst
   release-notes-3.2.3.rst
   release-notes-3.2.2.rst
   release-notes-3.2.1.rst
   release-notes-3.2.0.rst
   release-notes-3.1.1.rst
   release-notes-3.1.0.rst
   release-notes-3.0.2.rst
   release-notes-3.0.1.rst
   release-notes-3.0.0.rst

|RCE| 2.x Versions
------------------

.. toctree::
   :maxdepth: 1

   release-notes-2.2.8.rst
   release-notes-2.2.7.rst
   release-notes-2.2.6.rst
   release-notes-2.2.5.rst
   release-notes-2.2.4.rst
   release-notes-2.2.3.rst
   release-notes-2.2.2.rst
   release-notes-2.2.1.rst
   release-notes-2.2.0.rst
   release-notes-2.1.0.rst
   release-notes-2.0.2.rst
   release-notes-2.0.1.rst
   release-notes-2.0.0.rst

|RCE| 1.x Versions
------------------

.. toctree::
   :maxdepth: 1

   release-notes-1.7.2.rst
   release-notes-1.7.1.rst
   release-notes-1.7.0.rst
   release-notes-1.6.0.rst
@@ -1,12 +1,12 b''
 diff -rup pytest-4.6.5-orig/setup.py pytest-4.6.5/setup.py
 --- pytest-4.6.5-orig/setup.py 2018-04-10 10:23:04.000000000 +0200
 +++ pytest-4.6.5/setup.py 2018-04-10 10:23:34.000000000 +0200
 @@ -24,7 +24,7 @@ INSTALL_REQUIRES = [
 def main():
 setup(
 use_scm_version={"write_to": "src/_pytest/_version.py"},
 - setup_requires=["setuptools-scm", "setuptools>=40.0"],
-+ setup_requires=["setuptools-scm", "setuptools<=42.0"],
++ setup_requires=["setuptools-scm<6.0.0", "setuptools<=42.0"],
 package_dir={"": "src"},
 # fmt: off
-extras_require={
\ No newline at end of file
+extras_require={
@@ -1,287 +1,353 b''
1 # Overrides for the generated python-packages.nix
1 # Overrides for the generated python-packages.nix
2 #
2 #
3 # This function is intended to be used as an extension to the generated file
3 # This function is intended to be used as an extension to the generated file
4 # python-packages.nix. The main objective is to add needed dependencies of C
4 # python-packages.nix. The main objective is to add needed dependencies of C
5 # libraries and tweak the build instructions where needed.
5 # libraries and tweak the build instructions where needed.
6
6
7 { pkgs
7 { pkgs
8 , basePythonPackages
8 , basePythonPackages
9 }:
9 }:
10
10
11 let
11 let
12 sed = "sed -i";
12 sed = "sed -i";
13
13
14 localLicenses = {
14 localLicenses = {
15 repoze = {
15 repoze = {
16 fullName = "Repoze License";
16 fullName = "Repoze License";
17 url = http://www.repoze.org/LICENSE.txt;
17 url = http://www.repoze.org/LICENSE.txt;
18 };
18 };
19 };
19 };
20
20
21 in
21 in
22
22
self: super: {

  "appenlight-client" = super."appenlight-client".override (attrs: {
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  });

  "beaker" = super."beaker".override (attrs: {
    patches = [
      ./patches/beaker/patch-beaker-lock-func-debug.diff
      ./patches/beaker/patch-beaker-metadata-reuse.diff
      ./patches/beaker/patch-beaker-improved-redis.diff
      ./patches/beaker/patch-beaker-improved-redis-2.diff
    ];
  });

  "cffi" = super."cffi".override (attrs: {
    buildInputs = [
      pkgs.libffi
    ];
  });

  "cryptography" = super."cryptography".override (attrs: {
    buildInputs = [
      pkgs.openssl
    ];
  });

  "gevent" = super."gevent".override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      # NOTE: (marcink) odd requirements from gevent aren't set properly,
      # thus we need to inject psutil manually
      self."psutil"
    ];
  });

  "future" = super."future".override (attrs: {
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  });

  "testpath" = super."testpath".override (attrs: {
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  });

  "gnureadline" = super."gnureadline".override (attrs: {
    buildInputs = [
      pkgs.ncurses
    ];
    patchPhase = ''
      substituteInPlace setup.py --replace "/bin/bash" "${pkgs.bash}/bin/bash"
    '';
  });

  "gunicorn" = super."gunicorn".override (attrs: {
    propagatedBuildInputs = [
      # johbo: futures is needed as long as we are on Python 2, otherwise
      # gunicorn explodes if used with multiple threads per worker.
      self."futures"
    ];
  });

  "nbconvert" = super."nbconvert".override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      # marcink: plug in jupyter-client for notebook rendering
      self."jupyter-client"
    ];
  });

  "ipython" = super."ipython".override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      self."gnureadline"
    ];
  });

  "lxml" = super."lxml".override (attrs: {
    buildInputs = [
      pkgs.libxml2
      pkgs.libxslt
    ];
    propagatedBuildInputs = [
      # Needed, so that "setup.py bdist_wheel" does work
      self."wheel"
    ];
  });

  "mysql-python" = super."mysql-python".override (attrs: {
    buildInputs = [
      pkgs.openssl
    ];
    propagatedBuildInputs = [
      pkgs.libmysql
      pkgs.zlib
    ];
  });

  "psycopg2" = super."psycopg2".override (attrs: {
    propagatedBuildInputs = [
      pkgs.postgresql
    ];
    meta = {
      license = pkgs.lib.licenses.lgpl3Plus;
    };
  });

  "pycurl" = super."pycurl".override (attrs: {
    propagatedBuildInputs = [
      pkgs.curl
      pkgs.openssl
    ];

    preConfigure = ''
      substituteInPlace setup.py --replace '--static-libs' '--libs'
      export PYCURL_SSL_LIBRARY=openssl
    '';

    meta = {
      license = pkgs.lib.licenses.mit;
    };
  });

  "pyramid" = super."pyramid".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "pyramid-debugtoolbar" = super."pyramid-debugtoolbar".override (attrs: {
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal localLicenses.repoze ];
    };
  });

  "pysqlite" = super."pysqlite".override (attrs: {
    propagatedBuildInputs = [
      pkgs.sqlite
    ];
    meta = {
      license = [ pkgs.lib.licenses.zlib pkgs.lib.licenses.libpng ];
    };
  });

  "python-ldap" = super."python-ldap".override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      pkgs.openldap
      pkgs.cyrus_sasl
      pkgs.openssl
    ];
  });

  "python-pam" = super."python-pam".override (attrs: {
    propagatedBuildInputs = [
      pkgs.pam
    ];

    # TODO: johbo: Check if this can be avoided, or transform into
    # a real patch
    patchPhase = ''
      substituteInPlace pam.py \
        --replace 'find_library("pam")' '"${pkgs.pam}/lib/libpam.so.0"'
    '';

  });

  "python-saml" = super."python-saml".override (attrs: {
    buildInputs = [
      pkgs.libxml2
      pkgs.libxslt
    ];
  });

  "dm.xmlsec.binding" = super."dm.xmlsec.binding".override (attrs: {
    buildInputs = [
      pkgs.libxml2
      pkgs.libxslt
      pkgs.xmlsec
      pkgs.libtool
    ];
  });

  "pyzmq" = super."pyzmq".override (attrs: {
    buildInputs = [
      pkgs.czmq
    ];
  });

  "urlobject" = super."urlobject".override (attrs: {
    meta = {
      license = {
        spdxId = "Unlicense";
        fullName = "The Unlicense";
        url = http://unlicense.org/;
      };
    };
  });

  "docutils" = super."docutils".override (attrs: {
    meta = {
      license = pkgs.lib.licenses.bsd2;
    };
  });

  "colander" = super."colander".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "pyramid-beaker" = super."pyramid-beaker".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "pyramid-mako" = super."pyramid-mako".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "repoze.lru" = super."repoze.lru".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "python-editor" = super."python-editor".override (attrs: {
    meta = {
      license = pkgs.lib.licenses.asl20;
    };
  });

  "translationstring" = super."translationstring".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "venusian" = super."venusian".override (attrs: {
    meta = {
      license = localLicenses.repoze;
    };
  });

  "supervisor" = super."supervisor".override (attrs: {
    patches = [
      ./patches/supervisor/patch-rlimits-old-kernel.diff
    ];
  });

  "pytest" = super."pytest".override (attrs: {
    patches = [
      ./patches/pytest/setuptools.patch
    ];
  });

  "pytest-runner" = super."pytest-runner".override (attrs: {
    propagatedBuildInputs = [
      self."setuptools-scm"
    ];
  });

  "py" = super."py".override (attrs: {
    propagatedBuildInputs = [
      self."setuptools-scm"
    ];
  });

  "python-dateutil" = super."python-dateutil".override (attrs: {
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      self."setuptools-scm"
    ];
  });

  "configparser" = super."configparser".override (attrs: {
    patches = [
      ./patches/configparser/pyproject.patch
    ];
    propagatedBuildInputs = [
      self."setuptools-scm"
    ];
  });

  "importlib-metadata" = super."importlib-metadata".override (attrs: {

    patches = [
      ./patches/importlib_metadata/pyproject.patch
    ];

    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      self."setuptools-scm"
    ];

  });

  "zipp" = super."zipp".override (attrs: {
    patches = [
      ./patches/zipp/pyproject.patch
    ];
    propagatedBuildInputs = attrs.propagatedBuildInputs ++ [
      self."setuptools-scm"
    ];
  });

  "pyramid-apispec" = super."pyramid-apispec".override (attrs: {
    patches = [
      ./patches/pyramid_apispec/setuptools.patch
    ];
  });

  "channelstream" = super."channelstream".override (attrs: {
    patches = [
      ./patches/channelstream/setuptools.patch
    ];
  });

  "rhodecode-tools" = super."rhodecode-tools".override (attrs: {
    patches = [
      ./patches/rhodecode_tools/setuptools.patch
    ];
  });

  # Avoid that base packages screw up the build process
  inherit (basePythonPackages)
    setuptools;

}
@@ -1,2509 +1,2520 b''
# Generated by pip2nix 0.8.0.dev1
# See https://github.com/johbo/pip2nix

{ pkgs, fetchurl, fetchgit, fetchhg }:

self: super: {
  "alembic" = super.buildPythonPackage {
    name = "alembic-1.4.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."sqlalchemy"
      self."mako"
      self."python-editor"
      self."python-dateutil"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
      sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "amqp" = super.buildPythonPackage {
    name = "amqp-2.5.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."vine"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
      sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "apispec" = super.buildPythonPackage {
    name = "apispec-1.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."PyYAML"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
      sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "appenlight-client" = super.buildPythonPackage {
    name = "appenlight-client-0.6.26";
    doCheck = false;
    propagatedBuildInputs = [
      self."webob"
      self."requests"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
      sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "asn1crypto" = super.buildPythonPackage {
    name = "asn1crypto-0.24.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
      sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "atomicwrites" = super.buildPythonPackage {
    name = "atomicwrites-1.3.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
      sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "attrs" = super.buildPythonPackage {
    name = "attrs-19.3.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
      sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "babel" = super.buildPythonPackage {
    name = "babel-1.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytz"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
      sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "backports.shutil-get-terminal-size" = super.buildPythonPackage {
    name = "backports.shutil-get-terminal-size-1.0.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
      sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "beaker" = super.buildPythonPackage {
    name = "beaker-1.9.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."funcsigs"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
      sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "beautifulsoup4" = super.buildPythonPackage {
    name = "beautifulsoup4-4.6.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
      sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "billiard" = super.buildPythonPackage {
    name = "billiard-3.6.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
      sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "bleach" = super.buildPythonPackage {
    name = "bleach-3.1.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."webencodings"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
      sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "bumpversion" = super.buildPythonPackage {
    name = "bumpversion-0.5.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
      sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "cachetools" = super.buildPythonPackage {
    name = "cachetools-3.1.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
      sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "celery" = super.buildPythonPackage {
    name = "celery-4.3.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytz"
      self."billiard"
      self."kombu"
      self."vine"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
      sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "certifi" = super.buildPythonPackage {
    name = "certifi-2020.4.5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
      sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
    };
    meta = {
      license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
    };
  };
  "cffi" = super.buildPythonPackage {
    name = "cffi-1.12.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pycparser"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
      sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "chameleon" = super.buildPythonPackage {
    name = "chameleon-2.24";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
      sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
    };
    meta = {
      license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
    };
  };
  "channelstream" = super.buildPythonPackage {
    name = "channelstream-0.6.14";
    doCheck = false;
    propagatedBuildInputs = [
      self."gevent"
      self."ws4py"
      self."marshmallow"
      self."python-dateutil"
      self."pyramid"
      self."pyramid-jinja2"
      self."pyramid-apispec"
      self."itsdangerous"
      self."requests"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
      sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "chardet" = super.buildPythonPackage {
    name = "chardet-3.0.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
      sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
    };
    meta = {
      license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
    };
  };
  "click" = super.buildPythonPackage {
    name = "click-7.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
      sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "colander" = super.buildPythonPackage {
    name = "colander-1.7.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."translationstring"
      self."iso8601"
      self."enum34"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
      sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 };
311 };
312 };
312 };
313 "configobj" = super.buildPythonPackage {
313 "configobj" = super.buildPythonPackage {
314 name = "configobj-5.0.6";
314 name = "configobj-5.0.6";
315 doCheck = false;
315 doCheck = false;
316 propagatedBuildInputs = [
316 propagatedBuildInputs = [
317 self."six"
317 self."six"
318 ];
318 ];
319 src = fetchurl {
319 src = fetchurl {
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 };
322 };
323 meta = {
323 meta = {
324 license = [ pkgs.lib.licenses.bsdOriginal ];
324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 };
325 };
326 };
326 };
327 "configparser" = super.buildPythonPackage {
327 "configparser" = super.buildPythonPackage {
328 name = "configparser-4.0.2";
328 name = "configparser-4.0.2";
329 doCheck = false;
329 doCheck = false;
330 src = fetchurl {
330 src = fetchurl {
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 };
333 };
334 meta = {
334 meta = {
335 license = [ pkgs.lib.licenses.mit ];
335 license = [ pkgs.lib.licenses.mit ];
336 };
336 };
337 };
337 };
338 "contextlib2" = super.buildPythonPackage {
338 "contextlib2" = super.buildPythonPackage {
339 name = "contextlib2-0.6.0.post1";
339 name = "contextlib2-0.6.0.post1";
340 doCheck = false;
340 doCheck = false;
341 src = fetchurl {
341 src = fetchurl {
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 };
344 };
345 meta = {
345 meta = {
346 license = [ pkgs.lib.licenses.psfl ];
346 license = [ pkgs.lib.licenses.psfl ];
347 };
347 };
348 };
348 };
349 "cov-core" = super.buildPythonPackage {
349 "cov-core" = super.buildPythonPackage {
350 name = "cov-core-1.15.0";
350 name = "cov-core-1.15.0";
351 doCheck = false;
351 doCheck = false;
352 propagatedBuildInputs = [
352 propagatedBuildInputs = [
353 self."coverage"
353 self."coverage"
354 ];
354 ];
355 src = fetchurl {
355 src = fetchurl {
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 };
358 };
359 meta = {
359 meta = {
360 license = [ pkgs.lib.licenses.mit ];
360 license = [ pkgs.lib.licenses.mit ];
361 };
361 };
362 };
362 };
363 "coverage" = super.buildPythonPackage {
363 "coverage" = super.buildPythonPackage {
364 name = "coverage-4.5.4";
364 name = "coverage-4.5.4";
365 doCheck = false;
365 doCheck = false;
366 src = fetchurl {
366 src = fetchurl {
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 };
369 };
370 meta = {
370 meta = {
371 license = [ pkgs.lib.licenses.asl20 ];
371 license = [ pkgs.lib.licenses.asl20 ];
372 };
372 };
373 };
373 };
374 "cryptography" = super.buildPythonPackage {
374 "cryptography" = super.buildPythonPackage {
375 name = "cryptography-2.6.1";
375 name = "cryptography-2.6.1";
376 doCheck = false;
376 doCheck = false;
377 propagatedBuildInputs = [
377 propagatedBuildInputs = [
378 self."asn1crypto"
378 self."asn1crypto"
379 self."six"
379 self."six"
380 self."cffi"
380 self."cffi"
381 self."enum34"
381 self."enum34"
382 self."ipaddress"
382 self."ipaddress"
383 ];
383 ];
384 src = fetchurl {
384 src = fetchurl {
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 };
387 };
388 meta = {
388 meta = {
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 };
390 };
391 };
391 };
392 "cssselect" = super.buildPythonPackage {
392 "cssselect" = super.buildPythonPackage {
393 name = "cssselect-1.0.3";
393 name = "cssselect-1.0.3";
394 doCheck = false;
394 doCheck = false;
395 src = fetchurl {
395 src = fetchurl {
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 };
398 };
399 meta = {
399 meta = {
400 license = [ pkgs.lib.licenses.bsdOriginal ];
400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 };
401 };
402 };
402 };
403 "cssutils" = super.buildPythonPackage {
403 "cssutils" = super.buildPythonPackage {
404 name = "cssutils-1.0.2";
404 name = "cssutils-1.0.2";
405 doCheck = false;
405 doCheck = false;
406 src = fetchurl {
406 src = fetchurl {
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 };
409 };
410 meta = {
410 meta = {
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 };
412 };
413 };
413 };
414 "decorator" = super.buildPythonPackage {
414 "decorator" = super.buildPythonPackage {
415 name = "decorator-4.1.2";
415 name = "decorator-4.1.2";
416 doCheck = false;
416 doCheck = false;
417 src = fetchurl {
417 src = fetchurl {
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 };
420 };
421 meta = {
421 meta = {
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 };
423 };
424 };
424 };
425 "deform" = super.buildPythonPackage {
425 "deform" = super.buildPythonPackage {
426 name = "deform-2.0.8";
426 name = "deform-2.0.8";
427 doCheck = false;
427 doCheck = false;
428 propagatedBuildInputs = [
428 propagatedBuildInputs = [
429 self."chameleon"
429 self."chameleon"
430 self."colander"
430 self."colander"
431 self."iso8601"
431 self."iso8601"
432 self."peppercorn"
432 self."peppercorn"
433 self."translationstring"
433 self."translationstring"
434 self."zope.deprecation"
434 self."zope.deprecation"
435 ];
435 ];
436 src = fetchurl {
436 src = fetchurl {
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 };
439 };
440 meta = {
440 meta = {
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 };
442 };
443 };
443 };
444 "defusedxml" = super.buildPythonPackage {
444 "defusedxml" = super.buildPythonPackage {
445 name = "defusedxml-0.6.0";
445 name = "defusedxml-0.6.0";
446 doCheck = false;
446 doCheck = false;
447 src = fetchurl {
447 src = fetchurl {
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 };
450 };
451 meta = {
451 meta = {
452 license = [ pkgs.lib.licenses.psfl ];
452 license = [ pkgs.lib.licenses.psfl ];
453 };
453 };
454 };
454 };
455 "dm.xmlsec.binding" = super.buildPythonPackage {
455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 name = "dm.xmlsec.binding-1.3.7";
456 name = "dm.xmlsec.binding-1.3.7";
457 doCheck = false;
457 doCheck = false;
458 propagatedBuildInputs = [
458 propagatedBuildInputs = [
459 self."setuptools"
459 self."setuptools"
460 self."lxml"
460 self."lxml"
461 ];
461 ];
462 src = fetchurl {
462 src = fetchurl {
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 };
465 };
466 meta = {
466 meta = {
467 license = [ pkgs.lib.licenses.bsdOriginal ];
467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 };
468 };
469 };
469 };
470 "docutils" = super.buildPythonPackage {
470 "docutils" = super.buildPythonPackage {
471 name = "docutils-0.16";
471 name = "docutils-0.16";
472 doCheck = false;
472 doCheck = false;
473 src = fetchurl {
473 src = fetchurl {
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 };
476 };
477 meta = {
477 meta = {
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 };
479 };
480 };
480 };
481 "dogpile.cache" = super.buildPythonPackage {
481 "dogpile.cache" = super.buildPythonPackage {
482 name = "dogpile.cache-0.9.0";
482 name = "dogpile.cache-0.9.0";
483 doCheck = false;
483 doCheck = false;
484 propagatedBuildInputs = [
484 propagatedBuildInputs = [
485 self."decorator"
485 self."decorator"
486 ];
486 ];
487 src = fetchurl {
487 src = fetchurl {
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 };
490 };
491 meta = {
491 meta = {
492 license = [ pkgs.lib.licenses.bsdOriginal ];
492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 };
493 };
494 };
494 };
495 "dogpile.core" = super.buildPythonPackage {
495 "dogpile.core" = super.buildPythonPackage {
496 name = "dogpile.core-0.4.1";
496 name = "dogpile.core-0.4.1";
497 doCheck = false;
497 doCheck = false;
498 src = fetchurl {
498 src = fetchurl {
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 };
501 };
502 meta = {
502 meta = {
503 license = [ pkgs.lib.licenses.bsdOriginal ];
503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 };
504 };
505 };
505 };
506 "ecdsa" = super.buildPythonPackage {
506 "ecdsa" = super.buildPythonPackage {
507 name = "ecdsa-0.13.2";
507 name = "ecdsa-0.13.2";
508 doCheck = false;
508 doCheck = false;
509 src = fetchurl {
509 src = fetchurl {
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 };
512 };
513 meta = {
513 meta = {
514 license = [ pkgs.lib.licenses.mit ];
514 license = [ pkgs.lib.licenses.mit ];
515 };
515 };
516 };
516 };
517 "elasticsearch" = super.buildPythonPackage {
517 "elasticsearch" = super.buildPythonPackage {
518 name = "elasticsearch-6.3.1";
518 name = "elasticsearch-6.3.1";
519 doCheck = false;
519 doCheck = false;
520 propagatedBuildInputs = [
520 propagatedBuildInputs = [
521 self."urllib3"
521 self."urllib3"
522 ];
522 ];
523 src = fetchurl {
523 src = fetchurl {
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 };
526 };
527 meta = {
527 meta = {
528 license = [ pkgs.lib.licenses.asl20 ];
528 license = [ pkgs.lib.licenses.asl20 ];
529 };
529 };
530 };
530 };
531 "elasticsearch-dsl" = super.buildPythonPackage {
531 "elasticsearch-dsl" = super.buildPythonPackage {
532 name = "elasticsearch-dsl-6.3.1";
532 name = "elasticsearch-dsl-6.3.1";
533 doCheck = false;
533 doCheck = false;
534 propagatedBuildInputs = [
534 propagatedBuildInputs = [
535 self."six"
535 self."six"
536 self."python-dateutil"
536 self."python-dateutil"
537 self."elasticsearch"
537 self."elasticsearch"
538 self."ipaddress"
538 self."ipaddress"
539 ];
539 ];
540 src = fetchurl {
540 src = fetchurl {
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
543 };
543 };
544 meta = {
544 meta = {
545 license = [ pkgs.lib.licenses.asl20 ];
545 license = [ pkgs.lib.licenses.asl20 ];
546 };
546 };
547 };
547 };
548 "elasticsearch1" = super.buildPythonPackage {
548 "elasticsearch1" = super.buildPythonPackage {
549 name = "elasticsearch1-1.10.0";
549 name = "elasticsearch1-1.10.0";
550 doCheck = false;
550 doCheck = false;
551 propagatedBuildInputs = [
551 propagatedBuildInputs = [
552 self."urllib3"
552 self."urllib3"
553 ];
553 ];
554 src = fetchurl {
554 src = fetchurl {
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
557 };
557 };
558 meta = {
558 meta = {
559 license = [ pkgs.lib.licenses.asl20 ];
559 license = [ pkgs.lib.licenses.asl20 ];
560 };
560 };
561 };
561 };
562 "elasticsearch1-dsl" = super.buildPythonPackage {
562 "elasticsearch1-dsl" = super.buildPythonPackage {
563 name = "elasticsearch1-dsl-0.0.12";
563 name = "elasticsearch1-dsl-0.0.12";
564 doCheck = false;
564 doCheck = false;
565 propagatedBuildInputs = [
565 propagatedBuildInputs = [
566 self."six"
566 self."six"
567 self."python-dateutil"
567 self."python-dateutil"
568 self."elasticsearch1"
568 self."elasticsearch1"
569 ];
569 ];
570 src = fetchurl {
570 src = fetchurl {
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
573 };
573 };
574 meta = {
574 meta = {
575 license = [ pkgs.lib.licenses.asl20 ];
575 license = [ pkgs.lib.licenses.asl20 ];
576 };
576 };
577 };
577 };
578 "elasticsearch2" = super.buildPythonPackage {
578 "elasticsearch2" = super.buildPythonPackage {
579 name = "elasticsearch2-2.5.1";
579 name = "elasticsearch2-2.5.1";
580 doCheck = false;
580 doCheck = false;
581 propagatedBuildInputs = [
581 propagatedBuildInputs = [
582 self."urllib3"
582 self."urllib3"
583 ];
583 ];
584 src = fetchurl {
584 src = fetchurl {
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
587 };
587 };
588 meta = {
588 meta = {
589 license = [ pkgs.lib.licenses.asl20 ];
589 license = [ pkgs.lib.licenses.asl20 ];
590 };
590 };
591 };
591 };
592 "entrypoints" = super.buildPythonPackage {
592 "entrypoints" = super.buildPythonPackage {
593 name = "entrypoints-0.2.2";
593 name = "entrypoints-0.2.2";
594 doCheck = false;
594 doCheck = false;
595 propagatedBuildInputs = [
595 propagatedBuildInputs = [
596 self."configparser"
596 self."configparser"
597 ];
597 ];
598 src = fetchurl {
598 src = fetchurl {
599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
601 };
601 };
602 meta = {
602 meta = {
603 license = [ pkgs.lib.licenses.mit ];
603 license = [ pkgs.lib.licenses.mit ];
604 };
604 };
605 };
605 };
606 "enum34" = super.buildPythonPackage {
606 "enum34" = super.buildPythonPackage {
607 name = "enum34-1.1.10";
607 name = "enum34-1.1.10";
608 doCheck = false;
608 doCheck = false;
609 src = fetchurl {
609 src = fetchurl {
610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
612 };
612 };
613 meta = {
613 meta = {
614 license = [ pkgs.lib.licenses.bsdOriginal ];
614 license = [ pkgs.lib.licenses.bsdOriginal ];
615 };
615 };
616 };
616 };
617 "formencode" = super.buildPythonPackage {
617 "formencode" = super.buildPythonPackage {
618 name = "formencode-1.2.4";
618 name = "formencode-1.2.4";
619 doCheck = false;
619 doCheck = false;
620 src = fetchurl {
620 src = fetchurl {
621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
623 };
623 };
624 meta = {
624 meta = {
625 license = [ pkgs.lib.licenses.psfl ];
625 license = [ pkgs.lib.licenses.psfl ];
626 };
626 };
627 };
627 };
628 "funcsigs" = super.buildPythonPackage {
628 "funcsigs" = super.buildPythonPackage {
629 name = "funcsigs-1.0.2";
629 name = "funcsigs-1.0.2";
630 doCheck = false;
630 doCheck = false;
631 src = fetchurl {
631 src = fetchurl {
632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
634 };
634 };
635 meta = {
635 meta = {
636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
637 };
637 };
638 };
638 };
639 "functools32" = super.buildPythonPackage {
639 "functools32" = super.buildPythonPackage {
640 name = "functools32-3.2.3.post2";
640 name = "functools32-3.2.3.post2";
641 doCheck = false;
641 doCheck = false;
642 src = fetchurl {
642 src = fetchurl {
643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
645 };
645 };
646 meta = {
646 meta = {
647 license = [ pkgs.lib.licenses.psfl ];
647 license = [ pkgs.lib.licenses.psfl ];
648 };
648 };
649 };
649 };
650 "future" = super.buildPythonPackage {
650 "future" = super.buildPythonPackage {
651 name = "future-0.14.3";
651 name = "future-0.14.3";
652 doCheck = false;
652 doCheck = false;
653 src = fetchurl {
653 src = fetchurl {
654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
656 };
656 };
657 meta = {
657 meta = {
658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
659 };
659 };
660 };
660 };
661 "futures" = super.buildPythonPackage {
661 "futures" = super.buildPythonPackage {
662 name = "futures-3.0.2";
662 name = "futures-3.0.2";
663 doCheck = false;
663 doCheck = false;
664 src = fetchurl {
664 src = fetchurl {
665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
667 };
667 };
668 meta = {
668 meta = {
669 license = [ pkgs.lib.licenses.bsdOriginal ];
669 license = [ pkgs.lib.licenses.bsdOriginal ];
670 };
670 };
671 };
671 };
672 "gevent" = super.buildPythonPackage {
672 "gevent" = super.buildPythonPackage {
673 name = "gevent-1.5.0";
673 name = "gevent-1.5.0";
674 doCheck = false;
674 doCheck = false;
675 propagatedBuildInputs = [
675 propagatedBuildInputs = [
676 self."greenlet"
676 self."greenlet"
677 ];
677 ];
678 src = fetchurl {
678 src = fetchurl {
679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
681 };
681 };
682 meta = {
682 meta = {
683 license = [ pkgs.lib.licenses.mit ];
683 license = [ pkgs.lib.licenses.mit ];
684 };
684 };
685 };
685 };
686 "gnureadline" = super.buildPythonPackage {
686 "gnureadline" = super.buildPythonPackage {
687 name = "gnureadline-6.3.8";
687 name = "gnureadline-6.3.8";
688 doCheck = false;
688 doCheck = false;
689 src = fetchurl {
689 src = fetchurl {
690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
692 };
692 };
693 meta = {
693 meta = {
694 license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
      license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
    };
  };
  "gprof2dot" = super.buildPythonPackage {
    name = "gprof2dot-2017.9.19";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
      sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
    };
    meta = {
      license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
    };
  };
  "greenlet" = super.buildPythonPackage {
    name = "greenlet-0.4.15";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
      sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "gunicorn" = super.buildPythonPackage {
    name = "gunicorn-19.9.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
      sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "hupper" = super.buildPythonPackage {
    name = "hupper-1.10.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
      sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "idna" = super.buildPythonPackage {
    name = "idna-2.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
      sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
    };
  };
  "importlib-metadata" = super.buildPythonPackage {
    name = "importlib-metadata-1.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."zipp"
      self."pathlib2"
      self."contextlib2"
      self."configparser"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
      sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "infrae.cache" = super.buildPythonPackage {
    name = "infrae.cache-1.0.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."beaker"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
      sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "invoke" = super.buildPythonPackage {
    name = "invoke-0.13.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
      sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipaddress" = super.buildPythonPackage {
    name = "ipaddress-1.0.23";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
      sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "ipdb" = super.buildPythonPackage {
    name = "ipdb-0.13.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."ipython"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
      sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython" = super.buildPythonPackage {
    name = "ipython-5.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."decorator"
      self."pickleshare"
      self."simplegeneric"
      self."traitlets"
      self."prompt-toolkit"
      self."pygments"
      self."pexpect"
      self."backports.shutil-get-terminal-size"
      self."pathlib2"
      self."pexpect"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
      sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython-genutils" = super.buildPythonPackage {
    name = "ipython-genutils-0.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
      sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "iso8601" = super.buildPythonPackage {
    name = "iso8601-0.1.12";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
      sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "isodate" = super.buildPythonPackage {
    name = "isodate-0.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
      sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "itsdangerous" = super.buildPythonPackage {
    name = "itsdangerous-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
      sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jinja2" = super.buildPythonPackage {
    name = "jinja2-2.9.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
      sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jsonschema" = super.buildPythonPackage {
    name = "jsonschema-2.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."functools32"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
      sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "jupyter-client" = super.buildPythonPackage {
    name = "jupyter-client-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
      self."jupyter-core"
      self."pyzmq"
      self."python-dateutil"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
      sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jupyter-core" = super.buildPythonPackage {
    name = "jupyter-core-4.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
      sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "kombu" = super.buildPythonPackage {
    name = "kombu-4.6.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."amqp"
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
      sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "lxml" = super.buildPythonPackage {
    name = "lxml-4.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
      sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mako" = super.buildPythonPackage {
    name = "mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
      sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "markdown" = super.buildPythonPackage {
    name = "markdown-2.6.11";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
      sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "markupsafe" = super.buildPythonPackage {
    name = "markupsafe-1.1.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
      sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
    };
  };
  "marshmallow" = super.buildPythonPackage {
    name = "marshmallow-2.18.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
      sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "mistune" = super.buildPythonPackage {
    name = "mistune-0.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
      sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mock" = super.buildPythonPackage {
    name = "mock-3.0.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."funcsigs"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
      sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
    };
  };
  "more-itertools" = super.buildPythonPackage {
    name = "more-itertools-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
      sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "msgpack-python" = super.buildPythonPackage {
    name = "msgpack-python-0.5.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
      sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "mysql-python" = super.buildPythonPackage {
    name = "mysql-python-1.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
      sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  "nbconvert" = super.buildPythonPackage {
    name = "nbconvert-5.3.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."mistune"
      self."jinja2"
      self."pygments"
      self."traitlets"
      self."jupyter-core"
      self."nbformat"
      self."entrypoints"
      self."bleach"
      self."pandocfilters"
      self."testpath"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
      sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "nbformat" = super.buildPythonPackage {
    name = "nbformat-4.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."ipython-genutils"
      self."traitlets"
      self."jsonschema"
      self."jupyter-core"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
      sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "packaging" = super.buildPythonPackage {
    name = "packaging-20.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyparsing"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
      sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
    };
  };
  "pandocfilters" = super.buildPythonPackage {
    name = "pandocfilters-1.4.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
      sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "paste" = super.buildPythonPackage {
    name = "paste-3.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
      sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastedeploy" = super.buildPythonPackage {
    name = "pastedeploy-2.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
      sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastescript" = super.buildPythonPackage {
    name = "pastescript-3.2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."paste"
      self."pastedeploy"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
      sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pathlib2" = super.buildPythonPackage {
    name = "pathlib2-2.3.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."scandir"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
      sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "peppercorn" = super.buildPythonPackage {
    name = "peppercorn-0.6";
    doCheck = false;
1211 src = fetchurl {
1211 src = fetchurl {
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1214 };
1214 };
1215 meta = {
1215 meta = {
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1217 };
1217 };
1218 };
1218 };
1219 "pexpect" = super.buildPythonPackage {
1219 "pexpect" = super.buildPythonPackage {
1220 name = "pexpect-4.8.0";
1220 name = "pexpect-4.8.0";
1221 doCheck = false;
1221 doCheck = false;
1222 propagatedBuildInputs = [
1222 propagatedBuildInputs = [
1223 self."ptyprocess"
1223 self."ptyprocess"
1224 ];
1224 ];
1225 src = fetchurl {
1225 src = fetchurl {
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1228 };
1228 };
1229 meta = {
1229 meta = {
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1231 };
1231 };
1232 };
1232 };
1233 "pickleshare" = super.buildPythonPackage {
1233 "pickleshare" = super.buildPythonPackage {
1234 name = "pickleshare-0.7.5";
1234 name = "pickleshare-0.7.5";
1235 doCheck = false;
1235 doCheck = false;
1236 propagatedBuildInputs = [
1236 propagatedBuildInputs = [
1237 self."pathlib2"
1237 self."pathlib2"
1238 ];
1238 ];
1239 src = fetchurl {
1239 src = fetchurl {
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1242 };
1242 };
1243 meta = {
1243 meta = {
1244 license = [ pkgs.lib.licenses.mit ];
1244 license = [ pkgs.lib.licenses.mit ];
1245 };
1245 };
1246 };
1246 };
1247 "plaster" = super.buildPythonPackage {
1247 "plaster" = super.buildPythonPackage {
1248 name = "plaster-1.0";
1248 name = "plaster-1.0";
1249 doCheck = false;
1249 doCheck = false;
1250 propagatedBuildInputs = [
1250 propagatedBuildInputs = [
1251 self."setuptools"
1251 self."setuptools"
1252 ];
1252 ];
1253 src = fetchurl {
1253 src = fetchurl {
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1256 };
1256 };
1257 meta = {
1257 meta = {
1258 license = [ pkgs.lib.licenses.mit ];
1258 license = [ pkgs.lib.licenses.mit ];
1259 };
1259 };
1260 };
1260 };
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1262 name = "plaster-pastedeploy-0.7";
1262 name = "plaster-pastedeploy-0.7";
1263 doCheck = false;
1263 doCheck = false;
1264 propagatedBuildInputs = [
1264 propagatedBuildInputs = [
1265 self."pastedeploy"
1265 self."pastedeploy"
1266 self."plaster"
1266 self."plaster"
1267 ];
1267 ];
1268 src = fetchurl {
1268 src = fetchurl {
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1271 };
1271 };
1272 meta = {
1272 meta = {
1273 license = [ pkgs.lib.licenses.mit ];
1273 license = [ pkgs.lib.licenses.mit ];
1274 };
1274 };
1275 };
1275 };
1276 "pluggy" = super.buildPythonPackage {
1276 "pluggy" = super.buildPythonPackage {
1277 name = "pluggy-0.13.1";
1277 name = "pluggy-0.13.1";
1278 doCheck = false;
1278 doCheck = false;
1279 propagatedBuildInputs = [
1279 propagatedBuildInputs = [
1280 self."importlib-metadata"
1280 self."importlib-metadata"
1281 ];
1281 ];
1282 src = fetchurl {
1282 src = fetchurl {
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1285 };
1285 };
1286 meta = {
1286 meta = {
1287 license = [ pkgs.lib.licenses.mit ];
1287 license = [ pkgs.lib.licenses.mit ];
1288 };
1288 };
1289 };
1289 };
1290 "premailer" = super.buildPythonPackage {
1290 "premailer" = super.buildPythonPackage {
1291 name = "premailer-3.6.1";
1291 name = "premailer-3.6.1";
1292 doCheck = false;
1292 doCheck = false;
1293 propagatedBuildInputs = [
1293 propagatedBuildInputs = [
1294 self."lxml"
1294 self."lxml"
1295 self."cssselect"
1295 self."cssselect"
1296 self."cssutils"
1296 self."cssutils"
1297 self."requests"
1297 self."requests"
1298 self."cachetools"
1298 self."cachetools"
1299 ];
1299 ];
1300 src = fetchurl {
1300 src = fetchurl {
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1303 };
1303 };
1304 meta = {
1304 meta = {
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1306 };
1306 };
1307 };
1307 };
1308 "prompt-toolkit" = super.buildPythonPackage {
1308 "prompt-toolkit" = super.buildPythonPackage {
1309 name = "prompt-toolkit-1.0.18";
1309 name = "prompt-toolkit-1.0.18";
1310 doCheck = false;
1310 doCheck = false;
1311 propagatedBuildInputs = [
1311 propagatedBuildInputs = [
1312 self."six"
1312 self."six"
1313 self."wcwidth"
1313 self."wcwidth"
1314 ];
1314 ];
1315 src = fetchurl {
1315 src = fetchurl {
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1318 };
1318 };
1319 meta = {
1319 meta = {
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1321 };
1321 };
1322 };
1322 };
1323 "psutil" = super.buildPythonPackage {
1323 "psutil" = super.buildPythonPackage {
1324 name = "psutil-5.7.0";
1324 name = "psutil-5.7.0";
1325 doCheck = false;
1325 doCheck = false;
1326 src = fetchurl {
1326 src = fetchurl {
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1329 };
1329 };
1330 meta = {
1330 meta = {
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1332 };
1332 };
1333 };
1333 };
1334 "psycopg2" = super.buildPythonPackage {
1334 "psycopg2" = super.buildPythonPackage {
1335 name = "psycopg2-2.8.4";
1335 name = "psycopg2-2.8.4";
1336 doCheck = false;
1336 doCheck = false;
1337 src = fetchurl {
1337 src = fetchurl {
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1340 };
1340 };
1341 meta = {
1341 meta = {
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1343 };
1343 };
1344 };
1344 };
1345 "ptyprocess" = super.buildPythonPackage {
1345 "ptyprocess" = super.buildPythonPackage {
1346 name = "ptyprocess-0.6.0";
1346 name = "ptyprocess-0.6.0";
1347 doCheck = false;
1347 doCheck = false;
1348 src = fetchurl {
1348 src = fetchurl {
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1351 };
1351 };
1352 meta = {
1352 meta = {
1353 license = [ ];
1353 license = [ ];
1354 };
1354 };
1355 };
1355 };
1356 "py" = super.buildPythonPackage {
1356 "py" = super.buildPythonPackage {
1357 name = "py-1.8.0";
1357 name = "py-1.8.0";
1358 doCheck = false;
1358 doCheck = false;
1359 src = fetchurl {
1359 src = fetchurl {
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1362 };
1362 };
1363 meta = {
1363 meta = {
1364 license = [ pkgs.lib.licenses.mit ];
1364 license = [ pkgs.lib.licenses.mit ];
1365 };
1365 };
1366 };
1366 };
1367 "py-bcrypt" = super.buildPythonPackage {
1367 "py-bcrypt" = super.buildPythonPackage {
1368 name = "py-bcrypt-0.4";
1368 name = "py-bcrypt-0.4";
1369 doCheck = false;
1369 doCheck = false;
1370 src = fetchurl {
1370 src = fetchurl {
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1373 };
1373 };
1374 meta = {
1374 meta = {
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1376 };
1376 };
1377 };
1377 };
1378 "py-gfm" = super.buildPythonPackage {
1378 "py-gfm" = super.buildPythonPackage {
1379 name = "py-gfm-0.1.4";
1379 name = "py-gfm-0.1.4";
1380 doCheck = false;
1380 doCheck = false;
1381 propagatedBuildInputs = [
1381 propagatedBuildInputs = [
1382 self."setuptools"
1382 self."setuptools"
1383 self."markdown"
1383 self."markdown"
1384 ];
1384 ];
1385 src = fetchurl {
1385 src = fetchurl {
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1388 };
1388 };
1389 meta = {
1389 meta = {
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1391 };
1391 };
1392 };
1392 };
1393 "pyasn1" = super.buildPythonPackage {
1393 "pyasn1" = super.buildPythonPackage {
1394 name = "pyasn1-0.4.8";
1394 name = "pyasn1-0.4.8";
1395 doCheck = false;
1395 doCheck = false;
1396 src = fetchurl {
1396 src = fetchurl {
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1399 };
1399 };
1400 meta = {
1400 meta = {
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1402 };
1402 };
1403 };
1403 };
1404 "pyasn1-modules" = super.buildPythonPackage {
1404 "pyasn1-modules" = super.buildPythonPackage {
1405 name = "pyasn1-modules-0.2.6";
1405 name = "pyasn1-modules-0.2.6";
1406 doCheck = false;
1406 doCheck = false;
1407 propagatedBuildInputs = [
1407 propagatedBuildInputs = [
1408 self."pyasn1"
1408 self."pyasn1"
1409 ];
1409 ];
1410 src = fetchurl {
1410 src = fetchurl {
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1413 };
1413 };
1414 meta = {
1414 meta = {
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1416 };
1416 };
1417 };
1417 };
1418 "pycparser" = super.buildPythonPackage {
1418 "pycparser" = super.buildPythonPackage {
1419 name = "pycparser-2.20";
1419 name = "pycparser-2.20";
1420 doCheck = false;
1420 doCheck = false;
1421 src = fetchurl {
1421 src = fetchurl {
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1424 };
1424 };
1425 meta = {
1425 meta = {
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1427 };
1427 };
1428 };
1428 };
1429 "pycrypto" = super.buildPythonPackage {
1429 "pycrypto" = super.buildPythonPackage {
1430 name = "pycrypto-2.6.1";
1430 name = "pycrypto-2.6.1";
1431 doCheck = false;
1431 doCheck = false;
1432 src = fetchurl {
1432 src = fetchurl {
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1435 };
1435 };
1436 meta = {
1436 meta = {
1437 license = [ pkgs.lib.licenses.publicDomain ];
1437 license = [ pkgs.lib.licenses.publicDomain ];
1438 };
1438 };
1439 };
1439 };
1440 "pycurl" = super.buildPythonPackage {
1440 "pycurl" = super.buildPythonPackage {
1441 name = "pycurl-7.43.0.3";
1441 name = "pycurl-7.43.0.3";
1442 doCheck = false;
1442 doCheck = false;
1443 src = fetchurl {
1443 src = fetchurl {
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1446 };
1446 };
1447 meta = {
1447 meta = {
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1449 };
1449 };
1450 };
1450 };
1451 "pygments" = super.buildPythonPackage {
1451 "pygments" = super.buildPythonPackage {
1452 name = "pygments-2.4.2";
1452 name = "pygments-2.4.2";
1453 doCheck = false;
1453 doCheck = false;
1454 src = fetchurl {
1454 src = fetchurl {
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1457 };
1457 };
1458 meta = {
1458 meta = {
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1460 };
1460 };
1461 };
1461 };
1462 "pymysql" = super.buildPythonPackage {
1462 "pymysql" = super.buildPythonPackage {
1463 name = "pymysql-0.8.1";
1463 name = "pymysql-0.8.1";
1464 doCheck = false;
1464 doCheck = false;
1465 src = fetchurl {
1465 src = fetchurl {
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1468 };
1468 };
1469 meta = {
1469 meta = {
1470 license = [ pkgs.lib.licenses.mit ];
1470 license = [ pkgs.lib.licenses.mit ];
1471 };
1471 };
1472 };
1472 };
1473 "pyotp" = super.buildPythonPackage {
1473 "pyotp" = super.buildPythonPackage {
1474 name = "pyotp-2.3.0";
1474 name = "pyotp-2.3.0";
1475 doCheck = false;
1475 doCheck = false;
1476 src = fetchurl {
1476 src = fetchurl {
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1479 };
1479 };
1480 meta = {
1480 meta = {
1481 license = [ pkgs.lib.licenses.mit ];
1481 license = [ pkgs.lib.licenses.mit ];
1482 };
1482 };
1483 };
1483 };
1484 "pyparsing" = super.buildPythonPackage {
1484 "pyparsing" = super.buildPythonPackage {
1485 name = "pyparsing-2.4.7";
1485 name = "pyparsing-2.4.7";
1486 doCheck = false;
1486 doCheck = false;
1487 src = fetchurl {
1487 src = fetchurl {
1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1490 };
1490 };
1491 meta = {
1491 meta = {
1492 license = [ pkgs.lib.licenses.mit ];
1492 license = [ pkgs.lib.licenses.mit ];
1493 };
1493 };
1494 };
1494 };
1495 "pyramid" = super.buildPythonPackage {
1495 "pyramid" = super.buildPythonPackage {
1496 name = "pyramid-1.10.4";
1496 name = "pyramid-1.10.4";
1497 doCheck = false;
1497 doCheck = false;
1498 propagatedBuildInputs = [
1498 propagatedBuildInputs = [
1499 self."hupper"
1499 self."hupper"
1500 self."plaster"
1500 self."plaster"
1501 self."plaster-pastedeploy"
1501 self."plaster-pastedeploy"
1502 self."setuptools"
1502 self."setuptools"
1503 self."translationstring"
1503 self."translationstring"
1504 self."venusian"
1504 self."venusian"
1505 self."webob"
1505 self."webob"
1506 self."zope.deprecation"
1506 self."zope.deprecation"
1507 self."zope.interface"
1507 self."zope.interface"
1508 self."repoze.lru"
1508 self."repoze.lru"
1509 ];
1509 ];
1510 src = fetchurl {
1510 src = fetchurl {
1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1513 };
1513 };
1514 meta = {
1514 meta = {
1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1516 };
1516 };
1517 };
1517 };
1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1519 name = "pyramid-debugtoolbar-4.6.1";
1519 name = "pyramid-debugtoolbar-4.6.1";
1520 doCheck = false;
1520 doCheck = false;
1521 propagatedBuildInputs = [
1521 propagatedBuildInputs = [
1522 self."pyramid"
1522 self."pyramid"
1523 self."pyramid-mako"
1523 self."pyramid-mako"
1524 self."repoze.lru"
1524 self."repoze.lru"
1525 self."pygments"
1525 self."pygments"
1526 self."ipaddress"
1526 self."ipaddress"
1527 ];
1527 ];
1528 src = fetchurl {
1528 src = fetchurl {
1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1531 };
1531 };
1532 meta = {
1532 meta = {
1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1534 };
1534 };
1535 };
1535 };
1536 "pyramid-jinja2" = super.buildPythonPackage {
1536 "pyramid-jinja2" = super.buildPythonPackage {
1537 name = "pyramid-jinja2-2.7";
1537 name = "pyramid-jinja2-2.7";
1538 doCheck = false;
1538 doCheck = false;
1539 propagatedBuildInputs = [
1539 propagatedBuildInputs = [
1540 self."pyramid"
1540 self."pyramid"
1541 self."zope.deprecation"
1541 self."zope.deprecation"
1542 self."jinja2"
1542 self."jinja2"
1543 self."markupsafe"
1543 self."markupsafe"
1544 ];
1544 ];
1545 src = fetchurl {
1545 src = fetchurl {
1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1548 };
1548 };
1549 meta = {
1549 meta = {
1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1551 };
1551 };
1552 };
1552 };
1553 "pyramid-apispec" = super.buildPythonPackage {
1553 "pyramid-apispec" = super.buildPythonPackage {
1554 name = "pyramid-apispec-0.3.2";
1554 name = "pyramid-apispec-0.3.2";
1555 doCheck = false;
1555 doCheck = false;
1556 propagatedBuildInputs = [
1556 propagatedBuildInputs = [
1557 self."apispec"
1557 self."apispec"
1558 ];
1558 ];
1559 src = fetchurl {
1559 src = fetchurl {
1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1562 };
1562 };
1563 meta = {
1563 meta = {
1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1565 };
1565 };
1566 };
1566 };
1567 "pyramid-mailer" = super.buildPythonPackage {
1567 "pyramid-mailer" = super.buildPythonPackage {
1568 name = "pyramid-mailer-0.15.1";
1568 name = "pyramid-mailer-0.15.1";
1569 doCheck = false;
1569 doCheck = false;
1570 propagatedBuildInputs = [
1570 propagatedBuildInputs = [
1571 self."pyramid"
1571 self."pyramid"
1572 self."repoze.sendmail"
1572 self."repoze.sendmail"
1573 self."transaction"
1573 self."transaction"
1574 ];
1574 ];
1575 src = fetchurl {
1575 src = fetchurl {
1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1578 };
1578 };
1579 meta = {
1579 meta = {
1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1581 };
1581 };
1582 };
1582 };
1583 "pyramid-mako" = super.buildPythonPackage {
1583 "pyramid-mako" = super.buildPythonPackage {
1584 name = "pyramid-mako-1.1.0";
1584 name = "pyramid-mako-1.1.0";
1585 doCheck = false;
1585 doCheck = false;
1586 propagatedBuildInputs = [
1586 propagatedBuildInputs = [
1587 self."pyramid"
1587 self."pyramid"
1588 self."mako"
1588 self."mako"
1589 ];
1589 ];
1590 src = fetchurl {
1590 src = fetchurl {
1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1593 };
1593 };
1594 meta = {
1594 meta = {
1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1596 };
1596 };
1597 };
1597 };
1598 "pysqlite" = super.buildPythonPackage {
1598 "pysqlite" = super.buildPythonPackage {
1599 name = "pysqlite-2.8.3";
1599 name = "pysqlite-2.8.3";
1600 doCheck = false;
1600 doCheck = false;
1601 src = fetchurl {
1601 src = fetchurl {
1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1604 };
1604 };
1605 meta = {
1605 meta = {
1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1607 };
1607 };
1608 };
1608 };
1609 "pytest" = super.buildPythonPackage {
1609 "pytest" = super.buildPythonPackage {
1610 name = "pytest-4.6.5";
1610 name = "pytest-4.6.5";
1611 doCheck = false;
1611 doCheck = false;
1612 propagatedBuildInputs = [
1612 propagatedBuildInputs = [
1613 self."py"
1613 self."py"
1614 self."six"
1614 self."six"
1615 self."packaging"
1615 self."packaging"
1616 self."attrs"
1616 self."attrs"
1617 self."atomicwrites"
1617 self."atomicwrites"
1618 self."pluggy"
1618 self."pluggy"
1619 self."importlib-metadata"
1619 self."importlib-metadata"
1620 self."wcwidth"
1620 self."wcwidth"
1621 self."funcsigs"
1621 self."funcsigs"
1622 self."pathlib2"
1622 self."pathlib2"
1623 self."more-itertools"
1623 self."more-itertools"
1624 ];
1624 ];
1625 src = fetchurl {
1625 src = fetchurl {
1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1628 };
1628 };
1629 meta = {
1629 meta = {
1630 license = [ pkgs.lib.licenses.mit ];
1630 license = [ pkgs.lib.licenses.mit ];
1631 };
1631 };
1632 };
1632 };
1633 "pytest-cov" = super.buildPythonPackage {
1633 "pytest-cov" = super.buildPythonPackage {
1634 name = "pytest-cov-2.7.1";
1634 name = "pytest-cov-2.7.1";
1635 doCheck = false;
1635 doCheck = false;
1636 propagatedBuildInputs = [
1636 propagatedBuildInputs = [
1637 self."pytest"
1637 self."pytest"
1638 self."coverage"
1638 self."coverage"
1639 ];
1639 ];
1640 src = fetchurl {
1640 src = fetchurl {
1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1643 };
1643 };
1644 meta = {
1644 meta = {
1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1646 };
1646 };
1647 };
1647 };
1648 "pytest-profiling" = super.buildPythonPackage {
1648 "pytest-profiling" = super.buildPythonPackage {
1649 name = "pytest-profiling-1.7.0";
1649 name = "pytest-profiling-1.7.0";
1650 doCheck = false;
1650 doCheck = false;
1651 propagatedBuildInputs = [
1651 propagatedBuildInputs = [
1652 self."six"
1652 self."six"
1653 self."pytest"
1653 self."pytest"
1654 self."gprof2dot"
1654 self."gprof2dot"
1655 ];
1655 ];
1656 src = fetchurl {
1656 src = fetchurl {
1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1659 };
1659 };
1660 meta = {
1660 meta = {
1661 license = [ pkgs.lib.licenses.mit ];
1661 license = [ pkgs.lib.licenses.mit ];
1662 };
1662 };
1663 };
1663 };
1664 "pytest-runner" = super.buildPythonPackage {
1664 "pytest-runner" = super.buildPythonPackage {
1665 name = "pytest-runner-5.1";
1665 name = "pytest-runner-5.1";
1666 doCheck = false;
1666 doCheck = false;
1667 src = fetchurl {
1667 src = fetchurl {
1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1670 };
1670 };
1671 meta = {
1671 meta = {
1672 license = [ pkgs.lib.licenses.mit ];
1672 license = [ pkgs.lib.licenses.mit ];
1673 };
1673 };
1674 };
1674 };
1675 "pytest-sugar" = super.buildPythonPackage {
1675 "pytest-sugar" = super.buildPythonPackage {
1676 name = "pytest-sugar-0.9.2";
1676 name = "pytest-sugar-0.9.2";
1677 doCheck = false;
1677 doCheck = false;
1678 propagatedBuildInputs = [
1678 propagatedBuildInputs = [
1679 self."pytest"
1679 self."pytest"
1680 self."termcolor"
1680 self."termcolor"
1681 self."packaging"
1681 self."packaging"
1682 ];
1682 ];
1683 src = fetchurl {
1683 src = fetchurl {
1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1686 };
1686 };
1687 meta = {
1687 meta = {
1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1689 };
1689 };
1690 };
1690 };
1691 "pytest-timeout" = super.buildPythonPackage {
1691 "pytest-timeout" = super.buildPythonPackage {
1692 name = "pytest-timeout-1.3.3";
1692 name = "pytest-timeout-1.3.3";
1693 doCheck = false;
1693 doCheck = false;
1694 propagatedBuildInputs = [
1694 propagatedBuildInputs = [
1695 self."pytest"
1695 self."pytest"
1696 ];
1696 ];
1697 src = fetchurl {
1697 src = fetchurl {
1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1700 };
1700 };
1701 meta = {
1701 meta = {
1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1703 };
1703 };
1704 };
1704 };
1705 "python-dateutil" = super.buildPythonPackage {
1705 "python-dateutil" = super.buildPythonPackage {
1706 name = "python-dateutil-2.8.1";
1706 name = "python-dateutil-2.8.1";
1707 doCheck = false;
1707 doCheck = false;
1708 propagatedBuildInputs = [
1708 propagatedBuildInputs = [
1709 self."six"
1709 self."six"
1710 ];
1710 ];
1711 src = fetchurl {
1711 src = fetchurl {
1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1714 };
1714 };
1715 meta = {
1715 meta = {
1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1717 };
1717 };
1718 };
1718 };
1719 "python-editor" = super.buildPythonPackage {
1719 "python-editor" = super.buildPythonPackage {
1720 name = "python-editor-1.0.4";
1720 name = "python-editor-1.0.4";
1721 doCheck = false;
1721 doCheck = false;
1722 src = fetchurl {
1722 src = fetchurl {
1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1725 };
1725 };
1726 meta = {
1726 meta = {
1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1728 };
1728 };
1729 };
1729 };
1730 "python-ldap" = super.buildPythonPackage {
1730 "python-ldap" = super.buildPythonPackage {
1731 name = "python-ldap-3.2.0";
1731 name = "python-ldap-3.2.0";
1732 doCheck = false;
1732 doCheck = false;
1733 propagatedBuildInputs = [
1733 propagatedBuildInputs = [
1734 self."pyasn1"
1734 self."pyasn1"
1735 self."pyasn1-modules"
1735 self."pyasn1-modules"
1736 ];
1736 ];
1737 src = fetchurl {
1737 src = fetchurl {
1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1740 };
1740 };
1741 meta = {
1741 meta = {
1742 license = [ pkgs.lib.licenses.psfl ];
1742 license = [ pkgs.lib.licenses.psfl ];
1743 };
1743 };
1744 };
1744 };
1745 "python-memcached" = super.buildPythonPackage {
1745 "python-memcached" = super.buildPythonPackage {
1746 name = "python-memcached-1.59";
1746 name = "python-memcached-1.59";
1747 doCheck = false;
1747 doCheck = false;
1748 propagatedBuildInputs = [
1748 propagatedBuildInputs = [
1749 self."six"
1749 self."six"
1750 ];
1750 ];
1751 src = fetchurl {
1751 src = fetchurl {
1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1754 };
1754 };
1755 meta = {
1755 meta = {
1756 license = [ pkgs.lib.licenses.psfl ];
1756 license = [ pkgs.lib.licenses.psfl ];
1757 };
1757 };
1758 };
1758 };
1759 "python-pam" = super.buildPythonPackage {
1759 "python-pam" = super.buildPythonPackage {
1760 name = "python-pam-1.8.4";
1760 name = "python-pam-1.8.4";
1761 doCheck = false;
1761 doCheck = false;
1762 src = fetchurl {
1762 src = fetchurl {
1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1765 };
1765 };
1766 meta = {
1766 meta = {
1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1768 };
1768 };
1769 };
1769 };
1770 "python-saml" = super.buildPythonPackage {
1770 "python-saml" = super.buildPythonPackage {
1771 name = "python-saml-2.4.2";
1771 name = "python-saml-2.4.2";
1772 doCheck = false;
1772 doCheck = false;
1773 propagatedBuildInputs = [
1773 propagatedBuildInputs = [
1774 self."dm.xmlsec.binding"
1774 self."dm.xmlsec.binding"
1775 self."isodate"
1775 self."isodate"
1776 self."defusedxml"
1776 self."defusedxml"
1777 ];
1777 ];
1778 src = fetchurl {
1778 src = fetchurl {
1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1781 };
1781 };
1782 meta = {
1782 meta = {
1783 license = [ pkgs.lib.licenses.mit ];
1783 license = [ pkgs.lib.licenses.mit ];
1784 };
1784 };
1785 };
1785 };
1786 "pytz" = super.buildPythonPackage {
1786 "pytz" = super.buildPythonPackage {
1787 name = "pytz-2019.3";
1787 name = "pytz-2019.3";
1788 doCheck = false;
1788 doCheck = false;
1789 src = fetchurl {
1789 src = fetchurl {
1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1792 };
1792 };
1793 meta = {
1793 meta = {
1794 license = [ pkgs.lib.licenses.mit ];
1794 license = [ pkgs.lib.licenses.mit ];
1795 };
1795 };
1796 };
1796 };
1797 "pyzmq" = super.buildPythonPackage {
1797 "pyzmq" = super.buildPythonPackage {
1798 name = "pyzmq-14.6.0";
1798 name = "pyzmq-14.6.0";
1799 doCheck = false;
1799 doCheck = false;
1800 src = fetchurl {
1800 src = fetchurl {
1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1803 };
1803 };
1804 meta = {
1804 meta = {
1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1806 };
1806 };
1807 };
1807 };
1808 "PyYAML" = super.buildPythonPackage {
1808 "PyYAML" = super.buildPythonPackage {
1809 name = "PyYAML-5.3.1";
1809 name = "PyYAML-5.3.1";
1810 doCheck = false;
1810 doCheck = false;
1811 src = fetchurl {
1811 src = fetchurl {
1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1814 };
1814 };
1815 meta = {
1815 meta = {
1816 license = [ pkgs.lib.licenses.mit ];
1816 license = [ pkgs.lib.licenses.mit ];
1817 };
1817 };
1818 };
1818 };
1819 "regex" = super.buildPythonPackage {
1819 "regex" = super.buildPythonPackage {
1820 name = "regex-2020.9.27";
1820 name = "regex-2020.9.27";
1821 doCheck = false;
1821 doCheck = false;
1822 src = fetchurl {
1822 src = fetchurl {
1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1825 };
1825 };
1826 meta = {
1826 meta = {
1827 license = [ pkgs.lib.licenses.psfl ];
1827 license = [ pkgs.lib.licenses.psfl ];
1828 };
1828 };
1829 };
1829 };
1830 "redis" = super.buildPythonPackage {
1830 "redis" = super.buildPythonPackage {
1831 name = "redis-3.5.3";
1831 name = "redis-3.5.3";
1832 doCheck = false;
1832 doCheck = false;
1833 src = fetchurl {
1833 src = fetchurl {
1834 url = "https://files.pythonhosted.org/packages/b3/17/1e567ff78c83854e16b98694411fe6e08c3426af866ad11397cddceb80d3/redis-3.5.3.tar.gz";
1834 url = "https://files.pythonhosted.org/packages/b3/17/1e567ff78c83854e16b98694411fe6e08c3426af866ad11397cddceb80d3/redis-3.5.3.tar.gz";
1835 sha256 = "0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2";
1835 sha256 = "0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2";
1836 };
1836 };
1837 meta = {
1837 meta = {
1838 license = [ pkgs.lib.licenses.mit ];
1838 license = [ pkgs.lib.licenses.mit ];
1839 };
1839 };
1840 };
1840 };
1841 "repoze.lru" = super.buildPythonPackage {
1841 "repoze.lru" = super.buildPythonPackage {
1842 name = "repoze.lru-0.7";
1842 name = "repoze.lru-0.7";
1843 doCheck = false;
1843 doCheck = false;
1844 src = fetchurl {
1844 src = fetchurl {
1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1847 };
1847 };
1848 meta = {
1848 meta = {
1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1850 };
1850 };
1851 };
1851 };
1852 "repoze.sendmail" = super.buildPythonPackage {
1852 "repoze.sendmail" = super.buildPythonPackage {
1853 name = "repoze.sendmail-4.4.1";
1853 name = "repoze.sendmail-4.4.1";
1854 doCheck = false;
1854 doCheck = false;
1855 propagatedBuildInputs = [
1855 propagatedBuildInputs = [
1856 self."setuptools"
1856 self."setuptools"
1857 self."zope.interface"
1857 self."zope.interface"
1858 self."transaction"
1858 self."transaction"
1859 ];
1859 ];
1860 src = fetchurl {
1860 src = fetchurl {
1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1863 };
1863 };
1864 meta = {
1864 meta = {
1865 license = [ pkgs.lib.licenses.zpl21 ];
1865 license = [ pkgs.lib.licenses.zpl21 ];
1866 };
1866 };
1867 };
1867 };
1868 "requests" = super.buildPythonPackage {
1868 "requests" = super.buildPythonPackage {
1869 name = "requests-2.22.0";
1869 name = "requests-2.22.0";
1870 doCheck = false;
1870 doCheck = false;
1871 propagatedBuildInputs = [
1871 propagatedBuildInputs = [
1872 self."chardet"
1872 self."chardet"
1873 self."idna"
1873 self."idna"
1874 self."urllib3"
1874 self."urllib3"
1875 self."certifi"
1875 self."certifi"
1876 ];
1876 ];
1877 src = fetchurl {
1877 src = fetchurl {
1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1880 };
1880 };
1881 meta = {
1881 meta = {
1882 license = [ pkgs.lib.licenses.asl20 ];
1882 license = [ pkgs.lib.licenses.asl20 ];
1883 };
1883 };
1884 };
1884 };
1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1886 name = "rhodecode-enterprise-ce-4.24.1";
1886 name = "rhodecode-enterprise-ce-4.25.0";
1887 buildInputs = [
1887 buildInputs = [
1888 self."pytest"
1888 self."pytest"
1889 self."py"
1889 self."py"
1890 self."pytest-cov"
1890 self."pytest-cov"
1891 self."pytest-sugar"
1891 self."pytest-sugar"
1892 self."pytest-runner"
1892 self."pytest-runner"
1893 self."pytest-profiling"
1893 self."pytest-profiling"
1894 self."pytest-timeout"
1894 self."pytest-timeout"
1895 self."gprof2dot"
1895 self."gprof2dot"
1896 self."mock"
1896 self."mock"
1897 self."cov-core"
1897 self."cov-core"
1898 self."coverage"
1898 self."coverage"
1899 self."webtest"
1899 self."webtest"
1900 self."beautifulsoup4"
1900 self."beautifulsoup4"
1901 self."configobj"
1901 self."configobj"
1902 ];
1902 ];
1903 doCheck = true;
1903 doCheck = true;
1904 propagatedBuildInputs = [
1904 propagatedBuildInputs = [
1905 self."amqp"
1905 self."amqp"
1906 self."babel"
1906 self."babel"
1907 self."beaker"
1907 self."beaker"
1908 self."bleach"
1908 self."bleach"
1909 self."celery"
1909 self."celery"
1910 self."channelstream"
1910 self."channelstream"
1911 self."click"
1911 self."click"
1912 self."colander"
1912 self."colander"
1913 self."configobj"
1913 self."configobj"
1914 self."cssselect"
1914 self."cssselect"
1915 self."cryptography"
1915 self."cryptography"
1916 self."decorator"
1916 self."decorator"
1917 self."deform"
1917 self."deform"
1918 self."docutils"
1918 self."docutils"
1919 self."dogpile.cache"
1919 self."dogpile.cache"
1920 self."dogpile.core"
1920 self."dogpile.core"
1921 self."formencode"
1921 self."formencode"
1922 self."future"
1922 self."future"
1923 self."futures"
1923 self."futures"
1924 self."infrae.cache"
1924 self."infrae.cache"
1925 self."iso8601"
1925 self."iso8601"
1926 self."itsdangerous"
1926 self."itsdangerous"
1927 self."kombu"
1927 self."kombu"
1928 self."lxml"
1928 self."lxml"
1929 self."mako"
1929 self."mako"
1930 self."markdown"
1930 self."markdown"
1931 self."markupsafe"
1931 self."markupsafe"
1932 self."msgpack-python"
1932 self."msgpack-python"
1933 self."pyotp"
1933 self."pyotp"
1934 self."packaging"
1934 self."packaging"
1935 self."pathlib2"
1935 self."pathlib2"
1936 self."paste"
1936 self."paste"
1937 self."pastedeploy"
1937 self."pastedeploy"
1938 self."pastescript"
1938 self."pastescript"
1939 self."peppercorn"
1939 self."peppercorn"
1940 self."premailer"
1940 self."premailer"
1941 self."psutil"
1941 self."psutil"
1942 self."py-bcrypt"
1942 self."py-bcrypt"
1943 self."pycurl"
1943 self."pycurl"
1944 self."pycrypto"
1944 self."pycrypto"
1945 self."pygments"
1945 self."pygments"
1946 self."pyparsing"
1946 self."pyparsing"
1947 self."pyramid-debugtoolbar"
1947 self."pyramid-debugtoolbar"
1948 self."pyramid-mako"
1948 self."pyramid-mako"
1949 self."pyramid"
1949 self."pyramid"
1950 self."pyramid-mailer"
1950 self."pyramid-mailer"
1951 self."python-dateutil"
1951 self."python-dateutil"
1952 self."python-ldap"
1952 self."python-ldap"
1953 self."python-memcached"
1953 self."python-memcached"
1954 self."python-pam"
1954 self."python-pam"
1955 self."python-saml"
1955 self."python-saml"
1956 self."pytz"
1956 self."pytz"
1957 self."tzlocal"
1957 self."tzlocal"
1958 self."pyzmq"
1958 self."pyzmq"
1959 self."py-gfm"
1959 self."py-gfm"
1960 self."regex"
1960 self."regex"
1961 self."redis"
1961 self."redis"
1962 self."repoze.lru"
1962 self."repoze.lru"
1963 self."requests"
1963 self."requests"
1964 self."routes"
1964 self."routes"
1965 self."simplejson"
1965 self."simplejson"
1966 self."six"
1966 self."six"
1967 self."sqlalchemy"
1967 self."sqlalchemy"
1968 self."sshpubkeys"
1968 self."sshpubkeys"
1969 self."subprocess32"
1969 self."subprocess32"
1970 self."supervisor"
1970 self."supervisor"
1971 self."translationstring"
1971 self."translationstring"
1972 self."urllib3"
1972 self."urllib3"
1973 self."urlobject"
1973 self."urlobject"
1974 self."venusian"
1974 self."venusian"
1975 self."weberror"
1975 self."weberror"
1976 self."webhelpers2"
1976 self."webhelpers2"
1977 self."webob"
1977 self."webob"
1978 self."whoosh"
1978 self."whoosh"
1979 self."wsgiref"
1979 self."wsgiref"
1980 self."zope.cachedescriptors"
1980 self."zope.cachedescriptors"
1981 self."zope.deprecation"
1981 self."zope.deprecation"
1982 self."zope.event"
1982 self."zope.event"
1983 self."zope.interface"
1983 self."zope.interface"
1984 self."mysql-python"
1984 self."mysql-python"
1985 self."pymysql"
1985 self."pymysql"
1986 self."pysqlite"
1986 self."pysqlite"
1987 self."psycopg2"
1987 self."psycopg2"
1988 self."nbconvert"
1988 self."nbconvert"
1989 self."nbformat"
1989 self."nbformat"
1990 self."jupyter-client"
1990 self."jupyter-client"
1991 self."jupyter-core"
1991 self."jupyter-core"
1992 self."alembic"
1992 self."alembic"
1993 self."invoke"
1993 self."invoke"
1994 self."bumpversion"
1994 self."bumpversion"
1995 self."gevent"
1995 self."gevent"
1996 self."greenlet"
1996 self."greenlet"
1997 self."gunicorn"
1997 self."gunicorn"
1998 self."waitress"
1998 self."waitress"
1999 self."ipdb"
1999 self."ipdb"
2000 self."ipython"
2000 self."ipython"
2001 self."rhodecode-tools"
2001 self."rhodecode-tools"
2002 self."appenlight-client"
2002 self."appenlight-client"
2003 self."pytest"
2003 self."pytest"
2004 self."py"
2004 self."py"
2005 self."pytest-cov"
2005 self."pytest-cov"
2006 self."pytest-sugar"
2006 self."pytest-sugar"
2007 self."pytest-runner"
2007 self."pytest-runner"
2008 self."pytest-profiling"
2008 self."pytest-profiling"
2009 self."pytest-timeout"
2009 self."pytest-timeout"
2010 self."gprof2dot"
2010 self."gprof2dot"
2011 self."mock"
2011 self."mock"
2012 self."cov-core"
2012 self."cov-core"
2013 self."coverage"
2013 self."coverage"
2014 self."webtest"
2014 self."webtest"
2015 self."beautifulsoup4"
2015 self."beautifulsoup4"
2016 ];
2016 ];
2017 src = ./.;
2017 src = ./.;
2018 meta = {
2018 meta = {
2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2020 };
2020 };
2021 };
2021 };
2022 "rhodecode-tools" = super.buildPythonPackage {
2022 "rhodecode-tools" = super.buildPythonPackage {
2023 name = "rhodecode-tools-1.4.0";
2023 name = "rhodecode-tools-1.4.0";
2024 doCheck = false;
2024 doCheck = false;
2025 propagatedBuildInputs = [
2025 propagatedBuildInputs = [
2026 self."click"
2026 self."click"
2027 self."future"
2027 self."future"
2028 self."six"
2028 self."six"
2029 self."mako"
2029 self."mako"
2030 self."markupsafe"
2030 self."markupsafe"
2031 self."requests"
2031 self."requests"
2032 self."urllib3"
2032 self."urllib3"
2033 self."whoosh"
2033 self."whoosh"
2034 self."elasticsearch"
2034 self."elasticsearch"
2035 self."elasticsearch-dsl"
2035 self."elasticsearch-dsl"
2036 self."elasticsearch2"
2036 self."elasticsearch2"
2037 self."elasticsearch1-dsl"
2037 self."elasticsearch1-dsl"
2038 ];
2038 ];
2039 src = fetchurl {
2039 src = fetchurl {
2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2042 };
2042 };
2043 meta = {
2043 meta = {
2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2045 };
2045 };
2046 };
2046 };
2047 "routes" = super.buildPythonPackage {
2047 "routes" = super.buildPythonPackage {
2048 name = "routes-2.4.1";
2048 name = "routes-2.4.1";
2049 doCheck = false;
2049 doCheck = false;
2050 propagatedBuildInputs = [
2050 propagatedBuildInputs = [
2051 self."six"
2051 self."six"
2052 self."repoze.lru"
2052 self."repoze.lru"
2053 ];
2053 ];
2054 src = fetchurl {
2054 src = fetchurl {
2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2057 };
2057 };
2058 meta = {
2058 meta = {
2059 license = [ pkgs.lib.licenses.mit ];
2059 license = [ pkgs.lib.licenses.mit ];
2060 };
2060 };
2061 };
2061 };
2062 "scandir" = super.buildPythonPackage {
2062 "scandir" = super.buildPythonPackage {
2063 name = "scandir-1.10.0";
2063 name = "scandir-1.10.0";
2064 doCheck = false;
2064 doCheck = false;
2065 src = fetchurl {
2065 src = fetchurl {
2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2068 };
2068 };
2069 meta = {
2069 meta = {
2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2071 };
2071 };
2072 };
2072 };
2073 "setproctitle" = super.buildPythonPackage {
2073 "setproctitle" = super.buildPythonPackage {
2074 name = "setproctitle-1.1.10";
2074 name = "setproctitle-1.1.10";
2075 doCheck = false;
2075 doCheck = false;
2076 src = fetchurl {
2076 src = fetchurl {
2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2079 };
2079 };
2080 meta = {
2080 meta = {
2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2082 };
2082 };
2083 };
2083 };
2084 "setuptools" = super.buildPythonPackage {
2084 "setuptools" = super.buildPythonPackage {
2085 name = "setuptools-44.1.0";
2085 name = "setuptools-44.1.0";
2086 doCheck = false;
2086 doCheck = false;
2087 src = fetchurl {
2087 src = fetchurl {
2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2090 };
2090 };
2091 meta = {
2091 meta = {
2092 license = [ pkgs.lib.licenses.mit ];
2092 license = [ pkgs.lib.licenses.mit ];
2093 };
2093 };
2094 };
2094 };
2095 "setuptools-scm" = super.buildPythonPackage {
2096 name = "setuptools-scm-3.5.0";
2097 doCheck = false;
2098 src = fetchurl {
2099 url = "https://files.pythonhosted.org/packages/b2/f7/60a645aae001a2e06cf4b8db2fba9d9f36b8fd378f10647e3e218b61b74b/setuptools_scm-3.5.0.tar.gz";
2100 sha256 = "5bdf21a05792903cafe7ae0c9501182ab52497614fa6b1750d9dbae7b60c1a87";
2101 };
2102 meta = {
2103 license = [ pkgs.lib.licenses.psfl ];
2104 };
2105 };
  "simplegeneric" = super.buildPythonPackage {
    name = "simplegeneric-0.8.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
      sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "simplejson" = super.buildPythonPackage {
    name = "simplejson-3.16.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
      sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
    };
    meta = {
      license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
    };
  };
  "six" = super.buildPythonPackage {
    name = "six-1.11.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
      sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "sqlalchemy" = super.buildPythonPackage {
    name = "sqlalchemy-1.3.15";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
      sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "sshpubkeys" = super.buildPythonPackage {
    name = "sshpubkeys-3.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."cryptography"
      self."ecdsa"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
      sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "subprocess32" = super.buildPythonPackage {
    name = "subprocess32-3.5.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
      sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "supervisor" = super.buildPythonPackage {
    name = "supervisor-4.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
      sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "tempita" = super.buildPythonPackage {
    name = "tempita-0.5.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
      sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "termcolor" = super.buildPythonPackage {
    name = "termcolor-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
      sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "testpath" = super.buildPythonPackage {
    name = "testpath-0.4.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
      sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
    };
    meta = {
      license = [ ];
    };
  };
  "traitlets" = super.buildPythonPackage {
    name = "traitlets-4.3.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."ipython-genutils"
      self."six"
      self."decorator"
      self."enum34"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
      sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "transaction" = super.buildPythonPackage {
    name = "transaction-2.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."zope.interface"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
      sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "translationstring" = super.buildPythonPackage {
    name = "translationstring-1.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
      sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
    };
    meta = {
      license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
    };
  };
  "tzlocal" = super.buildPythonPackage {
    name = "tzlocal-1.5.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytz"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
      sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "urllib3" = super.buildPythonPackage {
    name = "urllib3-1.25.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
      sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "urlobject" = super.buildPythonPackage {
    name = "urlobject-2.4.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
      sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
    };
    meta = {
      license = [ pkgs.lib.licenses.publicDomain ];
    };
  };
  "venusian" = super.buildPythonPackage {
    name = "venusian-1.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
      sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "vine" = super.buildPythonPackage {
    name = "vine-1.3.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
      sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "waitress" = super.buildPythonPackage {
    name = "waitress-1.3.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
      sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "wcwidth" = super.buildPythonPackage {
    name = "wcwidth-0.1.9";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
      sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webencodings" = super.buildPythonPackage {
    name = "webencodings-0.5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
      sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "weberror" = super.buildPythonPackage {
    name = "weberror-0.13.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."webob"
      self."tempita"
      self."pygments"
      self."paste"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
      sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webhelpers2" = super.buildPythonPackage {
    name = "webhelpers2-2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
      sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webob" = super.buildPythonPackage {
    name = "webob-1.8.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
      sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webtest" = super.buildPythonPackage {
    name = "webtest-2.0.34";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."webob"
      self."waitress"
      self."beautifulsoup4"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
      sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "whoosh" = super.buildPythonPackage {
    name = "whoosh-2.7.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
      sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
    };
  };
  "ws4py" = super.buildPythonPackage {
    name = "ws4py-0.5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
      sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "wsgiref" = super.buildPythonPackage {
    name = "wsgiref-0.1.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
      sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
    };
    meta = {
      license = [ { fullName = "PSF or ZPL"; } ];
    };
  };
  "zipp" = super.buildPythonPackage {
    name = "zipp-1.2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."contextlib2"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
      sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "zope.cachedescriptors" = super.buildPythonPackage {
    name = "zope.cachedescriptors-4.3.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
      sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "zope.deprecation" = super.buildPythonPackage {
    name = "zope.deprecation-4.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
      sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "zope.event" = super.buildPythonPackage {
    name = "zope.event-4.4";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
      sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "zope.interface" = super.buildPythonPackage {
    name = "zope.interface-4.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
      sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };

  ### Test requirements


}
@@ -1,1 +1,1 @@
-4.24.1
\ No newline at end of file
+4.25.0
\ No newline at end of file
@@ -1,2524 +1,2524 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import time

import rhodecode
from rhodecode.api import (
    jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
from rhodecode.api.utils import (
    has_superadmin_permission, Optional, OAttr, get_repo_or_error,
    get_user_group_or_error, get_user_or_error, validate_repo_permissions,
    get_perm_or_error, parse_args, get_origin, build_commit_data,
    validate_set_owner_permissions)
from rhodecode.lib import audit_logger, rc_cache, channelstream
from rhodecode.lib import repo_maintenance
from rhodecode.lib.auth import (
    HasPermissionAnyApi, HasUserGroupPermissionAnyApi,
    HasRepoPermissionAnyApi)
from rhodecode.lib.celerylib.utils import get_task_id
from rhodecode.lib.utils2 import (
    str2bool, time_to_datetime, safe_str, safe_int, safe_unicode)
from rhodecode.lib.ext_json import json
from rhodecode.lib.exceptions import (
    StatusChangeOnClosedPullRequestError, CommentVersionMismatch)
from rhodecode.lib.vcs import RepositoryError
from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
from rhodecode.model.changeset_status import ChangesetStatusModel
from rhodecode.model.comment import CommentsModel
from rhodecode.model.db import (
    Session, ChangesetStatus, RepositoryField, Repository, RepoGroup,
    ChangesetComment)
from rhodecode.model.permission import PermissionModel
from rhodecode.model.pull_request import PullRequestModel
from rhodecode.model.repo import RepoModel
53 from rhodecode.model.scm import ScmModel, RepoList
53 from rhodecode.model.scm import ScmModel, RepoList
54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
55 from rhodecode.model import validation_schema
55 from rhodecode.model import validation_schema
56 from rhodecode.model.validation_schema.schemas import repo_schema
56 from rhodecode.model.validation_schema.schemas import repo_schema
57
57
58 log = logging.getLogger(__name__)
58 log = logging.getLogger(__name__)
59
59
60
60
61 @jsonrpc_method()
61 @jsonrpc_method()
62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
63 """
63 """
64 Gets an existing repository by its name or repository_id.
64 Gets an existing repository by its name or repository_id.
65
65
66 The members section so the output returns users groups or users
66 The members section so the output returns users groups or users
67 associated with that repository.
67 associated with that repository.
68
68
69 This command can only be run using an |authtoken| with admin rights,
69 This command can only be run using an |authtoken| with admin rights,
70 or users with at least read rights to the |repo|.
70 or users with at least read rights to the |repo|.
71
71
72 :param apiuser: This is filled automatically from the |authtoken|.
72 :param apiuser: This is filled automatically from the |authtoken|.
73 :type apiuser: AuthUser
73 :type apiuser: AuthUser
74 :param repoid: The repository name or repository id.
74 :param repoid: The repository name or repository id.
75 :type repoid: str or int
75 :type repoid: str or int
76 :param cache: use the cached value for last changeset
76 :param cache: use the cached value for last changeset
77 :type: cache: Optional(bool)
77 :type: cache: Optional(bool)
78
78
79 Example output:
79 Example output:
80
80
81 .. code-block:: bash
81 .. code-block:: bash
82
82
83 {
83 {
84 "error": null,
84 "error": null,
85 "id": <repo_id>,
85 "id": <repo_id>,
86 "result": {
86 "result": {
87 "clone_uri": null,
87 "clone_uri": null,
88 "created_on": "timestamp",
88 "created_on": "timestamp",
89 "description": "repo description",
89 "description": "repo description",
90 "enable_downloads": false,
90 "enable_downloads": false,
91 "enable_locking": false,
91 "enable_locking": false,
92 "enable_statistics": false,
92 "enable_statistics": false,
93 "followers": [
93 "followers": [
94 {
94 {
95 "active": true,
95 "active": true,
96 "admin": false,
96 "admin": false,
97 "api_key": "****************************************",
97 "api_key": "****************************************",
98 "api_keys": [
98 "api_keys": [
99 "****************************************"
99 "****************************************"
100 ],
100 ],
101 "email": "user@example.com",
101 "email": "user@example.com",
102 "emails": [
102 "emails": [
103 "user@example.com"
103 "user@example.com"
104 ],
104 ],
105 "extern_name": "rhodecode",
105 "extern_name": "rhodecode",
106 "extern_type": "rhodecode",
106 "extern_type": "rhodecode",
107 "firstname": "username",
107 "firstname": "username",
108 "ip_addresses": [],
108 "ip_addresses": [],
109 "language": null,
109 "language": null,
110 "last_login": "2015-09-16T17:16:35.854",
110 "last_login": "2015-09-16T17:16:35.854",
111 "lastname": "surname",
111 "lastname": "surname",
112 "user_id": <user_id>,
112 "user_id": <user_id>,
113 "username": "name"
113 "username": "name"
114 }
114 }
115 ],
115 ],
116 "fork_of": "parent-repo",
116 "fork_of": "parent-repo",
117 "landing_rev": [
117 "landing_rev": [
118 "rev",
118 "rev",
119 "tip"
119 "tip"
120 ],
120 ],
121 "last_changeset": {
121 "last_changeset": {
122 "author": "User <user@example.com>",
122 "author": "User <user@example.com>",
123 "branch": "default",
123 "branch": "default",
124 "date": "timestamp",
124 "date": "timestamp",
125 "message": "last commit message",
125 "message": "last commit message",
126 "parents": [
126 "parents": [
127 {
127 {
128 "raw_id": "commit-id"
128 "raw_id": "commit-id"
129 }
129 }
130 ],
130 ],
131 "raw_id": "commit-id",
131 "raw_id": "commit-id",
132 "revision": <revision number>,
132 "revision": <revision number>,
133 "short_id": "short id"
133 "short_id": "short id"
134 },
134 },
135 "lock_reason": null,
135 "lock_reason": null,
136 "locked_by": null,
136 "locked_by": null,
137 "locked_date": null,
137 "locked_date": null,
138 "owner": "owner-name",
138 "owner": "owner-name",
139 "permissions": [
139 "permissions": [
140 {
140 {
141 "name": "super-admin-name",
141 "name": "super-admin-name",
142 "origin": "super-admin",
142 "origin": "super-admin",
143 "permission": "repository.admin",
143 "permission": "repository.admin",
144 "type": "user"
144 "type": "user"
145 },
145 },
146 {
146 {
147 "name": "owner-name",
147 "name": "owner-name",
148 "origin": "owner",
148 "origin": "owner",
149 "permission": "repository.admin",
149 "permission": "repository.admin",
150 "type": "user"
150 "type": "user"
151 },
151 },
152 {
152 {
153 "name": "user-group-name",
153 "name": "user-group-name",
154 "origin": "permission",
154 "origin": "permission",
155 "permission": "repository.write",
155 "permission": "repository.write",
156 "type": "user_group"
156 "type": "user_group"
157 }
157 }
158 ],
158 ],
159 "private": true,
159 "private": true,
160 "repo_id": 676,
160 "repo_id": 676,
161 "repo_name": "user-group/repo-name",
161 "repo_name": "user-group/repo-name",
162 "repo_type": "hg"
162 "repo_type": "hg"
163 }
163 }
164 }
164 }
165 """
165 """
166
166
167 repo = get_repo_or_error(repoid)
167 repo = get_repo_or_error(repoid)
168 cache = Optional.extract(cache)
168 cache = Optional.extract(cache)
169
169
170 include_secrets = False
170 include_secrets = False
171 if has_superadmin_permission(apiuser):
171 if has_superadmin_permission(apiuser):
172 include_secrets = True
172 include_secrets = True
173 else:
173 else:
174 # check if we have at least read permission for this repo !
174 # check if we have at least read permission for this repo !
175 _perms = (
175 _perms = (
176 'repository.admin', 'repository.write', 'repository.read',)
176 'repository.admin', 'repository.write', 'repository.read',)
177 validate_repo_permissions(apiuser, repoid, repo, _perms)
177 validate_repo_permissions(apiuser, repoid, repo, _perms)
178
178
179 permissions = []
179 permissions = []
180 for _user in repo.permissions():
180 for _user in repo.permissions():
181 user_data = {
181 user_data = {
182 'name': _user.username,
182 'name': _user.username,
183 'permission': _user.permission,
183 'permission': _user.permission,
184 'origin': get_origin(_user),
184 'origin': get_origin(_user),
185 'type': "user",
185 'type': "user",
186 }
186 }
187 permissions.append(user_data)
187 permissions.append(user_data)
188
188
189 for _user_group in repo.permission_user_groups():
189 for _user_group in repo.permission_user_groups():
190 user_group_data = {
190 user_group_data = {
191 'name': _user_group.users_group_name,
191 'name': _user_group.users_group_name,
192 'permission': _user_group.permission,
192 'permission': _user_group.permission,
193 'origin': get_origin(_user_group),
193 'origin': get_origin(_user_group),
194 'type': "user_group",
194 'type': "user_group",
195 }
195 }
196 permissions.append(user_group_data)
196 permissions.append(user_group_data)
197
197
198 following_users = [
198 following_users = [
199 user.user.get_api_data(include_secrets=include_secrets)
199 user.user.get_api_data(include_secrets=include_secrets)
200 for user in repo.followers]
200 for user in repo.followers]
201
201
202 if not cache:
202 if not cache:
203 repo.update_commit_cache()
203 repo.update_commit_cache()
204 data = repo.get_api_data(include_secrets=include_secrets)
204 data = repo.get_api_data(include_secrets=include_secrets)
205 data['permissions'] = permissions
205 data['permissions'] = permissions
206 data['followers'] = following_users
206 data['followers'] = following_users
207 return data
207 return data
208
208
209
209
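As an illustration of how a client would invoke `get_repo` (or any of the methods in this module) over JSON-RPC, here is a minimal sketch of the request payload a caller builds and POSTs to the API endpoint. The helper name, URL, and token are hypothetical placeholders, not part of this module:

```python
# Client-side sketch (illustrative only, not part of this module):
# every RhodeCode API call is a JSON body carrying id/auth_token/method/args.
import json


def build_jsonrpc_request(method, auth_token, request_id=1, **args):
    # `args` maps directly onto the keyword parameters of the API method,
    # e.g. repoid/cache for get_repo.
    return json.dumps({
        'id': request_id,
        'auth_token': auth_token,
        'method': method,
        'args': args,
    })


payload = build_jsonrpc_request(
    'get_repo', auth_token='<secret-token>',
    repoid='user-group/repo-name', cache=False)
```

The resulting string would then be sent as the body of an HTTP POST to the instance's `_admin/api` endpoint.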
@jsonrpc_method()
def get_repos(request, apiuser, root=Optional(None), traverse=Optional(True)):
    """
    Lists all existing repositories.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param root: specify root repository group to fetch repositories.
        filters the returned repositories to be members of given root group.
    :type root: Optional(None)
    :param traverse: traverse given root into subrepositories. With this flag
        set to False, it will only return top-level repositories from `root`.
        If `root` is empty, it will return just top-level repositories.
    :type traverse: Optional(True)


    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "repo_id" : "<repo_id>",
            "repo_name" : "<reponame>",
            "repo_type" : "<repo_type>",
            "clone_uri" : "<clone_uri>",
            "private" : "<bool>",
            "created_on" : "<datetimecreated>",
            "description" : "<description>",
            "landing_rev": "<landing_rev>",
            "owner": "<repo_owner>",
            "fork_of": "<name_of_fork_parent>",
            "enable_downloads": "<bool>",
            "enable_locking": "<bool>",
            "enable_statistics": "<bool>",
          },
          ...
        ]
        error: null
    """

    include_secrets = has_superadmin_permission(apiuser)
    _perms = ('repository.read', 'repository.write', 'repository.admin',)
    extras = {'user': apiuser}

    root = Optional.extract(root)
    traverse = Optional.extract(traverse, binary=True)

    if root:
        # verify parent existence; if it's missing, return an error
        parent = RepoGroup.get_by_group_name(root)
        if not parent:
            raise JSONRPCError(
                'Root repository group `{}` does not exist'.format(root))

        if traverse:
            repos = RepoModel().get_repos_for_root(root=root, traverse=traverse)
        else:
            repos = RepoModel().get_repos_for_root(root=parent)
    else:
        if traverse:
            repos = RepoModel().get_all()
        else:
            # return just top-level
            repos = RepoModel().get_repos_for_root(root=None)

    repo_list = RepoList(repos, perm_set=_perms, extra_kwargs=extras)
    return [repo.get_api_data(include_secrets=include_secrets)
            for repo in repo_list]

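The `root`/`traverse` semantics above can be shown with a standalone sketch that filters plain repository-name strings instead of real `Repository` objects (the helper below is illustrative, not part of RhodeCode's API):

```python
# Illustrative sketch of get_repos' root/traverse filtering semantics,
# operating on plain "group/name" strings rather than Repository objects.
def filter_repos(repo_names, root=None, traverse=True):
    prefix = root.rstrip('/') + '/' if root else ''
    result = []
    for name in repo_names:
        if not name.startswith(prefix):
            continue  # repo is outside the requested root group
        rest = name[len(prefix):]
        if not traverse and '/' in rest:
            continue  # skip nested repos when not traversing
        result.append(name)
    return result


names = ['top', 'grp/a', 'grp/sub/b']
assert filter_repos(names, root='grp') == ['grp/a', 'grp/sub/b']
assert filter_repos(names, root='grp', traverse=False) == ['grp/a']
assert filter_repos(names, root=None, traverse=False) == ['top']
```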
@jsonrpc_method()
def get_repo_changeset(request, apiuser, repoid, revision,
                       details=Optional('basic')):
    """
    Returns information about a changeset.

    Additional parameters define the amount of detail returned by
    this function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id
    :type repoid: str or int
    :param revision: revision for which listing should be done
    :type revision: str
    :param details: details can be 'basic|extended|full'; 'full' gives diff
        details like the diff itself, the number of changed files, etc.
    :type details: Optional(str)

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    changes_details = Optional.extract(details)
    _changes_details_types = ['basic', 'extended', 'full']
    if changes_details not in _changes_details_types:
        raise JSONRPCError(
            'ret_type must be one of %s' % (
                ','.join(_changes_details_types)))

    vcs_repo = repo.scm_instance()
    pre_load = ['author', 'branch', 'date', 'message', 'parents',
                'status', '_commit', '_file_paths']

    try:
        commit = repo.get_commit(commit_id=revision, pre_load=pre_load)
    except TypeError as e:
        raise JSONRPCError(safe_str(e))
    _cs_json = commit.__json__()
    _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
    if changes_details == 'full':
        _cs_json['refs'] = commit._get_refs()
    return _cs_json

@jsonrpc_method()
def get_repo_changesets(request, apiuser, repoid, start_rev, limit,
                        details=Optional('basic')):
    """
    Returns a set of commits limited by the number starting
    from the `start_rev` option.

    Additional parameters define the amount of detail returned by this
    function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param start_rev: The starting revision from where to get changesets.
    :type start_rev: str
    :param limit: Limit the number of commits to this amount
    :type limit: str or int
    :param details: Set the level of detail returned. Valid options are:
        ``basic``, ``extended`` and ``full``.
    :type details: Optional(str)

    .. note::

       Setting the parameter `details` to the value ``full`` is extensive
       and returns details like the diff itself, and the number
       of changed files.

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    changes_details = Optional.extract(details)
    _changes_details_types = ['basic', 'extended', 'full']
    if changes_details not in _changes_details_types:
        raise JSONRPCError(
            'ret_type must be one of %s' % (
                ','.join(_changes_details_types)))

    limit = int(limit)
    pre_load = ['author', 'branch', 'date', 'message', 'parents',
                'status', '_commit', '_file_paths']

    vcs_repo = repo.scm_instance()
    # SVN needs a special case to distinguish its index and commit id
    if vcs_repo and vcs_repo.alias == 'svn' and (start_rev == '0'):
        start_rev = vcs_repo.commit_ids[0]

    try:
        commits = vcs_repo.get_commits(
            start_id=start_rev, pre_load=pre_load, translate_tags=False)
    except TypeError as e:
        raise JSONRPCError(safe_str(e))
    except Exception:
        log.exception('Fetching of commits failed')
        raise JSONRPCError('Error occurred during commit fetching')

    ret = []
    for cnt, commit in enumerate(commits):
        # stop once `limit` commits were collected; -1 means no limit
        if cnt >= limit != -1:
            break
        _cs_json = commit.__json__()
        _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
        if changes_details == 'full':
            _cs_json['refs'] = {
                'branches': [commit.branch],
                'bookmarks': getattr(commit, 'bookmarks', []),
                'tags': commit.tags
            }
        ret.append(_cs_json)
    return ret

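The chained comparison `cnt >= limit != -1` in the loop above is terse: it evaluates as `cnt >= limit and limit != -1`, so `-1` acts as "no limit". A standalone sketch of that pattern (the helper name is illustrative, not part of this module):

```python
# Illustrative sketch of the limit handling used in get_repo_changesets:
# `cnt >= limit != -1` == `cnt >= limit and limit != -1`.
def take_limited(items, limit):
    out = []
    for cnt, item in enumerate(items):
        if cnt >= limit != -1:
            break  # collected `limit` items already
        out.append(item)
    return out


assert take_limited(range(10), 3) == [0, 1, 2]
assert take_limited(range(4), -1) == [0, 1, 2, 3]  # -1 disables the limit
```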
@jsonrpc_method()
def get_repo_nodes(request, apiuser, repoid, revision, root_path,
                   ret_type=Optional('all'), details=Optional('basic'),
                   max_file_bytes=Optional(None)):
    """
    Returns a list of nodes and children in a flat list for a given
    path at given revision.

    It's possible to specify ret_type to show only `files` or `dirs`.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param revision: The revision for which listing should be done.
    :type revision: str
    :param root_path: The path from which to start displaying.
    :type root_path: str
    :param ret_type: Set the return type. Valid options are
        ``all`` (default), ``files`` and ``dirs``.
    :type ret_type: Optional(str)
    :param details: Returns extended information about nodes, such as
        md5, binary, and/or content.
        The valid options are ``basic`` and ``full``.
    :type details: Optional(str)
    :param max_file_bytes: Only return file content for files smaller than
        this size in bytes.
    :type max_file_bytes: Optional(int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "binary": false,
            "content": "File line",
            "extension": "md",
            "lines": 2,
            "md5": "059fa5d29b19c0657e384749480f6422",
            "mimetype": "text/x-minidsrc",
            "name": "file.md",
            "size": 580,
            "type": "file"
          },
          ...
        ]
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    ret_type = Optional.extract(ret_type)
    details = Optional.extract(details)
    _extended_types = ['basic', 'full']
    if details not in _extended_types:
        raise JSONRPCError(
            'ret_type must be one of %s' % (','.join(_extended_types)))
    extended_info = False
    content = False
    if details == 'basic':
        extended_info = True

    if details == 'full':
        extended_info = content = True

    _map = {}
    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        _scm = repo.scm_instance()
        if _scm.is_empty():
            return []

        _d, _f = ScmModel().get_nodes(
            repo, revision, root_path, flat=False,
            extended_info=extended_info, content=content,
            max_file_bytes=max_file_bytes)
        _map = {
            'all': _d + _f,
            'files': _f,
            'dirs': _d,
        }
        return _map[ret_type]
    except KeyError:
        raise JSONRPCError(
            'ret_type must be one of %s' % (','.join(sorted(_map.keys()))))
    except Exception:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError(
            'failed to get repo: `%s` nodes' % repo.repo_name
        )

@jsonrpc_method()
def get_repo_file(request, apiuser, repoid, commit_id, file_path,
                  max_file_bytes=Optional(None), details=Optional('basic'),
                  cache=Optional(True)):
    """
    Returns a single file from repository at given revision.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param commit_id: The revision for which listing should be done.
    :type commit_id: str
    :param file_path: The path from which to start displaying.
    :type file_path: str
    :param details: Returns a different set of information about nodes.
        The valid options are ``minimal``, ``basic`` and ``full``.
    :type details: Optional(str)
    :param max_file_bytes: Only return file content for files smaller than
        this size in bytes.
    :type max_file_bytes: Optional(int)
    :param cache: Use internal caches for fetching files. If disabled, fetching
        files is slower but more memory efficient.
    :type cache: Optional(bool)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "binary": false,
          "extension": "py",
          "lines": 35,
          "content": "....",
          "md5": "76318336366b0f17ee249e11b0c99c41",
          "mimetype": "text/x-python",
          "name": "python.py",
          "size": 817,
          "type": "file",
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    cache = Optional.extract(cache, binary=True)
    details = Optional.extract(details)
    _extended_types = ['minimal', 'minimal+search', 'basic', 'full']
    if details not in _extended_types:
        raise JSONRPCError(
567 'ret_type must be one of %s, got %s' % (','.join(_extended_types)), details)
567 'ret_type must be one of %s, got %s' % (','.join(_extended_types)), details)
568 extended_info = False
568 extended_info = False
569 content = False
569 content = False
570
570
571 if details == 'minimal':
571 if details == 'minimal':
572 extended_info = False
572 extended_info = False
573
573
574 elif details == 'basic':
574 elif details == 'basic':
575 extended_info = True
575 extended_info = True
576
576
577 elif details == 'full':
577 elif details == 'full':
578 extended_info = content = True
578 extended_info = content = True
579
579
580 file_path = safe_unicode(file_path)
580 file_path = safe_unicode(file_path)
581 try:
581 try:
582 # check if repo is not empty by any chance, skip quicker if it is.
582 # check if repo is not empty by any chance, skip quicker if it is.
583 _scm = repo.scm_instance()
583 _scm = repo.scm_instance()
584 if _scm.is_empty():
584 if _scm.is_empty():
585 return None
585 return None
586
586
587 node = ScmModel().get_node(
587 node = ScmModel().get_node(
588 repo, commit_id, file_path, extended_info=extended_info,
588 repo, commit_id, file_path, extended_info=extended_info,
589 content=content, max_file_bytes=max_file_bytes, cache=cache)
589 content=content, max_file_bytes=max_file_bytes, cache=cache)
590 except NodeDoesNotExistError:
590 except NodeDoesNotExistError:
591 raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
591 raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
592 repo.repo_name, file_path, commit_id))
592 repo.repo_name, file_path, commit_id))
593 except Exception:
593 except Exception:
594 log.exception(u"Exception occurred while trying to get repo %s file",
594 log.exception(u"Exception occurred while trying to get repo %s file",
595 repo.repo_name)
595 repo.repo_name)
596 raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
596 raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
597 repo.repo_name, file_path))
597 repo.repo_name, file_path))
598
598
599 return node
599 return node
600
600
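A hedged client-side sketch of how the `get_repo_file` method above is invoked over JSON-RPC. The `{"id", "auth_token", "method", "args"}` payload shape follows RhodeCode's documented API convention; the helper name and the sample values are hypothetical, for illustration only.

```python
import json


def build_get_repo_file_payload(auth_token, repoid, commit_id, file_path,
                                details='basic', cache=True):
    # Hypothetical helper (not part of this module): builds the JSON-RPC
    # request body a client would POST to the instance's API endpoint to
    # call `get_repo_file`.
    return json.dumps({
        'id': 1,
        'auth_token': auth_token,
        'method': 'get_repo_file',
        'args': {
            'repoid': repoid,
            'commit_id': commit_id,
            'file_path': file_path,
            # one of: minimal, minimal+search, basic, full
            'details': details,
            'cache': cache,
        },
    })
```

A client would then POST this payload to the server's API URL (typically `/_admin/api`) with a `Content-Type: application/json` header.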

@jsonrpc_method()
def get_repo_fts_tree(request, apiuser, repoid, commit_id, root_path):
    """
    Returns a list of tree nodes for path at given revision. This API is built
    strictly for usage in full text search building, and shouldn't be consumed
    by other clients.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    repo_id = repo.repo_id
    cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
    cache_on = cache_seconds > 0

    cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
    region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

    # cache the computed tree in the `cache_repo` region; `cache_ver` is part
    # of the cache key, so bumping FILE_TREE_CACHE_VER invalidates old entries.
    @region.conditional_cache_on_arguments(condition=cache_on)
    def compute_fts_tree(cache_ver, repo_id, commit_id, root_path):
        return ScmModel().get_fts_data(repo_id, commit_id, root_path)

    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        _scm = repo.scm_instance()
        if _scm.is_empty():
            return []
    except RepositoryError:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)

    try:
        # we need to resolve commit_id to a FULL sha for cache to work correctly.
        # sending 'master' is a pointer that needs to be translated to current commit.
        commit_id = _scm.get_commit(commit_id=commit_id).raw_id
        log.debug(
            'Computing FTS REPO TREE for repo_id %s commit_id `%s` '
            'with caching: %s[TTL: %ss]',
            repo_id, commit_id, cache_on, cache_seconds or 0)

        tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path)
        return tree_files

    except Exception:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)


@jsonrpc_method()
def get_repo_refs(request, apiuser, repoid):
    """
    Returns a dictionary of current references. It returns
    bookmarks, branches, closed_branches, and tags for given repository.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        "result": {
            "bookmarks": {
                "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
                "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
            },
            "branches": {
                "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
                "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
            },
            "branches_closed": {},
            "tags": {
                "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
                "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
                "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
                "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17",
            }
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        # fetch all current references from the vcs backend.
        vcs_instance = repo.scm_instance()
        refs = vcs_instance.refs()
        return refs
    except Exception:
        log.exception("Exception occurred while trying to get repo refs")
        raise JSONRPCError(
            'failed to get repo: `%s` references' % repo.repo_name
        )

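A small consumer-side sketch of working with the dictionary returned by `get_repo_refs` above. The helper and its lookup order (bookmarks, then branches, then tags) are hypothetical, for illustration only; they are not part of this module.

```python
def resolve_ref(refs, name):
    # Hypothetical helper: given the refs dict returned by `get_repo_refs`,
    # resolve a ref name to its commit id, checking bookmarks first, then
    # branches, then tags. Returns None when the name matches nothing.
    for kind in ('bookmarks', 'branches', 'tags'):
        commit_id = refs.get(kind, {}).get(name)
        if commit_id:
            return commit_id
    return None
```
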
@jsonrpc_method()
def create_repo(
        request, apiuser, repo_name, repo_type,
        owner=Optional(OAttr('apiuser')),
        description=Optional(''),
        private=Optional(False),
        clone_uri=Optional(None),
        push_uri=Optional(None),
        landing_rev=Optional(None),
        enable_statistics=Optional(False),
        enable_locking=Optional(False),
        enable_downloads=Optional(False),
        copy_permissions=Optional(False)):
    """
    Creates a repository.

    * If the repository name contains "/", repository will be created inside
      a repository group or nested repository groups

      For example "foo/bar/repo1" will create |repo| called "repo1" inside
      group "foo/bar". You have to have permissions to access and write to
      the last repository group ("bar" in this example)

    This command can only be run using an |authtoken| with at least
    permissions to create repositories, or write permissions to
    parent repository groups.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repo_name: Set the repository name.
    :type repo_name: str
    :param repo_type: Set the repository type; 'hg', 'git', or 'svn'.
    :type repo_type: str
    :param owner: user_id or username
    :type owner: Optional(str)
    :param description: Set the repository description.
    :type description: Optional(str)
    :param private: set repository as private
    :type private: bool
    :param clone_uri: set clone_uri
    :type clone_uri: str
    :param push_uri: set push_uri
    :type push_uri: str
    :param landing_rev: <rev_type>:<rev>, e.g. branch:default, book:dev, rev:abcd
    :type landing_rev: str
    :param enable_locking:
    :type enable_locking: bool
    :param enable_downloads:
    :type enable_downloads: bool
    :param enable_statistics:
    :type enable_statistics: bool
    :param copy_permissions: Copy permission from group in which the
        repository is being created.
    :type copy_permissions: bool


    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Created new repository `<reponame>`",
            "success": true,
            "task": "<celery task id or None if done sync>"
        }
        error: null


    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'failed to create repository `<repo_name>`'
        }

    """

    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)
    push_uri = Optional.extract(push_uri)

    defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
    if isinstance(private, Optional):
        private = defs.get('repo_private') or Optional.extract(private)
    if isinstance(repo_type, Optional):
        repo_type = defs.get('repo_type')
    if isinstance(enable_statistics, Optional):
        enable_statistics = defs.get('repo_enable_statistics')
    if isinstance(enable_locking, Optional):
        enable_locking = defs.get('repo_enable_locking')
    if isinstance(enable_downloads, Optional):
        enable_downloads = defs.get('repo_enable_downloads')

    landing_ref, _label = ScmModel.backend_landing_ref(repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))

    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=repo_name,
            repo_type=repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_push_uri=push_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions,
            repo_enable_statistics=enable_statistics,
            repo_enable_downloads=enable_downloads,
            repo_enable_locking=enable_locking))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'owner': owner,
            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'repo_description': schema_data['repo_description'],
            'repo_private': schema_data['repo_private'],
            'clone_uri': schema_data['repo_clone_uri'],
            'push_uri': schema_data['repo_push_uri'],
            'repo_landing_rev': schema_data['repo_landing_commit_ref'],
            'enable_statistics': schema_data['repo_enable_statistics'],
            'enable_locking': schema_data['repo_enable_locking'],
            'enable_downloads': schema_data['repo_enable_downloads'],
            'repo_copy_permissions': schema_data['repo_copy_permissions'],
        }

        task = RepoModel().create(form_data=data, cur_user=owner.user_id)
        task_id = get_task_id(task)
        # no commit, it's done in RepoModel, or async via celery
        return {
            'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
            # cannot return the repo data here since fork can be done async
            'success': True,
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create the repository %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to create repository `%s`' % (schema_data['repo_name'],))

@jsonrpc_method()
def add_field_to_repo(request, apiuser, repoid, key, label=Optional(''),
                      description=Optional('')):
    """
    Adds an extra field to a repository.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository id.
    :type repoid: str or int
    :param key: Create a unique field key for this repository.
    :type key: str
    :param label:
    :type label: Optional(str)
    :param description:
    :type description: Optional(str)
    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    label = Optional.extract(label) or key
    description = Optional.extract(description)

    field = RepositoryField.get_by_key_name(key, repo)
    if field:
        raise JSONRPCError('Field with key '
                           '`%s` already exists for repo `%s`' % (key, repoid))

    try:
        RepoModel().add_repo_field(repo, key, field_label=label,
                                   field_desc=description)
        Session().commit()
        return {
            'msg': "Added new repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception("Exception occurred while trying to add field to repo")
        raise JSONRPCError(
            'failed to create new field for repository `%s`' % (repoid,))


@jsonrpc_method()
def remove_field_from_repo(request, apiuser, repoid, key):
    """
    Removes an extra field from a repository.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param key: Set the unique field key for this repository.
    :type key: str
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    field = RepositoryField.get_by_key_name(key, repo)
    if not field:
        raise JSONRPCError('Field with key `%s` does not '
                           'exist for repo `%s`' % (key, repoid))

    try:
        RepoModel().delete_repo_field(repo, field_key=key)
        Session().commit()
        return {
            'msg': "Deleted repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to delete field from repo")
        raise JSONRPCError(
            'failed to delete field for repository `%s`' % (repoid,))


964 @jsonrpc_method()
964 @jsonrpc_method()
965 def update_repo(
965 def update_repo(
966 request, apiuser, repoid, repo_name=Optional(None),
966 request, apiuser, repoid, repo_name=Optional(None),
967 owner=Optional(OAttr('apiuser')), description=Optional(''),
967 owner=Optional(OAttr('apiuser')), description=Optional(''),
968 private=Optional(False),
968 private=Optional(False),
969 clone_uri=Optional(None), push_uri=Optional(None),
969 clone_uri=Optional(None), push_uri=Optional(None),
970 landing_rev=Optional(None), fork_of=Optional(None),
970 landing_rev=Optional(None), fork_of=Optional(None),
971 enable_statistics=Optional(False),
971 enable_statistics=Optional(False),
972 enable_locking=Optional(False),
972 enable_locking=Optional(False),
973 enable_downloads=Optional(False), fields=Optional('')):
        enable_downloads=Optional(False), fields=Optional('')):
    """
    Updates a repository with the given information.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    * If the repository name contains "/", the repository will be updated
      accordingly within a repository group or nested repository groups.

      For example, repoid=repo-test name="foo/bar/repo-test" will update the
      |repo| called "repo-test" and place it inside the group "foo/bar".
      You must have permissions to access and write to the last repository
      group ("bar" in this example).

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: repository name or repository ID.
    :type repoid: str or int
    :param repo_name: Update the |repo| name, including the
        repository group it's in.
    :type repo_name: str
    :param owner: Set the |repo| owner.
    :type owner: str
    :param fork_of: Set the |repo| as a fork of another |repo|.
    :type fork_of: str
    :param description: Update the |repo| description.
    :type description: str
    :param private: Set the |repo| as private. (True | False)
    :type private: bool
    :param clone_uri: Update the |repo| clone URI.
    :type clone_uri: str
    :param landing_rev: Set the |repo| landing revision,
        e.g. branch:default, book:dev, rev:abcd
    :type landing_rev: str
    :param enable_statistics: Enable statistics on the |repo|, (True | False).
    :type enable_statistics: bool
    :param enable_locking: Enable |repo| locking.
    :type enable_locking: bool
    :param enable_downloads: Enable downloads from the |repo|, (True | False).
    :type enable_downloads: bool
    :param fields: Add extra fields to the |repo|. Use the following
        example format: ``field_key=field_val,field_key2=fieldval2``.
        Escape ', ' with \,
    :type fields: str
    """

    repo = get_repo_or_error(repoid)

    include_secrets = False
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    else:
        include_secrets = True

    updates = dict(
        repo_name=repo_name
        if not isinstance(repo_name, Optional) else repo.repo_name,

        fork_id=fork_of
        if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,

        user=owner
        if not isinstance(owner, Optional) else repo.user.username,

        repo_description=description
        if not isinstance(description, Optional) else repo.description,

        repo_private=private
        if not isinstance(private, Optional) else repo.private,

        clone_uri=clone_uri
        if not isinstance(clone_uri, Optional) else repo.clone_uri,

        push_uri=push_uri
        if not isinstance(push_uri, Optional) else repo.push_uri,

        repo_landing_rev=landing_rev
        if not isinstance(landing_rev, Optional) else repo._landing_revision,

        repo_enable_statistics=enable_statistics
        if not isinstance(enable_statistics, Optional) else repo.enable_statistics,

        repo_enable_locking=enable_locking
        if not isinstance(enable_locking, Optional) else repo.enable_locking,

        repo_enable_downloads=enable_downloads
        if not isinstance(enable_downloads, Optional) else repo.enable_downloads)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(
        request.translate, repo=repo)
    ref_choices = list(set(ref_choices + [landing_ref]))

    old_values = repo.get_api_data()
    repo_type = repo.repo_type
    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser,
        old_values=old_values)
    try:
        schema_data = schema.deserialize(dict(
            # we save old value, users cannot change type
            repo_type=repo_type,

            repo_name=updates['repo_name'],
            repo_owner=updates['user'],
            repo_description=updates['repo_description'],
            repo_clone_uri=updates['clone_uri'],
            repo_push_uri=updates['push_uri'],
            repo_fork_of=updates['fork_id'],
            repo_private=updates['repo_private'],
            repo_landing_commit_ref=updates['repo_landing_rev'],
            repo_enable_statistics=updates['repo_enable_statistics'],
            repo_enable_downloads=updates['repo_enable_downloads'],
            repo_enable_locking=updates['repo_enable_locking']))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    # save validated data back into the updates dict
    validated_updates = dict(
        repo_name=schema_data['repo_group']['repo_name_without_group'],
        repo_group=schema_data['repo_group']['repo_group_id'],

        user=schema_data['repo_owner'],
        repo_description=schema_data['repo_description'],
        repo_private=schema_data['repo_private'],
        clone_uri=schema_data['repo_clone_uri'],
        push_uri=schema_data['repo_push_uri'],
        repo_landing_rev=schema_data['repo_landing_commit_ref'],
        repo_enable_statistics=schema_data['repo_enable_statistics'],
        repo_enable_locking=schema_data['repo_enable_locking'],
        repo_enable_downloads=schema_data['repo_enable_downloads'],
    )

    if schema_data['repo_fork_of']:
        fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
        validated_updates['fork_id'] = fork_repo.repo_id

    # extra fields
    fields = parse_args(Optional.extract(fields), key_prefix='ex_')
    if fields:
        validated_updates.update(fields)

    try:
        RepoModel().update(repo, **validated_updates)
        audit_logger.store_api(
            'repo.edit', action_data={'old_data': old_values},
            user=apiuser, repo=repo)
        Session().commit()
        return {
            'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
            'repository': repo.get_api_data(include_secrets=include_secrets)
        }
    except Exception:
        log.exception(
            u"Exception while trying to update the repository %s",
            repoid)
        raise JSONRPCError('failed to update repo `%s`' % repoid)


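A minimal sketch of how ``update_repo`` might be invoked over RhodeCode's JSON-RPC endpoint. The URL and token below are placeholders, and the payload layout follows the request examples shown in these docstrings (recent versions name the token field ``auth_token``; older examples call it ``api_key``). Sending the request is left commented out.

```python
import json

# Placeholder values -- substitute your server URL and a real auth token.
API_URL = 'https://code.example.com/_admin/api'
AUTH_TOKEN = '<auth_token>'

def make_rpc_payload(method, **args):
    # RhodeCode API calls are a single JSON POST body with these four keys.
    return {'id': 1, 'auth_token': AUTH_TOKEN, 'method': method, 'args': args}

# Rename "repo-test" and move it into the "foo/bar" repository group,
# as described in the update_repo docstring.
payload = make_rpc_payload(
    'update_repo',
    repoid='repo-test',
    repo_name='foo/bar/repo-test',
    description='moved under foo/bar')

body = json.dumps(payload)
# e.g. requests.post(API_URL, data=body,
#                    headers={'content-type': 'application/json'})
```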
@jsonrpc_method()
def fork_repo(request, apiuser, repoid, fork_name,
              owner=Optional(OAttr('apiuser')),
              description=Optional(''),
              private=Optional(False),
              clone_uri=Optional(None),
              landing_rev=Optional(None),
              copy_permissions=Optional(False)):
    """
    Creates a fork of the specified |repo|.

    * If the fork_name contains "/", the fork will be created inside
      a repository group or nested repository groups.

      For example, "foo/bar/fork-repo" will create a fork called "fork-repo"
      inside the group "foo/bar". You must have permissions to access and
      write to the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least read
    permissions on the |repo| being forked, and fork creation permissions
    for the calling user.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set repository name or repository ID.
    :type repoid: str or int
    :param fork_name: Set the fork name, including its repository group membership.
    :type fork_name: str
    :param owner: Set the fork owner.
    :type owner: str
    :param description: Set the fork description.
    :type description: str
    :param copy_permissions: Copy permissions from parent |repo|. The
        default is False.
    :type copy_permissions: bool
    :param private: Make the fork private. The default is False.
    :type private: bool
    :param landing_rev: Set the landing revision, e.g. branch:default, book:dev, rev:abcd

    Example usage:

    .. code-block:: bash

        id : <id_for_response>
        api_key : "<api_key>"
        args: {
            "repoid" : "<reponame or repo_id>",
            "fork_name": "<forkname>",
            "owner": "<username or user_id = Optional(=apiuser)>",
            "description": "<description>",
            "copy_permissions": "<bool>",
            "private": "<bool>",
            "landing_rev": "<landing_rev>"
        }

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Created fork of `<reponame>` as `<forkname>`",
            "success": true,
            "task": "<celery task id or None if done sync>"
        }
        error: null

    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name

    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for
        # this repo that we fork !
        _perms = ('repository.admin', 'repository.write', 'repository.read')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

        # check if the regular user has at least fork permissions as well
        if not HasPermissionAnyApi(PermissionModel.FORKING_ENABLED)(user=apiuser):
            raise JSONRPCForbidden()

    # check if user can set owner parameter
    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))
    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    private = Optional.extract(private)

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo.repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=fork_name,
            repo_type=repo.repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'fork_parent_id': repo.repo_id,

            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'description': schema_data['repo_description'],
            'private': schema_data['repo_private'],
            'copy_permissions': schema_data['repo_copy_permissions'],
            'landing_rev': schema_data['repo_landing_commit_ref'],
        }

        task = RepoModel().create_fork(data, cur_user=owner.user_id)
        # no commit, it's done in RepoModel, or async via celery
        task_id = get_task_id(task)

        return {
            'msg': 'Created fork of `%s` as `%s`' % (
                repo.repo_name, schema_data['repo_name']),
            'success': True,  # cannot return the repo data here since fork
                              # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create fork %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to fork repository `%s` as `%s`' % (
                repo_name, schema_data['repo_name']))


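Because forks can be created asynchronously via celery, the ``fork_repo`` result carries a ``task`` id rather than the new repository's data. A small sketch of interpreting such a response follows; the response dict is a hand-written sample shaped like the docstring's output example, not captured from a live server.

```python
# Sample response shaped like the fork_repo docstring's output example;
# a real one would come back from the JSON-RPC endpoint.
response = {
    'id': 1,
    'result': {
        'msg': 'Created fork of `repo-test` as `foo/bar/fork-repo`',
        'success': True,
        'task': 'some-celery-task-id',  # None when the fork was made synchronously
    },
    'error': None,
}

def fork_status(resp):
    # Distinguish errors, async scheduling, and synchronous completion.
    if resp.get('error'):
        return 'error'
    task = resp['result'].get('task')
    return 'scheduled' if task else 'done'

status = fork_status(response)
```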
@jsonrpc_method()
def delete_repo(request, apiuser, repoid, forks=Optional('')):
    """
    Deletes a repository.

    * When the `forks` parameter is set, it is possible to detach or delete
      forks of the deleted repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param forks: Set to `detach` or `delete` forks from the |repo|.
    :type forks: Optional(str)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Deleted repository `<reponame>`",
            "success": true
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        handle_forks = Optional.extract(forks)
        _forks_msg = ''
        _forks = [f for f in repo.forks]
        if handle_forks == 'detach':
            _forks_msg = ' ' + 'Detached %s forks' % len(_forks)
        elif handle_forks == 'delete':
            _forks_msg = ' ' + 'Deleted %s forks' % len(_forks)
        elif _forks:
            raise JSONRPCError(
                'Cannot delete `%s` it still contains attached forks' %
                (repo.repo_name,)
            )
        old_data = repo.get_api_data()
        RepoModel().delete(repo, forks=forks)

        repo = audit_logger.RepoWrap(repo_id=None,
                                     repo_name=repo.repo_name)

        audit_logger.store_api(
            'repo.delete', action_data={'old_data': old_data},
            user=apiuser, repo=repo)

        ScmModel().mark_for_invalidation(repo_name, delete=True)
        Session().commit()
        return {
            'msg': 'Deleted repository `%s`%s' % (repo_name, _forks_msg),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to delete repo")
        raise JSONRPCError(
            'failed to delete repository `%s`' % (repo_name,)
        )


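The fork guard above rejects deletion when forks exist and no policy is given. A hedged sketch of building the ``delete_repo`` call arguments with an explicit fork policy (the helper name is hypothetical, for illustration only):

```python
def delete_repo_args(repoid, fork_policy=None):
    # fork_policy may be 'detach', 'delete', or None; with None and
    # attached forks, the server raises a JSONRPCError as shown above.
    assert fork_policy in (None, 'detach', 'delete')
    args = {'repoid': repoid}
    if fork_policy:
        args['forks'] = fork_policy
    return args

# Delete the repo but keep its forks as standalone repositories.
args = delete_repo_args('repo-test', fork_policy='detach')
```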
# TODO: marcink, change name ?
@jsonrpc_method()
def invalidate_cache(request, apiuser, repoid, delete_keys=Optional(False)):
    """
    Invalidates the cache for the specified repository.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param delete_keys: This deletes the invalidated keys instead of
        just flagging them.
    :type delete_keys: Optional(``True`` | ``False``)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'msg': Cache for repository `<repository name>` was invalidated,
            'repository': <repository name>
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred during cache invalidation action'
        }

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    delete = Optional.extract(delete_keys)
    try:
        ScmModel().mark_for_invalidation(repo.repo_name, delete=delete)
        return {
            'msg': 'Cache for repository `%s` was invalidated' % (repoid,),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to invalidate repo cache")
        raise JSONRPCError(
            'Error occurred during cache invalidation action'
        )


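A minimal sketch of an ``invalidate_cache`` request body with key deletion enabled, following the same JSON-RPC payload convention as the docstring examples here; the token is a placeholder and the request itself is not sent.

```python
import json

# delete_keys=True flushes the invalidated cache keys instead of just
# flagging them, per the invalidate_cache docstring.
payload = {
    'id': 1,
    'auth_token': '<auth_token>',  # placeholder
    'method': 'invalidate_cache',
    'args': {'repoid': 'repo-test', 'delete_keys': True},
}
body = json.dumps(payload)
```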
# TODO: marcink, change name ?
@jsonrpc_method()
def lock(request, apiuser, repoid, locked=Optional(None),
         userid=Optional(OAttr('apiuser'))):
    """
    Sets the lock state of the specified |repo| by the given user.
    For more information, see :ref:`repo-locking`.

    * If the ``userid`` option is not set, the repository is locked to the
      user who called the method.
    * If the ``locked`` parameter is not set, the current lock state of the
      repository is displayed.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param locked: Sets the lock state.
    :type locked: Optional(``True`` | ``False``)
    :param userid: Set the repository lock to this user.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'repo': '<reponame>',
            'locked': <bool: lock state>,
            'locked_since': <int: lock timestamp>,
            'locked_by': <username of person who made the lock>,
            'lock_reason': <str: reason for locking>,
            'lock_state_changed': <bool: True if lock state has been changed in this request>,
            'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
            or
            'msg': 'Repo `<repository name>` not locked.'
            or
            'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred locking repository `<reponame>`'
        }
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least write permission for this repo !
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    # make sure a normal user does not pass someone else's userid,
    # they are not allowed to do that
    if not isinstance(userid, Optional) and userid != apiuser.user_id:
        raise JSONRPCError('userid is not the same as your user')

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)

    if isinstance(locked, Optional):
        lockobj = repo.locked

        if lockobj[0] is None:
            _d = {
                'repo': repo.repo_name,
                'locked': False,
                'locked_since': None,
                'locked_by': None,
                'lock_reason': None,
                'lock_state_changed': False,
                'msg': 'Repo `%s` not locked.' % repo.repo_name
            }
            return _d
        else:
            _user_id, _time, _reason = lockobj
            # report the actual lock owner, not the requesting user
            lock_user = get_user_or_error(_user_id)
            _d = {
                'repo': repo.repo_name,
                'locked': True,
                'locked_since': _time,
                'locked_by': lock_user.username,
                'lock_reason': _reason,
                'lock_state_changed': False,
                'msg': ('Repo `%s` locked by `%s` on `%s`.'
                        % (repo.repo_name, lock_user.username,
                           json.dumps(time_to_datetime(_time))))
            }
            return _d

    # force locked state through a flag
    else:
        locked = str2bool(locked)
        lock_reason = Repository.LOCK_API
        try:
            if locked:
                lock_time = time.time()
                Repository.lock(repo, user.user_id, lock_time, lock_reason)
            else:
                lock_time = None
                Repository.unlock(repo)
            _d = {
                'repo': repo.repo_name,
                'locked': locked,
                'locked_since': lock_time,
                'locked_by': user.username,
                'lock_reason': lock_reason,
                'lock_state_changed': True,
                'msg': ('User `%s` set lock state for repo `%s` to `%s`'
                        % (user.username, repo.repo_name, locked))
            }
            return _d
        except Exception:
            log.exception(
                "Exception occurred while trying to lock repository")
            raise JSONRPCError(
                'Error occurred locking repository `%s`' % repo.repo_name
            )


@jsonrpc_method()
def comment_commit(
        request, apiuser, repoid, commit_id, message, status=Optional(None),
        comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
        userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
    """
    Set a commit comment, and optionally change the status of the commit.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Specify the commit_id for which to set a comment.
    :type commit_id: str
    :param message: The comment text.
    :type message: str
    :param status: (**Optional**) status of commit, one of: 'not_reviewed',
        'approved', 'rejected', 'under_review'
    :type status: str
    :param comment_type: Comment type, one of: 'note', 'todo'
    :type comment_type: Optional(str), default: 'note'
    :param resolves_comment_id: id of comment which this one will resolve
    :type resolves_comment_id: Optional(int)
    :param extra_recipients: list of user ids or usernames to add
        notifications for this comment. Acts like a CC for notification
    :type extra_recipients: Optional(list)
    :param userid: Set the user name of the comment creator.
    :type userid: Optional(str or int)
    :param send_email: Define if this comment should also send an email notification
    :type send_email: Optional(bool)

    Example output:

    .. code-block:: bash

        {
          "id" : <id_given_in_input>,
          "result" : {
            "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
            "status_change": null or <status>,
            "success": true
          },
          "error" : null
        }

    """
    _ = request.translate

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.read', 'repository.write', 'repository.admin')
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    db_repo_name = repo.repo_name

    try:
        commit = repo.scm_instance().get_commit(commit_id=commit_id)
        commit_id = commit.raw_id
    except Exception as e:
        log.exception('Failed to fetch commit')
        raise JSONRPCError(safe_str(e))

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)
    status = Optional.extract(status)
    comment_type = Optional.extract(comment_type)
    resolves_comment_id = Optional.extract(resolves_comment_id)
    extra_recipients = Optional.extract(extra_recipients)
    send_email = Optional.extract(send_email, binary=True)

    allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
    if status and status not in allowed_statuses:
        raise JSONRPCError('Bad status, must be one '
                           'of %s got %s' % (allowed_statuses, status,))

    if resolves_comment_id:
        comment = ChangesetComment.get(resolves_comment_id)
        if not comment:
            raise JSONRPCError(
                'Invalid resolves_comment_id `%s` for this commit.'
                % resolves_comment_id)
        if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
            raise JSONRPCError(
                'Comment `%s` is wrong type for setting status to resolved.'
                % resolves_comment_id)

    try:
        rc_config = SettingsModel().get_all_settings()
        renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
        status_change_label = ChangesetStatus.get_status_lbl(status)
        comment = CommentsModel().create(
            message, repo, user, commit_id=commit_id,
            status_change=status_change_label,
            status_change_type=status,
            renderer=renderer,
            comment_type=comment_type,
            resolves_comment_id=resolves_comment_id,
            auth_user=apiuser,
            extra_recipients=extra_recipients,
            send_email=send_email
        )
        is_inline = comment.is_inline

        if status:
            # also do a status change
            try:
                ChangesetStatusModel().set_status(
                    repo, status, user, comment, revision=commit_id,
                    dont_allow_on_closed_pull_request=True
                )
            except StatusChangeOnClosedPullRequestError:
                log.exception(
                    "Exception occurred while trying to change repo commit status")
                msg = ('Changing status on a commit associated with '
                       'a closed pull request is not allowed')
                raise JSONRPCError(msg)

        CommentsModel().trigger_commit_comment_hook(
            repo, apiuser, 'create',
            data={'comment': comment, 'commit': commit})

        Session().commit()

        comment_broadcast_channel = channelstream.comment_channel(
            db_repo_name, commit_obj=commit)

        comment_data = {'comment': comment, 'comment_id': comment.comment_id}
        comment_type = 'inline' if is_inline else 'general'
        channelstream.comment_channelstream_push(
            request, comment_broadcast_channel, apiuser,
            _('posted a new {} comment').format(comment_type),
            comment_data=comment_data)

        return {
            'msg': (
                'Commented on commit `%s` for repository `%s`' % (
                    comment.revision, repo.repo_name)),
            'status_change': status,
            'success': True,
        }
    except JSONRPCError:
        # catch any inner errors and re-raise them to prevent the
        # global catch below from silencing them
        raise
    except Exception:
        log.exception("Exception occurred while trying to comment on commit")
        raise JSONRPCError(
            'failed to set comment on repository `%s`' % (repo.repo_name,)
        )


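As a usage sketch, the JSON-RPC envelope for a `comment_commit` call can be assembled as below. The token and repository values are placeholders, and the `id`/`auth_token`/`method`/`args` envelope shape is assumed from this API's convention; optional arguments are simply left out so the server-side `Optional(...)` defaults apply.

```python
import json

def build_comment_commit_payload(request_id, auth_token, repoid, commit_id,
                                 message, status=None):
    # Optional arguments are omitted entirely so the server-side
    # Optional(...) defaults apply.
    args = {'repoid': repoid, 'commit_id': commit_id, 'message': message}
    if status is not None:
        args['status'] = status
    return json.dumps({
        'id': request_id,
        'auth_token': auth_token,
        'method': 'comment_commit',
        'args': args,
    })

payload = build_comment_commit_payload(
    1, 'SECRET-TOKEN', 'my-repo', 'abcdef0', 'Looks good', status='approved')
```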
@jsonrpc_method()
def get_repo_comments(request, apiuser, repoid,
                      commit_id=Optional(None), comment_type=Optional(None),
                      userid=Optional(None)):
    """
    Get all comments for a repository

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Optionally filter the comments by the commit_id
    :type commit_id: Optional(str), default: None
    :param comment_type: Optionally filter the comments by the comment_type
        one of: 'note', 'todo'
    :type comment_type: Optional(str), default: None
    :param userid: Optionally filter the comments by the author of comment
    :type userid: Optional(str or int), Default: None

    Example output:

    .. code-block:: bash

        {
          "id" : <id_given_in_input>,
          "result" : [
            {
              "comment_author": <USER_DETAILS>,
              "comment_created_on": "2017-02-01T14:38:16.309",
              "comment_f_path": "file.txt",
              "comment_id": 282,
              "comment_lineno": "n1",
              "comment_resolved_by": null,
              "comment_status": [],
              "comment_text": "This file needs a header",
              "comment_type": "todo",
              "comment_last_version": 0
            }
          ],
          "error" : null
        }

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.read', 'repository.write', 'repository.admin')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    commit_id = Optional.extract(commit_id)

    userid = Optional.extract(userid)
    if userid:
        user = get_user_or_error(userid)
    else:
        user = None

    comment_type = Optional.extract(comment_type)
    if comment_type and comment_type not in ChangesetComment.COMMENT_TYPES:
        raise JSONRPCError(
            'comment_type must be one of `{}` got {}'.format(
                ChangesetComment.COMMENT_TYPES, comment_type)
        )

    comments = CommentsModel().get_repository_comments(
        repo=repo, comment_type=comment_type, user=user, commit_id=commit_id)
    return comments


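A client can pre-validate the `get_repo_comments` filters the same way the server does before sending the request; a minimal sketch, where the `COMMENT_TYPES` tuple and the helper name are assumptions standing in for `ChangesetComment.COMMENT_TYPES`:

```python
COMMENT_TYPES = ('note', 'todo')  # assumed to mirror ChangesetComment.COMMENT_TYPES

def build_comment_filters(commit_id=None, comment_type=None, userid=None):
    """Assemble the optional filter args for a get_repo_comments call,
    rejecting invalid comment types the same way the server does."""
    if comment_type is not None and comment_type not in COMMENT_TYPES:
        raise ValueError('comment_type must be one of {} got {}'.format(
            COMMENT_TYPES, comment_type))
    filters = {}
    # omit unset filters so the server-side Optional(None) defaults apply
    if commit_id is not None:
        filters['commit_id'] = commit_id
    if comment_type is not None:
        filters['comment_type'] = comment_type
    if userid is not None:
        filters['userid'] = userid
    return filters

filters = build_comment_filters(commit_id='abcdef0', comment_type='todo')
```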
@jsonrpc_method()
def get_comment(request, apiuser, comment_id):
    """
    Get a single comment from a repository or pull request

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param comment_id: comment id found in the URL of comment
    :type comment_id: str or int

    Example output:

    .. code-block:: bash

        {
          "id" : <id_given_in_input>,
          "result" : {
            "comment_author": <USER_DETAILS>,
            "comment_created_on": "2017-02-01T14:38:16.309",
            "comment_f_path": "file.txt",
            "comment_id": 282,
            "comment_lineno": "n1",
            "comment_resolved_by": null,
            "comment_status": [],
            "comment_text": "This file needs a header",
            "comment_type": "todo",
            "comment_last_version": 0
          },
          "error" : null
        }

    """

    comment = ChangesetComment.get(comment_id)
    if not comment:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    perms = ('repository.read', 'repository.write', 'repository.admin')
    has_comment_perm = HasRepoPermissionAnyApi(*perms)\
        (user=apiuser, repo_name=comment.repo.repo_name)

    if not has_comment_perm:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    return comment


@jsonrpc_method()
def edit_comment(request, apiuser, message, comment_id, version,
                 userid=Optional(OAttr('apiuser'))):
    """
    Edit a comment on a pull request or commit, specified by `comment_id`
    and `version`. Initially, the version should be 0.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param comment_id: Specify the comment_id for editing
    :type comment_id: int
    :param version: version of the comment that will be created, starts from 0
    :type version: int
    :param message: The text content of the comment.
    :type message: str
    :param userid: Comment on the pull request as this user
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            "comment": "<comment data>",
            "version": "<Integer>",
        },
        error : null
    """

    auth_user = apiuser
    comment = ChangesetComment.get(comment_id)
    if not comment:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    is_super_admin = has_superadmin_permission(apiuser)
    is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
        (user=apiuser, repo_name=comment.repo.repo_name)

    if not isinstance(userid, Optional):
        if is_super_admin or is_repo_admin:
            apiuser = get_user_or_error(userid)
            auth_user = apiuser.AuthUser()
        else:
            raise JSONRPCError('userid is not the same as your user')

    comment_author = comment.author.user_id == auth_user.user_id
    if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
        raise JSONRPCError("you don't have access to edit this comment")

    try:
        comment_history = CommentsModel().edit(
            comment_id=comment_id,
            text=message,
            auth_user=auth_user,
            version=version,
        )
        Session().commit()
    except CommentVersionMismatch:
        raise JSONRPCError(
            'comment ({}) version ({}) mismatch'.format(comment_id, version)
        )
    if not comment_history and not message:
        raise JSONRPCError(
            "comment ({}) can't be changed to an empty string".format(comment_id)
        )

    if comment.pull_request:
        pull_request = comment.pull_request
        PullRequestModel().trigger_pull_request_hook(
            pull_request, apiuser, 'comment_edit',
            data={'comment': comment})
    else:
        db_repo = comment.repo
        commit_id = comment.revision
        commit = db_repo.get_commit(commit_id)
        CommentsModel().trigger_commit_comment_hook(
            db_repo, apiuser, 'edit',
            data={'comment': comment, 'commit': commit})

    data = {
        'comment': comment,
        'version': comment_history.version if comment_history else None,
    }
    return data


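The edit-permission condition in `edit_comment` is easy to misread because of Python's `and`/`or` precedence; a sketch of the same rule as a standalone predicate, with plain booleans standing in for the model objects:

```python
def can_edit_comment(is_super_admin, is_repo_admin, is_author, immutable):
    # Admins (super or repo) may edit any comment that is not immutable;
    # the author may always edit their own comment. This mirrors the
    # `comment.immutable is False and (...) or comment_author` check,
    # where `and` binds tighter than `or`.
    return (not immutable and (is_super_admin or is_repo_admin)) or is_author
```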
# TODO(marcink): write this with all required logic for deleting comments in PRs or commits
# @jsonrpc_method()
# def delete_comment(request, apiuser, comment_id):
#     auth_user = apiuser
#
#     comment = ChangesetComment.get(comment_id)
#     if not comment:
#         raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
#
#     is_super_admin = has_superadmin_permission(apiuser)
#     is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
#         (user=apiuser, repo_name=comment.repo.repo_name)
#
#     comment_author = comment.author.user_id == auth_user.user_id
#     if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
#         raise JSONRPCError("you don't have access to edit this comment")

@jsonrpc_method()
def grant_user_permission(request, apiuser, repoid, userid, perm):
    """
    Grant permissions for the specified user on the given repository,
    or update existing permissions if found.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param userid: Set the user name.
    :type userid: str
    :param perm: Set the user permissions, using the following format
        ``(repository.(none|read|write|admin))``
    :type perm: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
            "success": true
        }
        error: null
    """
1956
1956
1957 repo = get_repo_or_error(repoid)
1957 repo = get_repo_or_error(repoid)
1958 user = get_user_or_error(userid)
1958 user = get_user_or_error(userid)
1959 perm = get_perm_or_error(perm)
1959 perm = get_perm_or_error(perm)
1960 if not has_superadmin_permission(apiuser):
1960 if not has_superadmin_permission(apiuser):
1961 _perms = ('repository.admin',)
1961 _perms = ('repository.admin',)
1962 validate_repo_permissions(apiuser, repoid, repo, _perms)
1962 validate_repo_permissions(apiuser, repoid, repo, _perms)
1963
1963
1964 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1964 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1965 try:
1965 try:
1966 changes = RepoModel().update_permissions(
1966 changes = RepoModel().update_permissions(
1967 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1967 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1968
1968
1969 action_data = {
1969 action_data = {
1970 'added': changes['added'],
1970 'added': changes['added'],
1971 'updated': changes['updated'],
1971 'updated': changes['updated'],
1972 'deleted': changes['deleted'],
1972 'deleted': changes['deleted'],
1973 }
1973 }
1974 audit_logger.store_api(
1974 audit_logger.store_api(
1975 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1975 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1976 Session().commit()
1976 Session().commit()
1977 PermissionModel().flush_user_permission_caches(changes)
1977 PermissionModel().flush_user_permission_caches(changes)
1978
1978
1979 return {
1979 return {
1980 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1980 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1981 perm.permission_name, user.username, repo.repo_name
1981 perm.permission_name, user.username, repo.repo_name
1982 ),
1982 ),
1983 'success': True
1983 'success': True
1984 }
1984 }
1985 except Exception:
1985 except Exception:
1986 log.exception("Exception occurred while trying edit permissions for repo")
1986 log.exception("Exception occurred while trying edit permissions for repo")
1987 raise JSONRPCError(
1987 raise JSONRPCError(
1988 'failed to edit permission for user: `%s` in repo: `%s`' % (
1988 'failed to edit permission for user: `%s` in repo: `%s`' % (
1989 userid, repoid
1989 userid, repoid
1990 )
1990 )
1991 )
1991 )
1992
1992
1993
1993
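A client reaches methods like ``grant_user_permission`` through RhodeCode's JSON-RPC endpoint, posting a JSON body that carries the auth token, the method name, and the keyword arguments documented above. The sketch below only builds that request body; the token value and repository/user names are placeholders, not values from this codebase.

```python
import json


def build_jsonrpc_call(method, auth_token, call_id, **args):
    # RhodeCode's API expects a JSON object with id, auth_token,
    # method and an args mapping; serialize it for the POST body.
    return json.dumps({
        'id': call_id,
        'auth_token': auth_token,
        'method': method,
        'args': args,
    })


# Placeholder token and names, matching grant_user_permission's signature.
payload = build_jsonrpc_call(
    'grant_user_permission', auth_token='SECRET_TOKEN', call_id=1,
    repoid='my-repo', userid='jdoe', perm='repository.write')
```

The resulting string would be posted to the instance's ``_admin/api`` endpoint; the response mirrors the ``id`` and fills either ``result`` or ``error``.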
@jsonrpc_method()
def revoke_user_permission(request, apiuser, repoid, userid):
    """
    Revoke permission for a user on the specified repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param userid: Set the user name of the revoked user.
    :type userid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
                  "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
                  "success": true
                }
        error:  null
    """

    repo = get_repo_or_error(repoid)
    user = get_user_or_error(userid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    perm_deletions = [[user.user_id, None, "user"]]
    try:
        changes = RepoModel().update_permissions(
            repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)

        action_data = {
            'added': changes['added'],
            'updated': changes['updated'],
            'deleted': changes['deleted'],
        }
        audit_logger.store_api(
            'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
        Session().commit()
        PermissionModel().flush_user_permission_caches(changes)

        return {
            'msg': 'Revoked perm for user: `%s` in repo: `%s`' % (
                user.username, repo.repo_name
            ),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to revoke permissions on repo")
        raise JSONRPCError(
            'failed to edit permission for user: `%s` in repo: `%s`' % (
                userid, repoid
            )
        )


@jsonrpc_method()
def grant_user_group_permission(request, apiuser, repoid, usergroupid, perm):
    """
    Grant permission for a user group on the specified repository,
    or update existing permissions.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param usergroupid: Specify the ID of the user group.
    :type usergroupid: str or int
    :param perm: Set the user group permissions using the following
        format: (repository.(none|read|write|admin))
    :type perm: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
                   "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
                   "success": true
                 }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            "failed to edit permission for user group: `<usergroup>` in repo `<repo>`"
        }

    """

    repo = get_repo_or_error(repoid)
    perm = get_perm_or_error(perm)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    user_group = get_user_group_or_error(usergroupid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for this user group !
        _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
        if not HasUserGroupPermissionAnyApi(*_perms)(
                user=apiuser, user_group_name=user_group.users_group_name):
            raise JSONRPCError(
                'user group `%s` does not exist' % (usergroupid,))

    perm_additions = [[user_group.users_group_id, perm.permission_name, "user_group"]]
    try:
        changes = RepoModel().update_permissions(
            repo=repo, perm_additions=perm_additions, cur_user=apiuser)
        action_data = {
            'added': changes['added'],
            'updated': changes['updated'],
            'deleted': changes['deleted'],
        }
        audit_logger.store_api(
            'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
        Session().commit()
        PermissionModel().flush_user_permission_caches(changes)

        return {
            'msg': 'Granted perm: `%s` for user group: `%s` in '
                   'repo: `%s`' % (
                       perm.permission_name, user_group.users_group_name,
                       repo.repo_name
                   ),
            'success': True
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to change permission on repo")
        raise JSONRPCError(
            'failed to edit permission for user group: `%s` in '
            'repo: `%s`' % (
                usergroupid, repo.repo_name
            )
        )


@jsonrpc_method()
def revoke_user_group_permission(request, apiuser, repoid, usergroupid):
    """
    Revoke the permissions of a user group on a given repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param usergroupid: Specify the user group ID.
    :type usergroupid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
                  "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
                  "success": true
                }
        error:  null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    user_group = get_user_group_or_error(usergroupid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for this user group !
        _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
        if not HasUserGroupPermissionAnyApi(*_perms)(
                user=apiuser, user_group_name=user_group.users_group_name):
            raise JSONRPCError(
                'user group `%s` does not exist' % (usergroupid,))

    perm_deletions = [[user_group.users_group_id, None, "user_group"]]
    try:
        changes = RepoModel().update_permissions(
            repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
        action_data = {
            'added': changes['added'],
            'updated': changes['updated'],
            'deleted': changes['deleted'],
        }
        audit_logger.store_api(
            'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
        Session().commit()
        PermissionModel().flush_user_permission_caches(changes)

        return {
            'msg': 'Revoked perm for user group: `%s` in repo: `%s`' % (
                user_group.users_group_name, repo.repo_name
            ),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to revoke "
                      "user group permission on repo")
        raise JSONRPCError(
            'failed to edit permission for user group: `%s` in '
            'repo: `%s`' % (
                user_group.users_group_name, repo.repo_name
            )
        )


@jsonrpc_method()
def pull(request, apiuser, repoid, remote_uri=Optional(None)):
    """
    Triggers a pull on the given repository from a remote location. You
    can use this to keep remote repositories up-to-date.

    This command can only be run using an |authtoken| with admin
    rights to the specified repository. For more information,
    see :ref:`config-token-ref`.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param remote_uri: Optional remote URI to pass in for pull
    :type remote_uri: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            "msg": "Pulled from url `<remote_url>` on repo `<repository name>`",
            "repository": "<repository name>"
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            "Unable to pull changes from `<remote_url>`"
        }

    """

    repo = get_repo_or_error(repoid)
    remote_uri = Optional.extract(remote_uri)
    remote_uri_display = remote_uri or repo.clone_uri_hidden
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        ScmModel().pull_changes(
            repo.repo_name, apiuser.username, remote_uri=remote_uri)
        return {
            'msg': 'Pulled from url `%s` on repo `%s`' % (
                remote_uri_display, repo.repo_name),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception("Exception occurred while trying to "
                      "pull changes from remote location")
        raise JSONRPCError(
            'Unable to pull changes from `%s`' % remote_uri_display
        )


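For ``Optional`` parameters such as ``remote_uri``, a client can simply leave the key out of ``args`` and the server falls back to the default (here, the repository's stored clone URI). A minimal, assumption-laden sketch of building such a call — the token and repository name are placeholders:

```python
import json


def jsonrpc_payload(method, auth_token, call_id, **args):
    # Drop args left at None so the server applies its Optional defaults;
    # this mirrors how optional JSON-RPC parameters are typically omitted.
    return {
        'id': call_id,
        'auth_token': auth_token,
        'method': method,
        'args': {k: v for k, v in args.items() if v is not None},
    }


# remote_uri=None is dropped, so the stored clone URI is used server-side.
body = json.dumps(
    jsonrpc_payload('pull', 'SECRET_TOKEN', 42, repoid='my-repo', remote_uri=None))
```

Passing an explicit ``remote_uri`` string instead would keep the key in ``args`` and override the stored URI for that one pull.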
@jsonrpc_method()
def strip(request, apiuser, repoid, revision, branch):
    """
    Strips the given revision from the specified repository.

    * This will remove the revision and all of its descendants.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param revision: The revision you wish to strip.
    :type revision: str
    :param branch: The branch from which to strip the revision.
    :type branch: str

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            "msg": "Stripped commit <commit_hash> from repo `<repository name>`",
            "repository": "<repository name>"
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            "Unable to strip commit <commit_hash> from repo `<repository name>`"
        }

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        ScmModel().strip(repo, revision, branch)
        audit_logger.store_api(
            'repo.commit.strip', action_data={'commit_id': revision},
            repo=repo,
            user=apiuser, commit=True)

        return {
            'msg': 'Stripped commit %s from repo `%s`' % (
                revision, repo.repo_name),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception("Exception while trying to strip")
        raise JSONRPCError(
            'Unable to strip commit %s from repo `%s`' % (
                revision, repo.repo_name)
        )


@jsonrpc_method()
def get_repo_settings(request, apiuser, repoid, key=Optional(None)):
    """
    Returns all settings for a repository. If key is given it only returns the
    setting identified by the key or null.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param key: Key of the setting to return.
    :type key: Optional(str)

    Example output:

    .. code-block:: bash

        {
            "error": null,
            "id": 237,
            "result": {
                "extensions_largefiles": true,
                "extensions_evolve": true,
                "hooks_changegroup_push_logger": true,
                "hooks_changegroup_repo_size": false,
                "hooks_outgoing_pull_logger": true,
                "phases_publish": "True",
                "rhodecode_hg_use_rebase_for_merging": true,
                "rhodecode_pr_merge_enabled": true,
                "rhodecode_use_outdated_comments": true
            }
        }
    """

    # Restrict access to this api method to super-admins, and repo admins only.
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        settings_model = VcsSettingsModel(repo=repo)
        settings = settings_model.get_global_settings()
        settings.update(settings_model.get_repo_settings())

        # If only a single setting is requested fetch it from all settings.
        key = Optional.extract(key)
        if key is not None:
            settings = settings.get(key, None)
    except Exception:
        msg = 'Failed to fetch settings for repository `{}`'.format(repoid)
        log.exception(msg)
        raise JSONRPCError(msg)

    return settings


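Both ``get_repo_settings`` and ``set_repo_settings`` rely on the same layering: global defaults first, then per-repo overrides, then (for writes) the incoming payload, with later layers winning on key collisions. A small self-contained sketch of that merge order, using made-up setting values rather than a live ``VcsSettingsModel``:

```python
# Merge order mirrors the handlers above: global defaults, then the
# repository's own settings, then the caller-supplied payload.
global_settings = {'phases_publish': 'True', 'extensions_largefiles': True}
repo_settings = {'extensions_largefiles': False}
incoming = {'rhodecode_pr_merge_enabled': True}

new_settings = dict(global_settings)   # start from global defaults
new_settings.update(repo_settings)     # per-repo values override globals
new_settings.update(incoming)          # caller payload wins last
```

Because the payload is merged over the full effective settings, a partial dict from the caller updates only the keys it names and leaves the rest intact.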
@jsonrpc_method()
def set_repo_settings(request, apiuser, repoid, settings):
    """
    Update repository settings. Returns true on success.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id.
    :type repoid: str or int
    :param settings: The new settings for the repository.
    :type settings: dict

    Example output:

    .. code-block:: bash

        {
            "error": null,
            "id": 237,
            "result": true
        }
    """
    # Restrict access to this api method to super-admins, and repo admins only.
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    if not isinstance(settings, dict):
        raise JSONRPCError('Settings have to be a JSON Object.')

    try:
        settings_model = VcsSettingsModel(repo=repoid)

        # Merge global, repo and incoming settings.
        new_settings = settings_model.get_global_settings()
        new_settings.update(settings_model.get_repo_settings())
        new_settings.update(settings)

        # Update the settings.
        inherit_global_settings = new_settings.get(
            'inherit_global_settings', False)
        settings_model.create_or_update_repo_settings(
            new_settings, inherit_global_settings=inherit_global_settings)
        Session().commit()
    except Exception:
        msg = 'Failed to update settings for repository `{}`'.format(repoid)
        log.exception(msg)
        raise JSONRPCError(msg)

    # Indicate success.
    return True


2465 @jsonrpc_method()
2465 @jsonrpc_method()
2466 def maintenance(request, apiuser, repoid):
2466 def maintenance(request, apiuser, repoid):
2467 """
2467 """
2468 Triggers a maintenance on the given repository.
2468 Triggers a maintenance on the given repository.
2469
2469
2470 This command can only be run using an |authtoken| with admin
2470 This command can only be run using an |authtoken| with admin
2471 rights to the specified repository. For more information,
2471 rights to the specified repository. For more information,
2472 see :ref:`config-token-ref`.
2472 see :ref:`config-token-ref`.
2473
2473
2474 This command takes the following options:
2474 This command takes the following options:
2475
2475
2476 :param apiuser: This is filled automatically from the |authtoken|.
2476 :param apiuser: This is filled automatically from the |authtoken|.
2477 :type apiuser: AuthUser
2477 :type apiuser: AuthUser
2478 :param repoid: The repository name or repository ID.
2478 :param repoid: The repository name or repository ID.
2479 :type repoid: str or int
2479 :type repoid: str or int
2480
2480
2481 Example output:
2481 Example output:
2482
2482
2483 .. code-block:: bash
2483 .. code-block:: bash
2484
2484
2485 id : <id_given_in_input>
2485 id : <id_given_in_input>
2486 result : {
2486 result : {
2487 "msg": "executed maintenance command",
2487 "msg": "executed maintenance command",
2488 "executed_actions": [
2488 "executed_actions": [
2489 <action_message>, <action_message2>...
2489 <action_message>, <action_message2>...
2490 ],
2490 ],
2491 "repository": "<repository name>"
2491 "repository": "<repository name>"
2492 }
2492 }
2493 error : null
2493 error : null
2494
2494
2495 Example error output:
2495 Example error output:
2496
2496
2497 .. code-block:: bash
2497 .. code-block:: bash
2498
2498
2499 id : <id_given_in_input>
2499 id : <id_given_in_input>
2500 result : null
2500 result : null
2501 error : {
2501 error : {
2502 "Unable to execute maintenance on `<reponame>`"
2502 "Unable to execute maintenance on `<reponame>`"
2503 }
2503 }
2504
2504
2505 """
2505 """
2506
2506
2507 repo = get_repo_or_error(repoid)
2507 repo = get_repo_or_error(repoid)
2508 if not has_superadmin_permission(apiuser):
2508 if not has_superadmin_permission(apiuser):
2509 _perms = ('repository.admin',)
2509 _perms = ('repository.admin',)
2510 validate_repo_permissions(apiuser, repoid, repo, _perms)
2510 validate_repo_permissions(apiuser, repoid, repo, _perms)
2511
2511
2512 try:
2512 try:
2513 maintenance = repo_maintenance.RepoMaintenance()
2513 maintenance = repo_maintenance.RepoMaintenance()
2514 executed_actions = maintenance.execute(repo)
2514 executed_actions = maintenance.execute(repo)
2515
2515
2516 return {
2516 return {
2517 'msg': 'executed maintenance command',
2517 'msg': 'executed maintenance command',
2518 'executed_actions': executed_actions,
2518 'executed_actions': executed_actions,
2519 'repository': repo.repo_name
2519 'repository': repo.repo_name
2520 }
2520 }
2521 except Exception:
2521 except Exception:
2522 log.exception("Exception occurred while trying to run maintenance")
2522 log.exception("Exception occurred while trying to run maintenance")
2523 raise JSONRPCError(
2523 raise JSONRPCError(
2524 'Unable to execute maintenance on `%s`' % repo.repo_name)
2524 'Unable to execute maintenance on `%s`' % repo.repo_name)
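The `maintenance` method above is called through RhodeCode's JSON-RPC API, which wraps the method name and arguments in a standard envelope (`id`, `auth_token`, `method`, `args`). A minimal sketch of building such a request body; the token, request id, and repository name below are placeholder values, and the endpoint URL depends on your installation:

```python
import json

def build_rpc_payload(auth_token, method, request_id=1, **args):
    # Standard RhodeCode JSON-RPC envelope: id, auth_token, method, args.
    return {
        "id": request_id,
        "auth_token": auth_token,
        "method": method,
        "args": args,
    }

# POST this JSON body to the API endpoint of your server (placeholder values).
payload = build_rpc_payload("<auth-token>", "maintenance",
                            request_id=7, repoid="my-repo")
print(json.dumps(payload, sort_keys=True))
```

The response then carries the same `id`, with either `result` or `error` populated, as shown in the docstring examples.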
@@ -1,269 +1,270 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import os
import time
import errno
import hashlib

from rhodecode.lib.ext_json import json
from rhodecode.apps.file_store import utils
from rhodecode.apps.file_store.extensions import resolve_extensions
from rhodecode.apps.file_store.exceptions import (
    FileNotAllowedException, FileOverSizeException)

METADATA_VER = 'v1'


def safe_make_dirs(dir_path):
    if not os.path.exists(dir_path):
        try:
            os.makedirs(dir_path)
        except OSError as e:
            if e.errno != errno.EEXIST:
                raise
        return


class LocalFileStorage(object):

    @classmethod
    def apply_counter(cls, counter, filename):
        name_counted = '%d-%s' % (counter, filename)
        return name_counted

    @classmethod
    def resolve_name(cls, name, directory):
        """
        Resolves a unique name and the correct path. If a file
        already exists at that path, a numeric prefix with values > 0 is
        added, e.g. test.jpg -> 1-test.jpg. Initially the file gets a 0 prefix.

        :param name: base name of file
        :param directory: absolute directory path
        """

        counter = 0
        while True:
            name_counted = cls.apply_counter(counter, name)

            # sub_store prefix to optimize disk usage, e.g. some_path/ab/final_file
            sub_store = cls._sub_store_from_filename(name_counted)
            sub_store_path = os.path.join(directory, sub_store)
            safe_make_dirs(sub_store_path)

            path = os.path.join(sub_store_path, name_counted)
            if not os.path.exists(path):
                return name_counted, path
            counter += 1

    @classmethod
    def _sub_store_from_filename(cls, filename):
        return filename[:2]

    @classmethod
    def calculate_path_hash(cls, file_path):
        """
        Efficient calculation of the sha256 sum of file_path

        :param file_path:
        :return: sha256sum
        """
        digest = hashlib.sha256()
        with open(file_path, 'rb') as f:
            for chunk in iter(lambda: f.read(1024 * 100), b""):
                digest.update(chunk)

        return digest.hexdigest()

    def __init__(self, base_path, extension_groups=None):

        """
        Local file storage

        :param base_path: the absolute base path where uploads are stored
        :param extension_groups: extensions string
        """

        extension_groups = extension_groups or ['any']
        self.base_path = base_path
        self.extensions = resolve_extensions([], groups=extension_groups)

    def __repr__(self):
        return '{}@{}'.format(self.__class__, self.base_path)

    def store_path(self, filename):
        """
        Returns the absolute file path of the filename, joined to the
        base_path.

        :param filename: base name of file
        """
        prefix_dir = ''
        if '/' in filename:
            prefix_dir, filename = filename.split('/')
            sub_store = self._sub_store_from_filename(filename)
        else:
            sub_store = self._sub_store_from_filename(filename)
        return os.path.join(self.base_path, prefix_dir, sub_store, filename)

    def delete(self, filename):
        """
        Deletes the filename. Filename is resolved to an
        absolute path based on base_path. If the file does not exist,
        returns **False**, otherwise **True**.

        :param filename: base name of file
        """
        if self.exists(filename):
            os.remove(self.store_path(filename))
            return True
        return False

    def exists(self, filename):
        """
        Checks if a file exists. Resolves the filename's absolute
        path based on base_path.

        :param filename: file_uid name of file, e.g. 0-f62b2b2d-9708-4079-a071-ec3f958448d4.svg
        """
        return os.path.exists(self.store_path(filename))

    def filename_allowed(self, filename, extensions=None):
        """Checks if a filename has an allowed extension

        :param filename: base name of file
        :param extensions: iterable of extensions (or self.extensions)
        """
        _, ext = os.path.splitext(filename)
        return self.extension_allowed(ext, extensions)

    def extension_allowed(self, ext, extensions=None):
        """
        Checks if an extension is permitted. Both e.g. ".jpg" and
        "jpg" can be passed in. Extension lookup is case-insensitive.

        :param ext: extension to check
        :param extensions: iterable of extensions to validate against (or self.extensions)
        """
        def normalize_ext(_ext):
            if _ext.startswith('.'):
                _ext = _ext[1:]
            return _ext.lower()

        extensions = extensions or self.extensions
        if not extensions:
            return True

        ext = normalize_ext(ext)

        return ext in [normalize_ext(x) for x in extensions]

    def save_file(self, file_obj, filename, directory=None, extensions=None,
                  extra_metadata=None, max_filesize=None, randomized_name=True, **kwargs):
        """
        Saves a file object to the uploads location.
        Returns the resolved filename, i.e. the directory +
        the (randomized/incremented) base name.

        :param file_obj: **cgi.FieldStorage** object (or similar)
        :param filename: original filename
        :param directory: relative path of sub-directory
        :param extensions: iterable of allowed extensions, if not default
        :param max_filesize: maximum size of file that should be allowed
        :param randomized_name: generate a random UID, or a fixed one based on the filename
        :param extra_metadata: extra JSON metadata to store next to the file with .meta suffix

        """

        extensions = extensions or self.extensions

        if not self.filename_allowed(filename, extensions):
            raise FileNotAllowedException()

        if directory:
            dest_directory = os.path.join(self.base_path, directory)
        else:
            dest_directory = self.base_path

        safe_make_dirs(dest_directory)

        uid_filename = utils.uid_filename(filename, randomized=randomized_name)

        # resolve also produces a special sub-dir for the optimized file store
        filename, path = self.resolve_name(uid_filename, dest_directory)
        stored_file_dir = os.path.dirname(path)

        no_body_seek = kwargs.pop('no_body_seek', False)
        if not no_body_seek:
            file_obj.seek(0)

        with open(path, "wb") as dest:
            length = 256 * 1024
            while 1:
                buf = file_obj.read(length)
                if not buf:
                    break
                dest.write(buf)

        metadata = {}
        if extra_metadata:
            metadata = extra_metadata

        size = os.stat(path).st_size

        if max_filesize and size > max_filesize:
            # free up the copied file, and raise exc
            os.remove(path)
            raise FileOverSizeException()

        file_hash = self.calculate_path_hash(path)

        metadata.update({
            "filename": filename,
            "size": size,
            "time": time.time(),
            "sha256": file_hash,
            "meta_ver": METADATA_VER
        })

        filename_meta = filename + '.meta'
        with open(os.path.join(stored_file_dir, filename_meta), "wb") as dest_meta:
            dest_meta.write(json.dumps(metadata))

        if directory:
            filename = os.path.join(directory, filename)

        return filename, metadata

-    def get_metadata(self, filename):
+    def get_metadata(self, filename, ignore_missing=False):
        """
        Reads JSON stored metadata for a file

        :param filename:
        :return:
        """
        filename = self.store_path(filename)
        filename_meta = filename + '.meta'
-
+        if ignore_missing and not os.path.isfile(filename_meta):
+            return {}
        with open(filename_meta, "rb") as source_meta:
            return json.loads(source_meta.read())
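The new `ignore_missing` flag changes `get_metadata` so that a missing `.meta` sidecar yields an empty dict instead of an error from `open()`. A standalone sketch of that read pattern, outside the `LocalFileStorage` class (the paths and helper name here are illustrative, not RhodeCode API):

```python
import json
import os
import tempfile

def read_metadata(meta_path, ignore_missing=False):
    # Mirrors the updated get_metadata: a missing .meta sidecar returns {}
    # when ignore_missing is set, instead of raising on open().
    if ignore_missing and not os.path.isfile(meta_path):
        return {}
    with open(meta_path, "rb") as source_meta:
        return json.loads(source_meta.read())

# A missing sidecar is tolerated only when explicitly requested.
assert read_metadata("/no/such/file.meta", ignore_missing=True) == {}

# A present sidecar is parsed and returned either way.
with tempfile.TemporaryDirectory() as tmp:
    meta_path = os.path.join(tmp, "0-abc.svg.meta")
    with open(meta_path, "wb") as f:
        f.write(json.dumps({"meta_ver": "v1", "size": 10}).encode())
    metadata = read_metadata(meta_path)
    print(metadata["size"])  # -> 10
```

Callers that list artifacts whose metadata may not have been written yet can then degrade gracefully rather than fail the whole request.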
@@ -1,1227 +1,1227 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
from rhodecode.apps._base import add_route_with_slash


def includeme(config):
    from rhodecode.apps.repository.views.repo_artifacts import RepoArtifactsView
    from rhodecode.apps.repository.views.repo_audit_logs import AuditLogsView
    from rhodecode.apps.repository.views.repo_automation import RepoAutomationView
    from rhodecode.apps.repository.views.repo_bookmarks import RepoBookmarksView
    from rhodecode.apps.repository.views.repo_branch_permissions import RepoSettingsBranchPermissionsView
    from rhodecode.apps.repository.views.repo_branches import RepoBranchesView
    from rhodecode.apps.repository.views.repo_caches import RepoCachesView
    from rhodecode.apps.repository.views.repo_changelog import RepoChangelogView
    from rhodecode.apps.repository.views.repo_checks import RepoChecksView
    from rhodecode.apps.repository.views.repo_commits import RepoCommitsView
    from rhodecode.apps.repository.views.repo_compare import RepoCompareView
    from rhodecode.apps.repository.views.repo_feed import RepoFeedView
    from rhodecode.apps.repository.views.repo_files import RepoFilesView
    from rhodecode.apps.repository.views.repo_forks import RepoForksView
    from rhodecode.apps.repository.views.repo_maintainance import RepoMaintenanceView
    from rhodecode.apps.repository.views.repo_permissions import RepoSettingsPermissionsView
    from rhodecode.apps.repository.views.repo_pull_requests import RepoPullRequestsView
    from rhodecode.apps.repository.views.repo_review_rules import RepoReviewRulesView
    from rhodecode.apps.repository.views.repo_settings import RepoSettingsView
    from rhodecode.apps.repository.views.repo_settings_advanced import RepoSettingsAdvancedView
    from rhodecode.apps.repository.views.repo_settings_fields import RepoSettingsFieldsView
    from rhodecode.apps.repository.views.repo_settings_issue_trackers import RepoSettingsIssueTrackersView
    from rhodecode.apps.repository.views.repo_settings_remote import RepoSettingsRemoteView
    from rhodecode.apps.repository.views.repo_settings_vcs import RepoSettingsVcsView
    from rhodecode.apps.repository.views.repo_strip import RepoStripView
    from rhodecode.apps.repository.views.repo_summary import RepoSummaryView
    from rhodecode.apps.repository.views.repo_tags import RepoTagsView

    # repo creating checks, special cases that aren't repo routes
    config.add_route(
        name='repo_creating',
        pattern='/{repo_name:.*?[^/]}/repo_creating')
    config.add_view(
        RepoChecksView,
        attr='repo_creating',
        route_name='repo_creating', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_creating.mako')

    config.add_route(
        name='repo_creating_check',
        pattern='/{repo_name:.*?[^/]}/repo_creating_check')
    config.add_view(
        RepoChecksView,
        attr='repo_creating_check',
        route_name='repo_creating_check', request_method='GET',
        renderer='json_ext')

    # Summary
    # NOTE(marcink): one additional route is defined in very bottom, catch
    # all pattern
    config.add_route(
        name='repo_summary_explicit',
        pattern='/{repo_name:.*?[^/]}/summary', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='summary',
        route_name='repo_summary_explicit', request_method='GET',
        renderer='rhodecode:templates/summary/summary.mako')

    config.add_route(
        name='repo_summary_commits',
        pattern='/{repo_name:.*?[^/]}/summary-commits', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='summary_commits',
        route_name='repo_summary_commits', request_method='GET',
        renderer='rhodecode:templates/summary/summary_commits.mako')

    # Commits
    config.add_route(
        name='repo_commit',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_show',
        route_name='repo_commit', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_commit_children',
        pattern='/{repo_name:.*?[^/]}/changeset_children/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_children',
        route_name='repo_commit_children', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_commit_parents',
        pattern='/{repo_name:.*?[^/]}/changeset_parents/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_parents',
        route_name='repo_commit_parents', request_method='GET',
        renderer='json_ext')

    config.add_route(
        name='repo_commit_raw',
        pattern='/{repo_name:.*?[^/]}/changeset-diff/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_raw',
        route_name='repo_commit_raw', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_commit_patch',
        pattern='/{repo_name:.*?[^/]}/changeset-patch/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_patch',
        route_name='repo_commit_patch', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_commit_download',
        pattern='/{repo_name:.*?[^/]}/changeset-download/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_download',
        route_name='repo_commit_download', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_commit_data',
        pattern='/{repo_name:.*?[^/]}/changeset-data/{commit_id}', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_data',
        route_name='repo_commit_data', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_commit_comment_create',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/create', repo_route=True)
    config.add_view(
        RepoCommitsView,
        attr='repo_commit_comment_create',
161 attr='repo_commit_comment_create',
162 route_name='repo_commit_comment_create', request_method='POST',
162 route_name='repo_commit_comment_create', request_method='POST',
163 renderer='json_ext')
163 renderer='json_ext')
164
164
165 config.add_route(
165 config.add_route(
166 name='repo_commit_comment_preview',
166 name='repo_commit_comment_preview',
167 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True)
167 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True)
168 config.add_view(
168 config.add_view(
169 RepoCommitsView,
169 RepoCommitsView,
170 attr='repo_commit_comment_preview',
170 attr='repo_commit_comment_preview',
171 route_name='repo_commit_comment_preview', request_method='POST',
171 route_name='repo_commit_comment_preview', request_method='POST',
172 renderer='string', xhr=True)
172 renderer='string', xhr=True)
173
173
174 config.add_route(
174 config.add_route(
175 name='repo_commit_comment_history_view',
175 name='repo_commit_comment_history_view',
176 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_history_id}/history_view', repo_route=True)
176 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_history_id}/history_view', repo_route=True)
177 config.add_view(
177 config.add_view(
178 RepoCommitsView,
178 RepoCommitsView,
179 attr='repo_commit_comment_history_view',
179 attr='repo_commit_comment_history_view',
180 route_name='repo_commit_comment_history_view', request_method='POST',
180 route_name='repo_commit_comment_history_view', request_method='POST',
181 renderer='string', xhr=True)
181 renderer='string', xhr=True)
182
182
183 config.add_route(
183 config.add_route(
184 name='repo_commit_comment_attachment_upload',
184 name='repo_commit_comment_attachment_upload',
185 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True)
185 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True)
186 config.add_view(
186 config.add_view(
187 RepoCommitsView,
187 RepoCommitsView,
188 attr='repo_commit_comment_attachment_upload',
188 attr='repo_commit_comment_attachment_upload',
189 route_name='repo_commit_comment_attachment_upload', request_method='POST',
189 route_name='repo_commit_comment_attachment_upload', request_method='POST',
190 renderer='json_ext', xhr=True)
190 renderer='json_ext', xhr=True)
191
191
192 config.add_route(
192 config.add_route(
193 name='repo_commit_comment_delete',
193 name='repo_commit_comment_delete',
194 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True)
194 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True)
195 config.add_view(
195 config.add_view(
196 RepoCommitsView,
196 RepoCommitsView,
197 attr='repo_commit_comment_delete',
197 attr='repo_commit_comment_delete',
198 route_name='repo_commit_comment_delete', request_method='POST',
198 route_name='repo_commit_comment_delete', request_method='POST',
199 renderer='json_ext')
199 renderer='json_ext')
200
200
201 config.add_route(
201 config.add_route(
202 name='repo_commit_comment_edit',
202 name='repo_commit_comment_edit',
203 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/edit', repo_route=True)
203 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/edit', repo_route=True)
204 config.add_view(
204 config.add_view(
205 RepoCommitsView,
205 RepoCommitsView,
206 attr='repo_commit_comment_edit',
206 attr='repo_commit_comment_edit',
207 route_name='repo_commit_comment_edit', request_method='POST',
207 route_name='repo_commit_comment_edit', request_method='POST',
208 renderer='json_ext')
208 renderer='json_ext')
209
209
210 # still working url for backward compat.
210 # still working url for backward compat.
211 config.add_route(
211 config.add_route(
212 name='repo_commit_raw_deprecated',
212 name='repo_commit_raw_deprecated',
213 pattern='/{repo_name:.*?[^/]}/raw-changeset/{commit_id}', repo_route=True)
213 pattern='/{repo_name:.*?[^/]}/raw-changeset/{commit_id}', repo_route=True)
214 config.add_view(
214 config.add_view(
215 RepoCommitsView,
215 RepoCommitsView,
216 attr='repo_commit_raw',
216 attr='repo_commit_raw',
217 route_name='repo_commit_raw_deprecated', request_method='GET',
217 route_name='repo_commit_raw_deprecated', request_method='GET',
218 renderer=None)
218 renderer=None)
219
219
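    # Illustrative note (not part of the original module): the commit routes
    # registered above can be reversed with Pyramid's ``request.route_path``,
    # using the route name plus the placeholders from its pattern. Assuming a
    # request object and a repository named 'some/repo' (both hypothetical),
    # something along these lines:
    #
    #   request.route_path('repo_commit_raw',
    #                      repo_name='some/repo', commit_id='abcdef0')
    #
    # would be expected to produce '/some/repo/changeset-diff/abcdef0',
    # mirroring the pattern declared for ``repo_commit_raw``. A sketch only,
    # not verified output.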
    # Files
    config.add_route(
        name='repo_archivefile',
        pattern='/{repo_name:.*?[^/]}/archive/{fname:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_archivefile',
        route_name='repo_archivefile', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files_diff',
        pattern='/{repo_name:.*?[^/]}/diff/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_diff',
        route_name='repo_files_diff', request_method='GET',
        renderer=None)

    config.add_route(  # legacy route to make old links work
        name='repo_files_diff_2way_redirect',
        pattern='/{repo_name:.*?[^/]}/diff-2way/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_diff_2way_redirect',
        route_name='repo_files_diff_2way_redirect', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files',
        pattern='/{repo_name:.*?[^/]}/files/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files',
        route_name='repo_files', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files:default_path',
        pattern='/{repo_name:.*?[^/]}/files/{commit_id}/', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files',
        route_name='repo_files:default_path', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files:default_commit',
        pattern='/{repo_name:.*?[^/]}/files', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files',
        route_name='repo_files:default_commit', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files:rendered',
        pattern='/{repo_name:.*?[^/]}/render/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files',
        route_name='repo_files:rendered', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files:annotated',
        pattern='/{repo_name:.*?[^/]}/annotate/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files',
        route_name='repo_files:annotated', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_files:annotated_previous',
        pattern='/{repo_name:.*?[^/]}/annotate-previous/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_annotated_previous',
        route_name='repo_files:annotated_previous', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_nodetree_full',
        pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_nodetree_full',
        route_name='repo_nodetree_full', request_method='GET',
        renderer=None, xhr=True)

    config.add_route(
        name='repo_nodetree_full:default_path',
        pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_nodetree_full',
        route_name='repo_nodetree_full:default_path', request_method='GET',
        renderer=None, xhr=True)

    config.add_route(
        name='repo_files_nodelist',
        pattern='/{repo_name:.*?[^/]}/nodelist/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_nodelist',
        route_name='repo_files_nodelist', request_method='GET',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_file_raw',
        pattern='/{repo_name:.*?[^/]}/raw/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_file_raw',
        route_name='repo_file_raw', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_file_download',
        pattern='/{repo_name:.*?[^/]}/download/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_file_download',
        route_name='repo_file_download', request_method='GET',
        renderer=None)

    config.add_route(  # backward compat to keep old links working
        name='repo_file_download:legacy',
        pattern='/{repo_name:.*?[^/]}/rawfile/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_file_download',
        route_name='repo_file_download:legacy', request_method='GET',
        renderer=None)

    config.add_route(
        name='repo_file_history',
        pattern='/{repo_name:.*?[^/]}/history/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_file_history',
        route_name='repo_file_history', request_method='GET',
        renderer='json_ext')

    config.add_route(
        name='repo_file_authors',
        pattern='/{repo_name:.*?[^/]}/authors/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_file_authors',
        route_name='repo_file_authors', request_method='GET',
        renderer='rhodecode:templates/files/file_authors_box.mako')

    config.add_route(
        name='repo_files_check_head',
        pattern='/{repo_name:.*?[^/]}/check_head/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_check_head',
        route_name='repo_files_check_head', request_method='POST',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='repo_files_remove_file',
        pattern='/{repo_name:.*?[^/]}/remove_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_remove_file',
        route_name='repo_files_remove_file', request_method='GET',
        renderer='rhodecode:templates/files/files_delete.mako')

    config.add_route(
        name='repo_files_delete_file',
        pattern='/{repo_name:.*?[^/]}/delete_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_delete_file',
        route_name='repo_files_delete_file', request_method='POST',
        renderer=None)

    config.add_route(
        name='repo_files_edit_file',
        pattern='/{repo_name:.*?[^/]}/edit_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_edit_file',
        route_name='repo_files_edit_file', request_method='GET',
        renderer='rhodecode:templates/files/files_edit.mako')

    config.add_route(
        name='repo_files_update_file',
        pattern='/{repo_name:.*?[^/]}/update_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_update_file',
        route_name='repo_files_update_file', request_method='POST',
        renderer=None)

    config.add_route(
        name='repo_files_add_file',
        pattern='/{repo_name:.*?[^/]}/add_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_add_file',
        route_name='repo_files_add_file', request_method='GET',
        renderer='rhodecode:templates/files/files_add.mako')

    config.add_route(
        name='repo_files_upload_file',
        pattern='/{repo_name:.*?[^/]}/upload_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(
        RepoFilesView,
        attr='repo_files_add_file',
        route_name='repo_files_upload_file', request_method='GET',
        renderer='rhodecode:templates/files/files_upload.mako')
    config.add_view(  # POST creates
        RepoFilesView,
        attr='repo_files_upload_file',
        route_name='repo_files_upload_file', request_method='POST',
        renderer='json_ext')

    config.add_route(
        name='repo_files_create_file',
        pattern='/{repo_name:.*?[^/]}/create_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_view(  # POST creates
        RepoFilesView,
        attr='repo_files_create_file',
        route_name='repo_files_create_file', request_method='POST',
        renderer=None)

    # Refs data
    config.add_route(
        name='repo_refs_data',
        pattern='/{repo_name:.*?[^/]}/refs-data', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='repo_refs_data',
        route_name='repo_refs_data', request_method='GET',
        renderer='json_ext')

    config.add_route(
        name='repo_refs_changelog_data',
        pattern='/{repo_name:.*?[^/]}/refs-data-changelog', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='repo_refs_changelog_data',
        route_name='repo_refs_changelog_data', request_method='GET',
        renderer='json_ext')

    config.add_route(
        name='repo_stats',
        pattern='/{repo_name:.*?[^/]}/repo_stats/{commit_id}', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='repo_stats',
        route_name='repo_stats', request_method='GET',
        renderer='json_ext')

    # Commits
    config.add_route(
        name='repo_commits',
        pattern='/{repo_name:.*?[^/]}/commits', repo_route=True)
    config.add_view(
        RepoChangelogView,
        attr='repo_changelog',
        route_name='repo_commits', request_method='GET',
        renderer='rhodecode:templates/commits/changelog.mako')
    # old route kept for backward compatibility
    config.add_view(
        RepoChangelogView,
        attr='repo_changelog',
        route_name='repo_changelog', request_method='GET',
        renderer='rhodecode:templates/commits/changelog.mako')

    config.add_route(
        name='repo_commits_elements',
        pattern='/{repo_name:.*?[^/]}/commits_elements', repo_route=True)
    config.add_view(
        RepoChangelogView,
        attr='repo_commits_elements',
        route_name='repo_commits_elements', request_method=('GET', 'POST'),
        renderer='rhodecode:templates/commits/changelog_elements.mako',
        xhr=True)

    config.add_route(
        name='repo_commits_elements_file',
        pattern='/{repo_name:.*?[^/]}/commits_elements/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoChangelogView,
        attr='repo_commits_elements',
        route_name='repo_commits_elements_file', request_method=('GET', 'POST'),
        renderer='rhodecode:templates/commits/changelog_elements.mako',
        xhr=True)

    config.add_route(
        name='repo_commits_file',
        pattern='/{repo_name:.*?[^/]}/commits/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_view(
        RepoChangelogView,
        attr='repo_changelog',
        route_name='repo_commits_file', request_method='GET',
        renderer='rhodecode:templates/commits/changelog.mako')
    # old route kept for backward compatibility
    config.add_view(
        RepoChangelogView,
        attr='repo_changelog',
        route_name='repo_changelog_file', request_method='GET',
        renderer='rhodecode:templates/commits/changelog.mako')

    # Changelog (old deprecated name for commits page)
    config.add_route(
        name='repo_changelog',
        pattern='/{repo_name:.*?[^/]}/changelog', repo_route=True)
    config.add_route(
        name='repo_changelog_file',
        pattern='/{repo_name:.*?[^/]}/changelog/{commit_id}/{f_path:.*}', repo_route=True)

    # Compare
    config.add_route(
        name='repo_compare_select',
        pattern='/{repo_name:.*?[^/]}/compare', repo_route=True)
    config.add_view(
        RepoCompareView,
        attr='compare_select',
        route_name='repo_compare_select', request_method='GET',
        renderer='rhodecode:templates/compare/compare_diff.mako')

    config.add_route(
        name='repo_compare',
        pattern='/{repo_name:.*?[^/]}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', repo_route=True)
    config.add_view(
        RepoCompareView,
        attr='compare',
        route_name='repo_compare', request_method='GET',
        renderer=None)

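    # Illustrative note (not part of the original module): the ``repo_compare``
    # pattern above combines literal separators ('@' and '...') with non-greedy
    # ``:.*?`` placeholder regexes, so a URL such as
    #
    #   /some/repo/compare/branch@default...tag@v1.0
    #
    # would be expected to match with source_ref_type='branch',
    # source_ref='default', target_ref_type='tag', target_ref='v1.0' in the
    # request matchdict. A sketch of the intended behavior, not verified output.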
    # Tags
    config.add_route(
        name='tags_home',
        pattern='/{repo_name:.*?[^/]}/tags', repo_route=True)
    config.add_view(
        RepoTagsView,
        attr='tags',
        route_name='tags_home', request_method='GET',
        renderer='rhodecode:templates/tags/tags.mako')

    # Branches
    config.add_route(
        name='branches_home',
        pattern='/{repo_name:.*?[^/]}/branches', repo_route=True)
    config.add_view(
        RepoBranchesView,
        attr='branches',
        route_name='branches_home', request_method='GET',
        renderer='rhodecode:templates/branches/branches.mako')

    # Bookmarks
    config.add_route(
        name='bookmarks_home',
        pattern='/{repo_name:.*?[^/]}/bookmarks', repo_route=True)
    config.add_view(
        RepoBookmarksView,
        attr='bookmarks',
        route_name='bookmarks_home', request_method='GET',
        renderer='rhodecode:templates/bookmarks/bookmarks.mako')

    # Forks
    config.add_route(
        name='repo_fork_new',
        pattern='/{repo_name:.*?[^/]}/fork', repo_route=True,
        repo_forbid_when_archived=True,
        repo_accepted_types=['hg', 'git'])
    config.add_view(
        RepoForksView,
        attr='repo_fork_new',
        route_name='repo_fork_new', request_method='GET',
        renderer='rhodecode:templates/forks/forks.mako')

    config.add_route(
        name='repo_fork_create',
        pattern='/{repo_name:.*?[^/]}/fork/create', repo_route=True,
        repo_forbid_when_archived=True,
        repo_accepted_types=['hg', 'git'])
    config.add_view(
        RepoForksView,
        attr='repo_fork_create',
        route_name='repo_fork_create', request_method='POST',
        renderer='rhodecode:templates/forks/fork.mako')

    config.add_route(
        name='repo_forks_show_all',
        pattern='/{repo_name:.*?[^/]}/forks', repo_route=True,
        repo_accepted_types=['hg', 'git'])
    config.add_view(
        RepoForksView,
        attr='repo_forks_show_all',
        route_name='repo_forks_show_all', request_method='GET',
        renderer='rhodecode:templates/forks/forks.mako')

    config.add_route(
        name='repo_forks_data',
        pattern='/{repo_name:.*?[^/]}/forks/data', repo_route=True,
        repo_accepted_types=['hg', 'git'])
    config.add_view(
        RepoForksView,
        attr='repo_forks_data',
635 attr='repo_forks_data',
636 route_name='repo_forks_data', request_method='GET',
636 route_name='repo_forks_data', request_method='GET',
637 renderer='json_ext', xhr=True)
637 renderer='json_ext', xhr=True)
638
638
639 # Pull Requests
639 # Pull Requests
640 config.add_route(
640 config.add_route(
641 name='pullrequest_show',
641 name='pullrequest_show',
642 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}',
642 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}',
643 repo_route=True)
643 repo_route=True)
644 config.add_view(
644 config.add_view(
645 RepoPullRequestsView,
645 RepoPullRequestsView,
646 attr='pull_request_show',
646 attr='pull_request_show',
647 route_name='pullrequest_show', request_method='GET',
647 route_name='pullrequest_show', request_method='GET',
648 renderer='rhodecode:templates/pullrequests/pullrequest_show.mako')
648 renderer='rhodecode:templates/pullrequests/pullrequest_show.mako')
649
649
650 config.add_route(
650 config.add_route(
651 name='pullrequest_show_all',
651 name='pullrequest_show_all',
652 pattern='/{repo_name:.*?[^/]}/pull-request',
652 pattern='/{repo_name:.*?[^/]}/pull-request',
653 repo_route=True, repo_accepted_types=['hg', 'git'])
653 repo_route=True, repo_accepted_types=['hg', 'git'])
654 config.add_view(
654 config.add_view(
655 RepoPullRequestsView,
655 RepoPullRequestsView,
656 attr='pull_request_list',
656 attr='pull_request_list',
657 route_name='pullrequest_show_all', request_method='GET',
657 route_name='pullrequest_show_all', request_method='GET',
658 renderer='rhodecode:templates/pullrequests/pullrequests.mako')
658 renderer='rhodecode:templates/pullrequests/pullrequests.mako')
659
659
660 config.add_route(
660 config.add_route(
661 name='pullrequest_show_all_data',
661 name='pullrequest_show_all_data',
662 pattern='/{repo_name:.*?[^/]}/pull-request-data',
662 pattern='/{repo_name:.*?[^/]}/pull-request-data',
663 repo_route=True, repo_accepted_types=['hg', 'git'])
663 repo_route=True, repo_accepted_types=['hg', 'git'])
664 config.add_view(
664 config.add_view(
665 RepoPullRequestsView,
665 RepoPullRequestsView,
666 attr='pull_request_list_data',
666 attr='pull_request_list_data',
667 route_name='pullrequest_show_all_data', request_method='GET',
667 route_name='pullrequest_show_all_data', request_method='GET',
668 renderer='json_ext', xhr=True)
668 renderer='json_ext', xhr=True)
669
669
670 config.add_route(
670 config.add_route(
671 name='pullrequest_repo_refs',
671 name='pullrequest_repo_refs',
672 pattern='/{repo_name:.*?[^/]}/pull-request/refs/{target_repo_name:.*?[^/]}',
672 pattern='/{repo_name:.*?[^/]}/pull-request/refs/{target_repo_name:.*?[^/]}',
673 repo_route=True)
673 repo_route=True)
674 config.add_view(
674 config.add_view(
675 RepoPullRequestsView,
675 RepoPullRequestsView,
676 attr='pull_request_repo_refs',
676 attr='pull_request_repo_refs',
677 route_name='pullrequest_repo_refs', request_method='GET',
677 route_name='pullrequest_repo_refs', request_method='GET',
678 renderer='json_ext', xhr=True)
678 renderer='json_ext', xhr=True)
679
679
680 config.add_route(
680 config.add_route(
681 name='pullrequest_repo_targets',
681 name='pullrequest_repo_targets',
682 pattern='/{repo_name:.*?[^/]}/pull-request/repo-targets',
682 pattern='/{repo_name:.*?[^/]}/pull-request/repo-targets',
683 repo_route=True)
683 repo_route=True)
684 config.add_view(
684 config.add_view(
685 RepoPullRequestsView,
685 RepoPullRequestsView,
686 attr='pullrequest_repo_targets',
686 attr='pullrequest_repo_targets',
687 route_name='pullrequest_repo_targets', request_method='GET',
687 route_name='pullrequest_repo_targets', request_method='GET',
688 renderer='json_ext', xhr=True)
688 renderer='json_ext', xhr=True)
689
689
690 config.add_route(
690 config.add_route(
691 name='pullrequest_new',
691 name='pullrequest_new',
692 pattern='/{repo_name:.*?[^/]}/pull-request/new',
692 pattern='/{repo_name:.*?[^/]}/pull-request/new',
693 repo_route=True, repo_accepted_types=['hg', 'git'],
693 repo_route=True, repo_accepted_types=['hg', 'git'],
694 repo_forbid_when_archived=True)
694 repo_forbid_when_archived=True)
695 config.add_view(
695 config.add_view(
696 RepoPullRequestsView,
696 RepoPullRequestsView,
697 attr='pull_request_new',
697 attr='pull_request_new',
698 route_name='pullrequest_new', request_method='GET',
698 route_name='pullrequest_new', request_method='GET',
699 renderer='rhodecode:templates/pullrequests/pullrequest.mako')
699 renderer='rhodecode:templates/pullrequests/pullrequest.mako')
700
700
701 config.add_route(
701 config.add_route(
702 name='pullrequest_create',
702 name='pullrequest_create',
703 pattern='/{repo_name:.*?[^/]}/pull-request/create',
703 pattern='/{repo_name:.*?[^/]}/pull-request/create',
704 repo_route=True, repo_accepted_types=['hg', 'git'],
704 repo_route=True, repo_accepted_types=['hg', 'git'],
705 repo_forbid_when_archived=True)
705 repo_forbid_when_archived=True)
706 config.add_view(
706 config.add_view(
707 RepoPullRequestsView,
707 RepoPullRequestsView,
708 attr='pull_request_create',
708 attr='pull_request_create',
709 route_name='pullrequest_create', request_method='POST',
709 route_name='pullrequest_create', request_method='POST',
710 renderer=None)
710 renderer=None)
711
711
712 config.add_route(
712 config.add_route(
713 name='pullrequest_update',
713 name='pullrequest_update',
714 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/update',
714 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/update',
715 repo_route=True, repo_forbid_when_archived=True)
715 repo_route=True, repo_forbid_when_archived=True)
716 config.add_view(
716 config.add_view(
717 RepoPullRequestsView,
717 RepoPullRequestsView,
718 attr='pull_request_update',
718 attr='pull_request_update',
719 route_name='pullrequest_update', request_method='POST',
719 route_name='pullrequest_update', request_method='POST',
720 renderer='json_ext')
720 renderer='json_ext')
721
721
722 config.add_route(
722 config.add_route(
723 name='pullrequest_merge',
723 name='pullrequest_merge',
724 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/merge',
724 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/merge',
725 repo_route=True, repo_forbid_when_archived=True)
725 repo_route=True, repo_forbid_when_archived=True)
726 config.add_view(
726 config.add_view(
727 RepoPullRequestsView,
727 RepoPullRequestsView,
728 attr='pull_request_merge',
728 attr='pull_request_merge',
729 route_name='pullrequest_merge', request_method='POST',
729 route_name='pullrequest_merge', request_method='POST',
730 renderer='json_ext')
730 renderer='json_ext')
731
731
732 config.add_route(
732 config.add_route(
733 name='pullrequest_delete',
733 name='pullrequest_delete',
734 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/delete',
734 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/delete',
735 repo_route=True, repo_forbid_when_archived=True)
735 repo_route=True, repo_forbid_when_archived=True)
736 config.add_view(
736 config.add_view(
737 RepoPullRequestsView,
737 RepoPullRequestsView,
738 attr='pull_request_delete',
738 attr='pull_request_delete',
739 route_name='pullrequest_delete', request_method='POST',
739 route_name='pullrequest_delete', request_method='POST',
740 renderer='json_ext')
740 renderer='json_ext')
741
741
742 config.add_route(
742 config.add_route(
743 name='pullrequest_comment_create',
743 name='pullrequest_comment_create',
744 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment',
744 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment',
745 repo_route=True)
745 repo_route=True)
746 config.add_view(
746 config.add_view(
747 RepoPullRequestsView,
747 RepoPullRequestsView,
748 attr='pull_request_comment_create',
748 attr='pull_request_comment_create',
749 route_name='pullrequest_comment_create', request_method='POST',
749 route_name='pullrequest_comment_create', request_method='POST',
750 renderer='json_ext')
750 renderer='json_ext')
751
751
752 config.add_route(
752 config.add_route(
753 name='pullrequest_comment_edit',
753 name='pullrequest_comment_edit',
754 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/edit',
754 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/edit',
755 repo_route=True, repo_accepted_types=['hg', 'git'])
755 repo_route=True, repo_accepted_types=['hg', 'git'])
756 config.add_view(
756 config.add_view(
757 RepoPullRequestsView,
757 RepoPullRequestsView,
758 attr='pull_request_comment_edit',
758 attr='pull_request_comment_edit',
759 route_name='pullrequest_comment_edit', request_method='POST',
759 route_name='pullrequest_comment_edit', request_method='POST',
760 renderer='json_ext')
760 renderer='json_ext')
761
761
762 config.add_route(
762 config.add_route(
763 name='pullrequest_comment_delete',
763 name='pullrequest_comment_delete',
764 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete',
764 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete',
765 repo_route=True, repo_accepted_types=['hg', 'git'])
765 repo_route=True, repo_accepted_types=['hg', 'git'])
766 config.add_view(
766 config.add_view(
767 RepoPullRequestsView,
767 RepoPullRequestsView,
768 attr='pull_request_comment_delete',
768 attr='pull_request_comment_delete',
769 route_name='pullrequest_comment_delete', request_method='POST',
769 route_name='pullrequest_comment_delete', request_method='POST',
770 renderer='json_ext')
770 renderer='json_ext')
771
771
772 config.add_route(
772 config.add_route(
773 name='pullrequest_comments',
773 name='pullrequest_comments',
774 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comments',
774 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comments',
775 repo_route=True)
775 repo_route=True)
776 config.add_view(
776 config.add_view(
777 RepoPullRequestsView,
777 RepoPullRequestsView,
778 attr='pullrequest_comments',
778 attr='pullrequest_comments',
779 route_name='pullrequest_comments', request_method='POST',
779 route_name='pullrequest_comments', request_method='POST',
780 renderer='string_html', xhr=True)
780 renderer='string_html', xhr=True)
781
781
782 config.add_route(
782 config.add_route(
783 name='pullrequest_todos',
783 name='pullrequest_todos',
784 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/todos',
784 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/todos',
785 repo_route=True)
785 repo_route=True)
786 config.add_view(
786 config.add_view(
787 RepoPullRequestsView,
787 RepoPullRequestsView,
788 attr='pullrequest_todos',
788 attr='pullrequest_todos',
789 route_name='pullrequest_todos', request_method='POST',
789 route_name='pullrequest_todos', request_method='POST',
790 renderer='string_html', xhr=True)
790 renderer='string_html', xhr=True)
791
791
792 config.add_route(
792 config.add_route(
793 name='pullrequest_drafts',
793 name='pullrequest_drafts',
794 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/drafts',
794 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/drafts',
795 repo_route=True)
795 repo_route=True)
796 config.add_view(
796 config.add_view(
797 RepoPullRequestsView,
797 RepoPullRequestsView,
798 attr='pullrequest_drafts',
798 attr='pullrequest_drafts',
799 route_name='pullrequest_drafts', request_method='POST',
799 route_name='pullrequest_drafts', request_method='POST',
800 renderer='string_html', xhr=True)
800 renderer='string_html', xhr=True)
801
801
802 # Artifacts, (EE feature)
802 # Artifacts, (EE feature)
803 config.add_route(
803 config.add_route(
804 name='repo_artifacts_list',
804 name='repo_artifacts_list',
805 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
805 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
806 config.add_view(
806 config.add_view(
807 RepoArtifactsView,
807 RepoArtifactsView,
808 attr='repo_artifacts',
808 attr='repo_artifacts',
809 route_name='repo_artifacts_list', request_method='GET',
809 route_name='repo_artifacts_list', request_method='GET',
810 renderer='rhodecode:templates/artifacts/artifact_list.mako')
810 renderer='rhodecode:templates/artifacts/artifact_list.mako')
811
811
812 # Settings
812 # Settings
813 config.add_route(
813 config.add_route(
814 name='edit_repo',
814 name='edit_repo',
815 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
815 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
816 config.add_view(
816 config.add_view(
817 RepoSettingsView,
817 RepoSettingsView,
818 attr='edit_settings',
818 attr='edit_settings',
819 route_name='edit_repo', request_method='GET',
819 route_name='edit_repo', request_method='GET',
820 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
820 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
821 # update is POST on edit_repo
821 # update is POST on edit_repo
822 config.add_view(
822 config.add_view(
823 RepoSettingsView,
823 RepoSettingsView,
824 attr='edit_settings_update',
824 attr='edit_settings_update',
825 route_name='edit_repo', request_method='POST',
825 route_name='edit_repo', request_method='POST',
826 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
826 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
827
827
828 # Settings advanced
828 # Settings advanced
829 config.add_route(
829 config.add_route(
830 name='edit_repo_advanced',
830 name='edit_repo_advanced',
831 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
831 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
832 config.add_view(
832 config.add_view(
833 RepoSettingsAdvancedView,
833 RepoSettingsAdvancedView,
834 attr='edit_advanced',
834 attr='edit_advanced',
835 route_name='edit_repo_advanced', request_method='GET',
835 route_name='edit_repo_advanced', request_method='GET',
836 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
836 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
837
837
838 config.add_route(
838 config.add_route(
839 name='edit_repo_advanced_archive',
839 name='edit_repo_advanced_archive',
840 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
840 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
841 config.add_view(
841 config.add_view(
842 RepoSettingsAdvancedView,
842 RepoSettingsAdvancedView,
843 attr='edit_advanced_archive',
843 attr='edit_advanced_archive',
844 route_name='edit_repo_advanced_archive', request_method='POST',
844 route_name='edit_repo_advanced_archive', request_method='POST',
845 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
845 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
846
846
847 config.add_route(
847 config.add_route(
848 name='edit_repo_advanced_delete',
848 name='edit_repo_advanced_delete',
849 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
849 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
850 config.add_view(
850 config.add_view(
851 RepoSettingsAdvancedView,
851 RepoSettingsAdvancedView,
852 attr='edit_advanced_delete',
852 attr='edit_advanced_delete',
853 route_name='edit_repo_advanced_delete', request_method='POST',
853 route_name='edit_repo_advanced_delete', request_method='POST',
854 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
854 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
855
855
856 config.add_route(
856 config.add_route(
857 name='edit_repo_advanced_locking',
857 name='edit_repo_advanced_locking',
858 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
858 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
859 config.add_view(
859 config.add_view(
860 RepoSettingsAdvancedView,
860 RepoSettingsAdvancedView,
861 attr='edit_advanced_toggle_locking',
861 attr='edit_advanced_toggle_locking',
862 route_name='edit_repo_advanced_locking', request_method='POST',
862 route_name='edit_repo_advanced_locking', request_method='POST',
863 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
863 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
864
864
865 config.add_route(
865 config.add_route(
866 name='edit_repo_advanced_journal',
866 name='edit_repo_advanced_journal',
867 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
867 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
868 config.add_view(
868 config.add_view(
869 RepoSettingsAdvancedView,
869 RepoSettingsAdvancedView,
870 attr='edit_advanced_journal',
870 attr='edit_advanced_journal',
871 route_name='edit_repo_advanced_journal', request_method='POST',
871 route_name='edit_repo_advanced_journal', request_method='POST',
872 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
872 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
873
873
874 config.add_route(
874 config.add_route(
875 name='edit_repo_advanced_fork',
875 name='edit_repo_advanced_fork',
876 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
876 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
877 config.add_view(
877 config.add_view(
878 RepoSettingsAdvancedView,
878 RepoSettingsAdvancedView,
879 attr='edit_advanced_fork',
879 attr='edit_advanced_fork',
880 route_name='edit_repo_advanced_fork', request_method='POST',
880 route_name='edit_repo_advanced_fork', request_method='POST',
881 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
881 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
882
882
883 config.add_route(
883 config.add_route(
884 name='edit_repo_advanced_hooks',
884 name='edit_repo_advanced_hooks',
885 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
885 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
886 config.add_view(
886 config.add_view(
887 RepoSettingsAdvancedView,
887 RepoSettingsAdvancedView,
888 attr='edit_advanced_install_hooks',
888 attr='edit_advanced_install_hooks',
889 route_name='edit_repo_advanced_hooks', request_method='GET',
889 route_name='edit_repo_advanced_hooks', request_method='GET',
890 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
890 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
891
891
892 # Caches
892 # Caches
893 config.add_route(
893 config.add_route(
894 name='edit_repo_caches',
894 name='edit_repo_caches',
895 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
895 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
896 config.add_view(
896 config.add_view(
897 RepoCachesView,
897 RepoCachesView,
898 attr='repo_caches',
898 attr='repo_caches',
899 route_name='edit_repo_caches', request_method='GET',
899 route_name='edit_repo_caches', request_method='GET',
900 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
900 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
901 config.add_view(
901 config.add_view(
902 RepoCachesView,
902 RepoCachesView,
903 attr='repo_caches_purge',
903 attr='repo_caches_purge',
904 route_name='edit_repo_caches', request_method='POST')
904 route_name='edit_repo_caches', request_method='POST')
905
905
906 # Permissions
906 # Permissions
907 config.add_route(
907 config.add_route(
908 name='edit_repo_perms',
908 name='edit_repo_perms',
909 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
909 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
910 config.add_view(
910 config.add_view(
911 RepoSettingsPermissionsView,
911 RepoSettingsPermissionsView,
912 attr='edit_permissions',
912 attr='edit_permissions',
913 route_name='edit_repo_perms', request_method='GET',
913 route_name='edit_repo_perms', request_method='GET',
914 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
914 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
915 config.add_view(
915 config.add_view(
916 RepoSettingsPermissionsView,
916 RepoSettingsPermissionsView,
917 attr='edit_permissions_update',
917 attr='edit_permissions_update',
918 route_name='edit_repo_perms', request_method='POST',
918 route_name='edit_repo_perms', request_method='POST',
919 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
919 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
920
920
921 config.add_route(
921 config.add_route(
922 name='edit_repo_perms_set_private',
922 name='edit_repo_perms_set_private',
923 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
923 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
924 config.add_view(
924 config.add_view(
925 RepoSettingsPermissionsView,
925 RepoSettingsPermissionsView,
926 attr='edit_permissions_set_private_repo',
926 attr='edit_permissions_set_private_repo',
927 route_name='edit_repo_perms_set_private', request_method='POST',
927 route_name='edit_repo_perms_set_private', request_method='POST',
928 renderer='json_ext')
928 renderer='json_ext')
929
929
930 # Permissions Branch (EE feature)
930 # Permissions Branch (EE feature)
931 config.add_route(
931 config.add_route(
932 name='edit_repo_perms_branch',
932 name='edit_repo_perms_branch',
933 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
933 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
934 config.add_view(
934 config.add_view(
935 RepoBranchesView,
935 RepoSettingsBranchPermissionsView,
936 attr='branches',
936 attr='branch_permissions',
937 route_name='edit_repo_perms_branch', request_method='GET',
937 route_name='edit_repo_perms_branch', request_method='GET',
938 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
938 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
939
939
940 config.add_route(
940 config.add_route(
941 name='edit_repo_perms_branch_delete',
941 name='edit_repo_perms_branch_delete',
942 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
942 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
943 repo_route=True)
943 repo_route=True)
944 ## Only implemented in EE
944 ## Only implemented in EE
945
945
946 # Maintenance
946 # Maintenance
947 config.add_route(
947 config.add_route(
948 name='edit_repo_maintenance',
948 name='edit_repo_maintenance',
949 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
949 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
950 config.add_view(
950 config.add_view(
951 RepoMaintenanceView,
951 RepoMaintenanceView,
952 attr='repo_maintenance',
952 attr='repo_maintenance',
953 route_name='edit_repo_maintenance_execute', request_method='GET',
953 route_name='edit_repo_maintenance', request_method='GET',
954 renderer='json', xhr=True)
954 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
955
955
956 config.add_route(
956 config.add_route(
957 name='edit_repo_maintenance_execute',
957 name='edit_repo_maintenance_execute',
958 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
958 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
959 config.add_view(
959 config.add_view(
960 RepoMaintenanceView,
960 RepoMaintenanceView,
961 attr='repo_maintenance_execute',
961 attr='repo_maintenance_execute',
962 route_name='edit_repo_maintenance', request_method='GET',
962 route_name='edit_repo_maintenance_execute', request_method='GET',
963 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
963 renderer='json', xhr=True)
964
964
965 # Fields
965 # Fields
966 config.add_route(
966 config.add_route(
967 name='edit_repo_fields',
967 name='edit_repo_fields',
968 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
968 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
969 config.add_view(
969 config.add_view(
970 RepoSettingsFieldsView,
970 RepoSettingsFieldsView,
971 attr='repo_field_edit',
971 attr='repo_field_edit',
972 route_name='edit_repo_fields', request_method='GET',
972 route_name='edit_repo_fields', request_method='GET',
973 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
973 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
974
974
975 config.add_route(
975 config.add_route(
976 name='edit_repo_fields_create',
976 name='edit_repo_fields_create',
977 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
977 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
978 config.add_view(
978 config.add_view(
979 RepoSettingsFieldsView,
979 RepoSettingsFieldsView,
980 attr='repo_field_create',
980 attr='repo_field_create',
981 route_name='edit_repo_fields_create', request_method='POST',
981 route_name='edit_repo_fields_create', request_method='POST',
982 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
982 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
983
983
984 config.add_route(
984 config.add_route(
985 name='edit_repo_fields_delete',
985 name='edit_repo_fields_delete',
986 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
986 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
987 config.add_view(
987 config.add_view(
988 RepoSettingsFieldsView,
988 RepoSettingsFieldsView,
989 attr='repo_field_delete',
989 attr='repo_field_delete',
990 route_name='edit_repo_fields_delete', request_method='POST',
990 route_name='edit_repo_fields_delete', request_method='POST',
991 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
991 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
992
992
993 # Locking
993 # Locking
994 config.add_route(
994 config.add_route(
995 name='repo_edit_toggle_locking',
995 name='repo_edit_toggle_locking',
996 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
996 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
997 config.add_view(
997 config.add_view(
998 RepoSettingsView,
998 RepoSettingsView,
999 attr='edit_advanced_toggle_locking',
999 attr='edit_advanced_toggle_locking',
1000 route_name='repo_edit_toggle_locking', request_method='GET',
1000 route_name='repo_edit_toggle_locking', request_method='GET',
1001 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1001 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1002
1002
1003 # Remote
1003 # Remote
1004 config.add_route(
1004 config.add_route(
1005 name='edit_repo_remote',
1005 name='edit_repo_remote',
1006 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
1006 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
1007 config.add_view(
1007 config.add_view(
1008 RepoSettingsRemoteView,
1008 RepoSettingsRemoteView,
1009 attr='repo_remote_edit_form',
1009 attr='repo_remote_edit_form',
1010 route_name='edit_repo_remote', request_method='GET',
1010 route_name='edit_repo_remote', request_method='GET',
1011 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1011 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1012
1012
1013 config.add_route(
1013 config.add_route(
1014 name='edit_repo_remote_pull',
1014 name='edit_repo_remote_pull',
1015 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
1015 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
1016 config.add_view(
1016 config.add_view(
1017 RepoSettingsRemoteView,
1017 RepoSettingsRemoteView,
1018 attr='repo_remote_pull_changes',
1018 attr='repo_remote_pull_changes',
1019 route_name='edit_repo_remote_pull', request_method='POST',
1019 route_name='edit_repo_remote_pull', request_method='POST',
1020 renderer=None)
1020 renderer=None)
1021
1021
1022 config.add_route(
1022 config.add_route(
1023 name='edit_repo_remote_push',
1023 name='edit_repo_remote_push',
1024 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
1024 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
1025
1025
1026 # Statistics
1026 # Statistics
1027 config.add_route(
1027 config.add_route(
1028 name='edit_repo_statistics',
1028 name='edit_repo_statistics',
1029 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
1029 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
1030 config.add_view(
1030 config.add_view(
1031 RepoSettingsView,
1031 RepoSettingsView,
1032 attr='edit_statistics_form',
1032 attr='edit_statistics_form',
1033 route_name='edit_repo_statistics', request_method='GET',
1033 route_name='edit_repo_statistics', request_method='GET',
1034 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1034 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1035
1035
1036 config.add_route(
1036 config.add_route(
1037 name='edit_repo_statistics_reset',
1037 name='edit_repo_statistics_reset',
1038 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
1038 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
1039 config.add_view(
1039 config.add_view(
1040 RepoSettingsView,
1040 RepoSettingsView,
1041 attr='repo_statistics_reset',
1041 attr='repo_statistics_reset',
1042 route_name='edit_repo_statistics_reset', request_method='POST',
1042 route_name='edit_repo_statistics_reset', request_method='POST',
1043 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1043 renderer='rhodecode:templates/admin/repos/repo_edit.mako')
1044
1044
    # Issue trackers
    config.add_route(
        name='edit_repo_issuetracker',
        pattern='/{repo_name:.*?[^/]}/settings/issue_trackers', repo_route=True)
    config.add_view(
        RepoSettingsIssueTrackersView,
        attr='repo_issuetracker',
        route_name='edit_repo_issuetracker', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    config.add_route(
        name='edit_repo_issuetracker_test',
        pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/test', repo_route=True)
    config.add_view(
        RepoSettingsIssueTrackersView,
        attr='repo_issuetracker_test',
        route_name='edit_repo_issuetracker_test', request_method='POST',
        renderer='string', xhr=True)

    config.add_route(
        name='edit_repo_issuetracker_delete',
        pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/delete', repo_route=True)
    config.add_view(
        RepoSettingsIssueTrackersView,
        attr='repo_issuetracker_delete',
        route_name='edit_repo_issuetracker_delete', request_method='POST',
        renderer='json_ext', xhr=True)

    config.add_route(
        name='edit_repo_issuetracker_update',
        pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/update', repo_route=True)
    config.add_view(
        RepoSettingsIssueTrackersView,
        attr='repo_issuetracker_update',
        route_name='edit_repo_issuetracker_update', request_method='POST',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    # VCS Settings
    config.add_route(
        name='edit_repo_vcs',
        pattern='/{repo_name:.*?[^/]}/settings/vcs', repo_route=True)
    config.add_view(
        RepoSettingsVcsView,
        attr='repo_vcs_settings',
        route_name='edit_repo_vcs', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    config.add_route(
        name='edit_repo_vcs_update',
        pattern='/{repo_name:.*?[^/]}/settings/vcs/update', repo_route=True)
    config.add_view(
        RepoSettingsVcsView,
        attr='repo_settings_vcs_update',
        route_name='edit_repo_vcs_update', request_method='POST',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    # svn pattern
    config.add_route(
        name='edit_repo_vcs_svn_pattern_delete',
        pattern='/{repo_name:.*?[^/]}/settings/vcs/svn_pattern/delete', repo_route=True)
    config.add_view(
        RepoSettingsVcsView,
        attr='repo_settings_delete_svn_pattern',
        route_name='edit_repo_vcs_svn_pattern_delete', request_method='POST',
        renderer='json_ext', xhr=True)

    # Repo Review Rules (EE feature)
    config.add_route(
        name='repo_reviewers',
        pattern='/{repo_name:.*?[^/]}/settings/review/rules', repo_route=True)
    config.add_view(
        RepoReviewRulesView,
        attr='repo_review_rules',
        route_name='repo_reviewers', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    config.add_route(
        name='repo_default_reviewers_data',
        pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
    config.add_view(
        RepoReviewRulesView,
        attr='repo_default_reviewers_data',
        route_name='repo_default_reviewers_data', request_method='GET',
        renderer='json_ext')

    # Repo Automation (EE feature)
    config.add_route(
        name='repo_automation',
        pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
    config.add_view(
        RepoAutomationView,
        attr='repo_automation',
        route_name='repo_automation', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    # Strip
    config.add_route(
        name='edit_repo_strip',
        pattern='/{repo_name:.*?[^/]}/settings/strip', repo_route=True)
    config.add_view(
        RepoStripView,
        attr='strip',
        route_name='edit_repo_strip', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    config.add_route(
        name='strip_check',
        pattern='/{repo_name:.*?[^/]}/settings/strip_check', repo_route=True)
    config.add_view(
        RepoStripView,
        attr='strip_check',
        route_name='strip_check', request_method='POST',
        renderer='json', xhr=True)

    config.add_route(
        name='strip_execute',
        pattern='/{repo_name:.*?[^/]}/settings/strip_execute', repo_route=True)
    config.add_view(
        RepoStripView,
        attr='strip_execute',
        route_name='strip_execute', request_method='POST',
        renderer='json', xhr=True)

    # Audit logs
    config.add_route(
        name='edit_repo_audit_logs',
        pattern='/{repo_name:.*?[^/]}/settings/audit_logs', repo_route=True)
    config.add_view(
        AuditLogsView,
        attr='repo_audit_logs',
        route_name='edit_repo_audit_logs', request_method='GET',
        renderer='rhodecode:templates/admin/repos/repo_edit.mako')

    # ATOM/RSS Feed, shouldn't contain slashes for outlook compatibility
    config.add_route(
        name='rss_feed_home',
        pattern='/{repo_name:.*?[^/]}/feed-rss', repo_route=True)
    config.add_view(
        RepoFeedView,
        attr='rss',
        route_name='rss_feed_home', request_method='GET', renderer=None)

    config.add_route(
        name='rss_feed_home_old',
        pattern='/{repo_name:.*?[^/]}/feed/rss', repo_route=True)
    config.add_view(
        RepoFeedView,
        attr='rss',
        route_name='rss_feed_home_old', request_method='GET', renderer=None)

    config.add_route(
        name='atom_feed_home',
        pattern='/{repo_name:.*?[^/]}/feed-atom', repo_route=True)
    config.add_view(
        RepoFeedView,
        attr='atom',
        route_name='atom_feed_home', request_method='GET', renderer=None)

    config.add_route(
        name='atom_feed_home_old',
        pattern='/{repo_name:.*?[^/]}/feed/atom', repo_route=True)
    config.add_view(
        RepoFeedView,
        attr='atom',
        route_name='atom_feed_home_old', request_method='GET', renderer=None)

    # NOTE(marcink): needs to be at the end for catch-all
    add_route_with_slash(
        config,
        name='repo_summary',
        pattern='/{repo_name:.*?[^/]}', repo_route=True)
    config.add_view(
        RepoSummaryView,
        attr='summary',
        route_name='repo_summary', request_method='GET',
        renderer='rhodecode:templates/summary/summary.mako')

    # TODO(marcink): there's no such route??
    config.add_view(
        RepoSummaryView,
        attr='summary',
        route_name='repo_summary_slash', request_method='GET',
        renderer='rhodecode:templates/summary/summary.mako')
@@ -1,1070 +1,1092 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import os

import mock
import pytest

from rhodecode.apps.repository.tests.test_repo_compare import ComparePage
from rhodecode.apps.repository.views.repo_files import RepoFilesView
from rhodecode.lib import helpers as h
from rhodecode.lib.compat import OrderedDict
from rhodecode.lib.ext_json import json
from rhodecode.lib.vcs import nodes

from rhodecode.lib.vcs.conf import settings
from rhodecode.tests import assert_session_flash
from rhodecode.tests.fixture import Fixture
from rhodecode.model.db import Session

fixture = Fixture()


def get_node_history(backend_type):
    return {
        'hg': json.loads(fixture.load_resource('hg_node_history_response.json')),
        'git': json.loads(fixture.load_resource('git_node_history_response.json')),
        'svn': json.loads(fixture.load_resource('svn_node_history_response.json')),
    }[backend_type]


def route_path(name, params=None, **kwargs):
    import urllib

    base_url = {
        'repo_summary': '/{repo_name}',
        'repo_archivefile': '/{repo_name}/archive/{fname}',
        'repo_files_diff': '/{repo_name}/diff/{f_path}',
        'repo_files_diff_2way_redirect': '/{repo_name}/diff-2way/{f_path}',
        'repo_files': '/{repo_name}/files/{commit_id}/{f_path}',
        'repo_files:default_path': '/{repo_name}/files/{commit_id}/',
        'repo_files:default_commit': '/{repo_name}/files',
        'repo_files:rendered': '/{repo_name}/render/{commit_id}/{f_path}',
        'repo_files:annotated': '/{repo_name}/annotate/{commit_id}/{f_path}',
        'repo_files:annotated_previous': '/{repo_name}/annotate-previous/{commit_id}/{f_path}',
        'repo_files_nodelist': '/{repo_name}/nodelist/{commit_id}/{f_path}',
        'repo_file_raw': '/{repo_name}/raw/{commit_id}/{f_path}',
        'repo_file_download': '/{repo_name}/download/{commit_id}/{f_path}',
        'repo_file_history': '/{repo_name}/history/{commit_id}/{f_path}',
        'repo_file_authors': '/{repo_name}/authors/{commit_id}/{f_path}',
        'repo_files_remove_file': '/{repo_name}/remove_file/{commit_id}/{f_path}',
        'repo_files_delete_file': '/{repo_name}/delete_file/{commit_id}/{f_path}',
        'repo_files_edit_file': '/{repo_name}/edit_file/{commit_id}/{f_path}',
        'repo_files_update_file': '/{repo_name}/update_file/{commit_id}/{f_path}',
        'repo_files_add_file': '/{repo_name}/add_file/{commit_id}/{f_path}',
        'repo_files_create_file': '/{repo_name}/create_file/{commit_id}/{f_path}',
        'repo_nodetree_full': '/{repo_name}/nodetree_full/{commit_id}/{f_path}',
        'repo_nodetree_full:default_path': '/{repo_name}/nodetree_full/{commit_id}/',
    }[name].format(**kwargs)

    if params:
        base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
    return base_url


def assert_files_in_response(response, files, params):
    template = (
        'href="/%(repo_name)s/files/%(commit_id)s/%(name)s"')
    _assert_items_in_response(response, files, template, params)


def assert_dirs_in_response(response, dirs, params):
    template = (
        'href="/%(repo_name)s/files/%(commit_id)s/%(name)s"')
    _assert_items_in_response(response, dirs, template, params)


def _assert_items_in_response(response, items, template, params):
    for item in items:
        item_params = {'name': item}
        item_params.update(params)
        response.mustcontain(template % item_params)


def assert_timeago_in_response(response, items, params):
    for item in items:
        response.mustcontain(h.age_component(params['date']))


@pytest.mark.usefixtures("app")
class TestFilesViews(object):

    def test_show_files(self, backend):
        response = self.app.get(
            route_path('repo_files',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='/'))
        commit = backend.repo.get_commit()

        params = {
            'repo_name': backend.repo_name,
            'commit_id': commit.raw_id,
            'date': commit.date
        }
        assert_dirs_in_response(response, ['docs', 'vcs'], params)
        files = [
            '.gitignore',
            '.hgignore',
            '.hgtags',
            # TODO: missing in Git
            # '.travis.yml',
            'MANIFEST.in',
            'README.rst',
            # TODO: File is missing in svn repository
            # 'run_test_and_report.sh',
            'setup.cfg',
            'setup.py',
            'test_and_report.sh',
            'tox.ini',
        ]
        assert_files_in_response(response, files, params)
        assert_timeago_in_response(response, files, params)

    def test_show_files_links_submodules_with_absolute_url(self, backend_hg):
        repo = backend_hg['subrepos']
        response = self.app.get(
            route_path('repo_files',
                       repo_name=repo.repo_name,
                       commit_id='tip', f_path='/'))
        assert_response = response.assert_response()
        assert_response.contains_one_link(
            'absolute-path @ 000000000000', 'http://example.com/absolute-path')

    def test_show_files_links_submodules_with_absolute_url_subpaths(
            self, backend_hg):
        repo = backend_hg['subrepos']
        response = self.app.get(
            route_path('repo_files',
                       repo_name=repo.repo_name,
                       commit_id='tip', f_path='/'))
        assert_response = response.assert_response()
        assert_response.contains_one_link(
            'subpaths-path @ 000000000000',
            'http://sub-base.example.com/subpaths-path')

    @pytest.mark.xfail_backends("svn", reason="Depends on branch support")
    def test_files_menu(self, backend):
        new_branch = "temp_branch_name"
        commits = [
            {'message': 'a'},
            {'message': 'b', 'branch': new_branch}
        ]
        backend.create_repo(commits)
        backend.repo.landing_rev = "branch:%s" % new_branch
        Session().commit()

        # get response based on tip and not new commit
        response = self.app.get(
            route_path('repo_files',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='/'))

        # make sure Files menu url is not tip but new commit
        landing_rev = backend.repo.landing_ref_name
        files_url = route_path('repo_files:default_path',
                               repo_name=backend.repo_name,
                               commit_id=landing_rev, params={'at': landing_rev})

        assert landing_rev != 'tip'
        response.mustcontain(
            '<li class="active"><a class="menulink" href="%s">' % files_url)

    def test_show_files_commit(self, backend):
        commit = backend.repo.get_commit(commit_idx=32)

        response = self.app.get(
            route_path('repo_files',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/'))

        dirs = ['docs', 'tests']
        files = ['README.rst']
        params = {
            'repo_name': backend.repo_name,
            'commit_id': commit.raw_id,
        }
        assert_dirs_in_response(response, dirs, params)
        assert_files_in_response(response, files, params)

    def test_show_files_different_branch(self, backend):
        branches = dict(
            hg=(150, ['git']),
            # TODO: Git test repository does not contain other branches
            git=(633, ['master']),
            # TODO: Branch support in Subversion
            svn=(150, [])
        )
        idx, branches = branches[backend.alias]
        commit = backend.repo.get_commit(commit_idx=idx)
        response = self.app.get(
            route_path('repo_files',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/'))

        assert_response = response.assert_response()
        for branch in branches:
            assert_response.element_contains('.tags .branchtag', branch)

    def test_show_files_paging(self, backend):
        repo = backend.repo
        indexes = [73, 92, 109, 1, 0]
        idx_map = [(rev, repo.get_commit(commit_idx=rev).raw_id)
                   for rev in indexes]

        for idx in idx_map:
            response = self.app.get(
                route_path('repo_files',
                           repo_name=backend.repo_name,
                           commit_id=idx[1], f_path='/'))

            response.mustcontain("""r%s:%s""" % (idx[0], idx[1][:8]))

240 def test_file_source(self, backend):
240 def test_file_source(self, backend):
241 commit = backend.repo.get_commit(commit_idx=167)
241 commit = backend.repo.get_commit(commit_idx=167)
242 response = self.app.get(
242 response = self.app.get(
243 route_path('repo_files',
243 route_path('repo_files',
244 repo_name=backend.repo_name,
244 repo_name=backend.repo_name,
245 commit_id=commit.raw_id, f_path='vcs/nodes.py'))
245 commit_id=commit.raw_id, f_path='vcs/nodes.py'))
246
246
247 msgbox = """<div class="commit">%s</div>"""
247 msgbox = """<div class="commit">%s</div>"""
248 response.mustcontain(msgbox % (commit.message, ))
248 response.mustcontain(msgbox % (commit.message, ))
249
249
250 assert_response = response.assert_response()
250 assert_response = response.assert_response()
251 if commit.branch:
251 if commit.branch:
252 assert_response.element_contains(
252 assert_response.element_contains(
253 '.tags.tags-main .branchtag', commit.branch)
253 '.tags.tags-main .branchtag', commit.branch)
254 if commit.tags:
254 if commit.tags:
255 for tag in commit.tags:
255 for tag in commit.tags:
256 assert_response.element_contains('.tags.tags-main .tagtag', tag)
256 assert_response.element_contains('.tags.tags-main .tagtag', tag)

    def test_file_source_annotated(self, backend):
        response = self.app.get(
            route_path('repo_files:annotated',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='vcs/nodes.py'))
        expected_commits = {
            'hg': 'r356',
            'git': 'r345',
            'svn': 'r208',
        }
        response.mustcontain(expected_commits[backend.alias])

    def test_file_source_authors(self, backend):
        response = self.app.get(
            route_path('repo_file_authors',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='vcs/nodes.py'))
        expected_authors = {
            'hg': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
            'git': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
            'svn': ('marcin', 'lukasz'),
        }

        for author in expected_authors[backend.alias]:
            response.mustcontain(author)

    def test_file_source_authors_with_annotation(self, backend):
        response = self.app.get(
            route_path('repo_file_authors',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='vcs/nodes.py',
                       params=dict(annotate=1)))
        expected_authors = {
            'hg': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
            'git': ('Marcin Kuzminski', 'Lukasz Balcerzak'),
            'svn': ('marcin', 'lukasz'),
        }

        for author in expected_authors[backend.alias]:
            response.mustcontain(author)

    def test_file_source_history(self, backend, xhr_header):
        response = self.app.get(
            route_path('repo_file_history',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='vcs/nodes.py'),
            extra_environ=xhr_header)
        assert get_node_history(backend.alias) == json.loads(response.body)

    def test_file_source_history_svn(self, backend_svn, xhr_header):
        simple_repo = backend_svn['svn-simple-layout']
        response = self.app.get(
            route_path('repo_file_history',
                       repo_name=simple_repo.repo_name,
                       commit_id='tip', f_path='trunk/example.py'),
            extra_environ=xhr_header)

        expected_data = json.loads(
            fixture.load_resource('svn_node_history_branches.json'))

        assert expected_data == response.json

    def test_file_source_history_with_annotation(self, backend, xhr_header):
        response = self.app.get(
            route_path('repo_file_history',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='vcs/nodes.py',
                       params=dict(annotate=1)),
            extra_environ=xhr_header)
        assert get_node_history(backend.alias) == json.loads(response.body)

    def test_tree_search_top_level(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_files_nodelist',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/'),
            extra_environ=xhr_header)
        assert 'nodes' in response.json
        assert {'name': 'docs', 'type': 'dir'} in response.json['nodes']

    def test_tree_search_missing_xhr(self, backend):
        self.app.get(
            route_path('repo_files_nodelist',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='/'),
            status=404)

    def test_tree_search_at_path(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_files_nodelist',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/docs'),
            extra_environ=xhr_header)
        assert 'nodes' in response.json
        nodes = response.json['nodes']
        assert {'name': 'docs/api', 'type': 'dir'} in nodes
        assert {'name': 'docs/index.rst', 'type': 'file'} in nodes

    def test_tree_search_at_path_2nd_level(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_files_nodelist',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/docs/api'),
            extra_environ=xhr_header)
        assert 'nodes' in response.json
        nodes = response.json['nodes']
        assert {'name': 'docs/api/index.rst', 'type': 'file'} in nodes

    def test_tree_search_at_path_missing_xhr(self, backend):
        self.app.get(
            route_path('repo_files_nodelist',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='/docs'),
            status=404)

    def test_nodetree(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_nodetree_full',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/'),
            extra_environ=xhr_header)

        assert_response = response.assert_response()

        for attr in ['data-commit-id', 'data-date', 'data-author']:
            elements = assert_response.get_elements('[{}]'.format(attr))
            assert len(elements) > 1

            for element in elements:
                assert element.get(attr)

    def test_nodetree_if_file(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_nodetree_full',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='README.rst'),
            extra_environ=xhr_header)
        assert response.body == ''

    def test_nodetree_wrong_path(self, backend, xhr_header):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_nodetree_full',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='/dont-exist'),
            extra_environ=xhr_header)

        err = 'error: There is no file nor ' \
              'directory at the given path'
        assert err in response.body

    def test_nodetree_missing_xhr(self, backend):
        self.app.get(
            route_path('repo_nodetree_full',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='/'),
            status=404)


@pytest.mark.usefixtures("app", "autologin_user")
class TestRawFileHandling(object):

    def test_download_file(self, backend):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_file_download',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='vcs/nodes.py'),)

        assert response.content_disposition == 'attachment; filename="nodes.py"; filename*=UTF-8\'\'nodes.py'
        assert response.content_type == "text/x-python"

    def test_download_file_wrong_cs(self, backend):
        raw_id = u'ERRORce30c96924232dffcd24178a07ffeb5dfc'

        response = self.app.get(
            route_path('repo_file_download',
                       repo_name=backend.repo_name,
                       commit_id=raw_id, f_path='vcs/nodes.svg'),
            status=404)

        msg = """No such commit exists for this repository"""
        response.mustcontain(msg)

    def test_download_file_wrong_f_path(self, backend):
        commit = backend.repo.get_commit(commit_idx=173)
        f_path = 'vcs/ERRORnodes.py'

        response = self.app.get(
            route_path('repo_file_download',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path=f_path),
            status=404)

        msg = (
            "There is no file nor directory at the given path: "
            "`%s` at commit %s" % (f_path, commit.short_id))
        response.mustcontain(msg)

    def test_file_raw(self, backend):
        commit = backend.repo.get_commit(commit_idx=173)
        response = self.app.get(
            route_path('repo_file_raw',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path='vcs/nodes.py'),)

        assert response.content_type == "text/plain"

    def test_file_raw_binary(self, backend):
        commit = backend.repo.get_commit()
        response = self.app.get(
            route_path('repo_file_raw',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id,
                       f_path='docs/theme/ADC/static/breadcrumb_background.png'),)

        assert response.content_disposition == 'inline'

    def test_raw_file_wrong_cs(self, backend):
        raw_id = u'ERRORcce30c96924232dffcd24178a07ffeb5dfc'

        response = self.app.get(
            route_path('repo_file_raw',
                       repo_name=backend.repo_name,
                       commit_id=raw_id, f_path='vcs/nodes.svg'),
            status=404)

        msg = """No such commit exists for this repository"""
        response.mustcontain(msg)

    def test_raw_wrong_f_path(self, backend):
        commit = backend.repo.get_commit(commit_idx=173)
        f_path = 'vcs/ERRORnodes.py'
        response = self.app.get(
            route_path('repo_file_raw',
                       repo_name=backend.repo_name,
                       commit_id=commit.raw_id, f_path=f_path),
            status=404)

        msg = (
            "There is no file nor directory at the given path: "
            "`%s` at commit %s" % (f_path, commit.short_id))
        response.mustcontain(msg)

    def test_raw_svg_should_not_be_rendered(self, backend):
        backend.create_repo()
        backend.ensure_file("xss.svg")
        response = self.app.get(
            route_path('repo_file_raw',
                       repo_name=backend.repo_name,
                       commit_id='tip', f_path='xss.svg'),)
        # If the content type were image/svg+xml, the browser could render
        # HTML and malicious SVG content.
        assert response.content_type == "text/plain"


@pytest.mark.usefixtures("app")
class TestRepositoryArchival(object):

    def test_archival(self, backend):
        backend.enable_downloads()
        commit = backend.repo.get_commit(commit_idx=173)
        for a_type, content_type, extension in settings.ARCHIVE_SPECS:

            short = commit.short_id + extension
            fname = commit.raw_id + extension
            filename = '%s-%s' % (backend.repo_name, short)
            response = self.app.get(
                route_path('repo_archivefile',
                           repo_name=backend.repo_name,
                           fname=fname))

            assert response.status == '200 OK'
            headers = [
                ('Content-Disposition', 'attachment; filename=%s' % filename),
                ('Content-Type', '%s' % content_type),
            ]

            for header in headers:
                assert header in response.headers.items()

    def test_archival_no_hash(self, backend):
        backend.enable_downloads()
        commit = backend.repo.get_commit(commit_idx=173)
        for a_type, content_type, extension in settings.ARCHIVE_SPECS:

            short = 'plain' + extension
            fname = commit.raw_id + extension
            filename = '%s-%s' % (backend.repo_name, short)
            response = self.app.get(
                route_path('repo_archivefile',
                           repo_name=backend.repo_name,
                           fname=fname, params={'with_hash': 0}))

            assert response.status == '200 OK'
            headers = [
                ('Content-Disposition', 'attachment; filename=%s' % filename),
                ('Content-Type', '%s' % content_type),
            ]

            for header in headers:
                assert header in response.headers.items()

    @pytest.mark.parametrize('arch_ext', [
        'tar', 'rar', 'x', '..ax', '.zipz', 'tar.gz.tar'])
    def test_archival_wrong_ext(self, backend, arch_ext):
        backend.enable_downloads()
        commit = backend.repo.get_commit(commit_idx=173)

        fname = commit.raw_id + '.' + arch_ext

        response = self.app.get(
            route_path('repo_archivefile',
                       repo_name=backend.repo_name,
                       fname=fname))
        response.mustcontain(
            'Unknown archive type for: `{}`'.format(fname))

    @pytest.mark.parametrize('commit_id', [
        '00x000000', 'tar', 'wrong', '@$@$42413232', '232dffcd'])
    def test_archival_wrong_commit_id(self, backend, commit_id):
        backend.enable_downloads()
        fname = '%s.zip' % commit_id

        response = self.app.get(
            route_path('repo_archivefile',
                       repo_name=backend.repo_name,
                       fname=fname))
        response.mustcontain('Unknown commit_id')


@pytest.mark.usefixtures("app")
class TestFilesDiff(object):

    @pytest.mark.parametrize("diff", ['diff', 'download', 'raw'])
    def test_file_full_diff(self, backend, diff):
        commit1 = backend.repo.get_commit(commit_idx=-1)
        commit2 = backend.repo.get_commit(commit_idx=-2)

        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend.repo_name,
                       f_path='README'),
            params={
                'diff1': commit2.raw_id,
                'diff2': commit1.raw_id,
                'fulldiff': '1',
                'diff': diff,
            })

        if diff == 'diff':
            # follow the redirect, since this is the OLD view redirecting
            # to the compare page
            response = response.follow()

        # It's a symlink to README.rst
        response.mustcontain('README.rst')
        response.mustcontain('No newline at end of file')

    def test_file_binary_diff(self, backend):
        commits = [
            {'message': 'First commit'},
            {'message': 'Commit with binary',
             'added': [nodes.FileNode('file.bin', content='\0BINARY\0')]},
        ]
        repo = backend.create_repo(commits=commits)

        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend.repo_name,
                       f_path='file.bin'),
            params={
                'diff1': repo.get_commit(commit_idx=0).raw_id,
                'diff2': repo.get_commit(commit_idx=1).raw_id,
                'fulldiff': '1',
                'diff': 'diff',
            })
        # follow the redirect, since this is the OLD view redirecting
        # to the compare page
        response = response.follow()
        response.mustcontain('Collapse 1 commit')
        file_changes = (1, 0, 0)

        compare_page = ComparePage(response)
        compare_page.contains_change_summary(*file_changes)

        if backend.alias == 'svn':
            response.mustcontain('new file 10644')
            # TODO(marcink): SVN doesn't yet detect binary changes
        else:
            response.mustcontain('new file 100644')
            response.mustcontain('binary diff hidden')

    def test_diff_2way(self, backend):
        commit1 = backend.repo.get_commit(commit_idx=-1)
        commit2 = backend.repo.get_commit(commit_idx=-2)
        response = self.app.get(
            route_path('repo_files_diff_2way_redirect',
                       repo_name=backend.repo_name,
                       f_path='README'),
            params={
                'diff1': commit2.raw_id,
                'diff2': commit1.raw_id,
            })
        # follow the redirect, since this is the OLD view redirecting
        # to the compare page
        response = response.follow()

        # It's a symlink to README.rst
        response.mustcontain('README.rst')
        response.mustcontain('No newline at end of file')

    def test_requires_one_commit_id(self, backend, autologin_user):
        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend.repo_name,
                       f_path='README.rst'),
            status=400)
        response.mustcontain(
            'Need query parameter', 'diff1', 'diff2', 'to generate a diff.')

    def test_returns_no_files_if_file_does_not_exist(self, vcsbackend):
        repo = vcsbackend.repo
        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=repo.name,
                       f_path='does-not-exist-in-any-commit'),
            params={
                'diff1': repo[0].raw_id,
                'diff2': repo[1].raw_id
            })

        response = response.follow()
        response.mustcontain('No files')

    def test_returns_redirect_if_file_not_changed(self, backend):
        commit = backend.repo.get_commit(commit_idx=-1)
        response = self.app.get(
            route_path('repo_files_diff_2way_redirect',
                       repo_name=backend.repo_name,
                       f_path='README'),
            params={
                'diff1': commit.raw_id,
                'diff2': commit.raw_id,
            })

        response = response.follow()
        response.mustcontain('No files')
        response.mustcontain('No commits in this compare')

    def test_supports_diff_to_different_path_svn(self, backend_svn):
        # TODO: check this case
        return

        repo = backend_svn['svn-simple-layout'].scm_instance()
        commit_id_1 = '24'
        commit_id_2 = '26'

        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend_svn.repo_name,
                       f_path='trunk/example.py'),
            params={
                'diff1': 'tags/v0.2/example.py@' + commit_id_1,
                'diff2': commit_id_2,
            })

        response = response.follow()
        response.mustcontain(
            # the diff contains this line
            "Will print out a useful message on invocation.")

        # Note: expecting that we indicate to the user what's being compared
        response.mustcontain("trunk/example.py")
        response.mustcontain("tags/v0.2/example.py")

    def test_show_rev_redirects_to_svn_path(self, backend_svn):
        # TODO: check this case
        return

        repo = backend_svn['svn-simple-layout'].scm_instance()
        commit_id = repo[-1].raw_id

        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend_svn.repo_name,
                       f_path='trunk/example.py'),
            params={
                'diff1': 'branches/argparse/example.py@' + commit_id,
                'diff2': commit_id,
            },
            status=302)
        response = response.follow()
        assert response.headers['Location'].endswith(
            'svn-svn-simple-layout/files/26/branches/argparse/example.py')

    def test_show_rev_and_annotate_redirects_to_svn_path(self, backend_svn):
        # TODO: check this case
        return

        repo = backend_svn['svn-simple-layout'].scm_instance()
        commit_id = repo[-1].raw_id
        response = self.app.get(
            route_path('repo_files_diff',
                       repo_name=backend_svn.repo_name,
                       f_path='trunk/example.py'),
            params={
                'diff1': 'branches/argparse/example.py@' + commit_id,
                'diff2': commit_id,
                'show_rev': 'Show at Revision',
                'annotate': 'true',
            },
            status=302)
        response = response.follow()
        assert response.headers['Location'].endswith(
            'svn-svn-simple-layout/annotate/26/branches/argparse/example.py')


757 @pytest.mark.usefixtures("app", "autologin_user")
779 @pytest.mark.usefixtures("app", "autologin_user")
758 class TestModifyFilesWithWebInterface(object):
780 class TestModifyFilesWithWebInterface(object):
759
781
760 def test_add_file_view(self, backend):
782 def test_add_file_view(self, backend):
761 self.app.get(
783 self.app.get(
762 route_path('repo_files_add_file',
784 route_path('repo_files_add_file',
763 repo_name=backend.repo_name,
785 repo_name=backend.repo_name,
764 commit_id='tip', f_path='/')
786 commit_id='tip', f_path='/')
765 )
787 )
766
788
767 @pytest.mark.xfail_backends("svn", reason="Depends on online editing")
789 @pytest.mark.xfail_backends("svn", reason="Depends on online editing")
768 def test_add_file_into_repo_missing_content(self, backend, csrf_token):
790 def test_add_file_into_repo_missing_content(self, backend, csrf_token):
769 backend.create_repo()
791 backend.create_repo()
770 filename = 'init.py'
792 filename = 'init.py'
771 response = self.app.post(
793 response = self.app.post(
772 route_path('repo_files_create_file',
794 route_path('repo_files_create_file',
773 repo_name=backend.repo_name,
795 repo_name=backend.repo_name,
774 commit_id='tip', f_path='/'),
796 commit_id='tip', f_path='/'),
775 params={
797 params={
776 'content': "",
798 'content': "",
777 'filename': filename,
799 'filename': filename,
778 'csrf_token': csrf_token,
800 'csrf_token': csrf_token,
779 },
801 },
780 status=302)
802 status=302)
781 expected_msg = 'Successfully committed new file `{}`'.format(os.path.join(filename))
803 expected_msg = 'Successfully committed new file `{}`'.format(os.path.join(filename))
782 assert_session_flash(response, expected_msg)
804 assert_session_flash(response, expected_msg)
783
805
784 def test_add_file_into_repo_missing_filename(self, backend, csrf_token):
806 def test_add_file_into_repo_missing_filename(self, backend, csrf_token):
785 commit_id = backend.repo.get_commit().raw_id
807 commit_id = backend.repo.get_commit().raw_id
786 response = self.app.post(
808 response = self.app.post(
787 route_path('repo_files_create_file',
809 route_path('repo_files_create_file',
788 repo_name=backend.repo_name,
810 repo_name=backend.repo_name,
789 commit_id=commit_id, f_path='/'),
811 commit_id=commit_id, f_path='/'),
790 params={
812 params={
791 'content': "foo",
813 'content': "foo",
792 'csrf_token': csrf_token,
814 'csrf_token': csrf_token,
793 },
815 },
794 status=302)
816 status=302)
795
817
796 assert_session_flash(response, 'No filename specified')
818 assert_session_flash(response, 'No filename specified')
797
819
798 def test_add_file_into_repo_errors_and_no_commits(
820 def test_add_file_into_repo_errors_and_no_commits(
799 self, backend, csrf_token):
821 self, backend, csrf_token):
800 repo = backend.create_repo()
822 repo = backend.create_repo()
801 # Create a file with no filename, it will display an error but
823 # Create a file with no filename, it will display an error but
802 # the repo has no commits yet
824 # the repo has no commits yet
803 response = self.app.post(
825 response = self.app.post(
804 route_path('repo_files_create_file',
826 route_path('repo_files_create_file',
805 repo_name=repo.repo_name,
827 repo_name=repo.repo_name,
806 commit_id='tip', f_path='/'),
828 commit_id='tip', f_path='/'),
807 params={
829 params={
808 'content': "foo",
830 'content': "foo",
809 'csrf_token': csrf_token,
831 'csrf_token': csrf_token,
810 },
832 },
811 status=302)
833 status=302)
812
834
813 assert_session_flash(response, 'No filename specified')
835 assert_session_flash(response, 'No filename specified')
814
836
815 # Not allowed, redirect to the summary
837 # Not allowed, redirect to the summary
816 redirected = response.follow()
838 redirected = response.follow()
817 summary_url = h.route_path('repo_summary', repo_name=repo.repo_name)
839 summary_url = h.route_path('repo_summary', repo_name=repo.repo_name)
818
840
819 # As there are no commits, displays the summary page with the error of
841 # As there are no commits, displays the summary page with the error of
820 # creating a file with no filename
842 # creating a file with no filename
821
843
822 assert redirected.request.path == summary_url
844 assert redirected.request.path == summary_url
823
845
824 @pytest.mark.parametrize("filename, clean_filename", [
846 @pytest.mark.parametrize("filename, clean_filename", [
825 ('/abs/foo', 'abs/foo'),
847 ('/abs/foo', 'abs/foo'),
826 ('../rel/foo', 'rel/foo'),
848 ('../rel/foo', 'rel/foo'),
827 ('file/../foo/foo', 'file/foo/foo'),
849 ('file/../foo/foo', 'file/foo/foo'),
828 ])
850 ])
829 def test_add_file_into_repo_bad_filenames(self, filename, clean_filename, backend, csrf_token):
851 def test_add_file_into_repo_bad_filenames(self, filename, clean_filename, backend, csrf_token):
830 repo = backend.create_repo()
852 repo = backend.create_repo()
831 commit_id = repo.get_commit().raw_id
853 commit_id = repo.get_commit().raw_id
832
854
833 response = self.app.post(
855 response = self.app.post(
834 route_path('repo_files_create_file',
856 route_path('repo_files_create_file',
835 repo_name=repo.repo_name,
857 repo_name=repo.repo_name,
836 commit_id=commit_id, f_path='/'),
858 commit_id=commit_id, f_path='/'),
837 params={
859 params={
838 'content': "foo",
860 'content': "foo",
839 'filename': filename,
861 'filename': filename,
840 'csrf_token': csrf_token,
862 'csrf_token': csrf_token,
841 },
863 },
842 status=302)
864 status=302)
843
865
844 expected_msg = 'Successfully committed new file `{}`'.format(clean_filename)
866 expected_msg = 'Successfully committed new file `{}`'.format(clean_filename)
845 assert_session_flash(response, expected_msg)
867 assert_session_flash(response, expected_msg)
846
868
847 @pytest.mark.parametrize("cnt, filename, content", [
869 @pytest.mark.parametrize("cnt, filename, content", [
848 (1, 'foo.txt', "Content"),
870 (1, 'foo.txt', "Content"),
849 (2, 'dir/foo.rst', "Content"),
871 (2, 'dir/foo.rst', "Content"),
850 (3, 'dir/foo-second.rst', "Content"),
872 (3, 'dir/foo-second.rst', "Content"),
851 (4, 'rel/dir/foo.bar', "Content"),
873 (4, 'rel/dir/foo.bar', "Content"),
852 ])
874 ])
853 def test_add_file_into_empty_repo(self, cnt, filename, content, backend, csrf_token):
875 def test_add_file_into_empty_repo(self, cnt, filename, content, backend, csrf_token):
854 repo = backend.create_repo()
876 repo = backend.create_repo()
855 commit_id = repo.get_commit().raw_id
877 commit_id = repo.get_commit().raw_id
856 response = self.app.post(
878 response = self.app.post(
857 route_path('repo_files_create_file',
879 route_path('repo_files_create_file',
858 repo_name=repo.repo_name,
880 repo_name=repo.repo_name,
859 commit_id=commit_id, f_path='/'),
881 commit_id=commit_id, f_path='/'),
860 params={
882 params={
861 'content': content,
883 'content': content,
862 'filename': filename,
884 'filename': filename,
863 'csrf_token': csrf_token,
885 'csrf_token': csrf_token,
864 },
886 },
865 status=302)
887 status=302)
866
888
867 expected_msg = 'Successfully committed new file `{}`'.format(filename)
889 expected_msg = 'Successfully committed new file `{}`'.format(filename)
868 assert_session_flash(response, expected_msg)
890 assert_session_flash(response, expected_msg)
869
891
870 def test_edit_file_view(self, backend):
892 def test_edit_file_view(self, backend):
871 response = self.app.get(
893 response = self.app.get(
872 route_path('repo_files_edit_file',
894 route_path('repo_files_edit_file',
873 repo_name=backend.repo_name,
895 repo_name=backend.repo_name,
874 commit_id=backend.default_head_id,
896 commit_id=backend.default_head_id,
875 f_path='vcs/nodes.py'),
897 f_path='vcs/nodes.py'),
876 status=200)
898 status=200)
877 response.mustcontain("Module holding everything related to vcs nodes.")
899 response.mustcontain("Module holding everything related to vcs nodes.")
878
900
879 def test_edit_file_view_not_on_branch(self, backend):
901 def test_edit_file_view_not_on_branch(self, backend):
880 repo = backend.create_repo()
902 repo = backend.create_repo()
881 backend.ensure_file("vcs/nodes.py")
903 backend.ensure_file("vcs/nodes.py")
882
904
883 response = self.app.get(
905 response = self.app.get(
884 route_path('repo_files_edit_file',
906 route_path('repo_files_edit_file',
885 repo_name=repo.repo_name,
907 repo_name=repo.repo_name,
886 commit_id='tip',
908 commit_id='tip',
887 f_path='vcs/nodes.py'),
909 f_path='vcs/nodes.py'),
888 status=302)
910 status=302)
889 assert_session_flash(
911 assert_session_flash(
890 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
912 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
891
913
892 def test_edit_file_view_commit_changes(self, backend, csrf_token):
914 def test_edit_file_view_commit_changes(self, backend, csrf_token):
893 repo = backend.create_repo()
915 repo = backend.create_repo()
894 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
916 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
895
917
896 response = self.app.post(
918 response = self.app.post(
897 route_path('repo_files_update_file',
919 route_path('repo_files_update_file',
898 repo_name=repo.repo_name,
920 repo_name=repo.repo_name,
899 commit_id=backend.default_head_id,
921 commit_id=backend.default_head_id,
900 f_path='vcs/nodes.py'),
922 f_path='vcs/nodes.py'),
901 params={
923 params={
902 'content': "print 'hello world'",
924 'content': "print 'hello world'",
903 'message': 'I committed',
925 'message': 'I committed',
904 'filename': "vcs/nodes.py",
926 'filename': "vcs/nodes.py",
905 'csrf_token': csrf_token,
927 'csrf_token': csrf_token,
906 },
928 },
907 status=302)
929 status=302)
908 assert_session_flash(
930 assert_session_flash(
909 response, 'Successfully committed changes to file `vcs/nodes.py`')
931 response, 'Successfully committed changes to file `vcs/nodes.py`')
910 tip = repo.get_commit(commit_idx=-1)
932 tip = repo.get_commit(commit_idx=-1)
911 assert tip.message == 'I committed'
933 assert tip.message == 'I committed'
912
934
913 def test_edit_file_view_commit_changes_default_message(self, backend,
935 def test_edit_file_view_commit_changes_default_message(self, backend,
914 csrf_token):
936 csrf_token):
915 repo = backend.create_repo()
937 repo = backend.create_repo()
916 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
938 backend.ensure_file("vcs/nodes.py", content="print 'hello'")
917
939
918 commit_id = (
940 commit_id = (
919 backend.default_branch_name or
941 backend.default_branch_name or
920 backend.repo.scm_instance().commit_ids[-1])
942 backend.repo.scm_instance().commit_ids[-1])
921
943
922 response = self.app.post(
944 response = self.app.post(
923 route_path('repo_files_update_file',
945 route_path('repo_files_update_file',
924 repo_name=repo.repo_name,
946 repo_name=repo.repo_name,
925 commit_id=commit_id,
947 commit_id=commit_id,
926 f_path='vcs/nodes.py'),
948 f_path='vcs/nodes.py'),
927 params={
949 params={
928 'content': "print 'hello world'",
950 'content': "print 'hello world'",
929 'message': '',
951 'message': '',
930 'filename': "vcs/nodes.py",
952 'filename': "vcs/nodes.py",
931 'csrf_token': csrf_token,
953 'csrf_token': csrf_token,
932 },
954 },
933 status=302)
955 status=302)
934 assert_session_flash(
956 assert_session_flash(
935 response, 'Successfully committed changes to file `vcs/nodes.py`')
957 response, 'Successfully committed changes to file `vcs/nodes.py`')
936 tip = repo.get_commit(commit_idx=-1)
958 tip = repo.get_commit(commit_idx=-1)
937 assert tip.message == 'Edited file vcs/nodes.py via RhodeCode Enterprise'
959 assert tip.message == 'Edited file vcs/nodes.py via RhodeCode Enterprise'
938
960
939 def test_delete_file_view(self, backend):
961 def test_delete_file_view(self, backend):
940 self.app.get(
962 self.app.get(
941 route_path('repo_files_remove_file',
963 route_path('repo_files_remove_file',
942 repo_name=backend.repo_name,
964 repo_name=backend.repo_name,
943 commit_id=backend.default_head_id,
965 commit_id=backend.default_head_id,
944 f_path='vcs/nodes.py'),
966 f_path='vcs/nodes.py'),
945 status=200)
967 status=200)
946
968
947 def test_delete_file_view_not_on_branch(self, backend):
969 def test_delete_file_view_not_on_branch(self, backend):
948 repo = backend.create_repo()
970 repo = backend.create_repo()
949 backend.ensure_file('vcs/nodes.py')
971 backend.ensure_file('vcs/nodes.py')
950
972
951 response = self.app.get(
973 response = self.app.get(
952 route_path('repo_files_remove_file',
974 route_path('repo_files_remove_file',
953 repo_name=repo.repo_name,
975 repo_name=repo.repo_name,
954 commit_id='tip',
976 commit_id='tip',
955 f_path='vcs/nodes.py'),
977 f_path='vcs/nodes.py'),
956 status=302)
978 status=302)
957 assert_session_flash(
979 assert_session_flash(
958 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
980 response, 'Cannot modify file. Given commit `tip` is not head of a branch.')
959
981
960 def test_delete_file_view_commit_changes(self, backend, csrf_token):
982 def test_delete_file_view_commit_changes(self, backend, csrf_token):
961 repo = backend.create_repo()
983 repo = backend.create_repo()
962 backend.ensure_file("vcs/nodes.py")
984 backend.ensure_file("vcs/nodes.py")
963
985
964 response = self.app.post(
986 response = self.app.post(
965 route_path('repo_files_delete_file',
987 route_path('repo_files_delete_file',
966 repo_name=repo.repo_name,
988 repo_name=repo.repo_name,
967 commit_id=backend.default_head_id,
989 commit_id=backend.default_head_id,
968 f_path='vcs/nodes.py'),
990 f_path='vcs/nodes.py'),
969 params={
991 params={
970 'message': 'i commited',
992 'message': 'i commited',
971 'csrf_token': csrf_token,
993 'csrf_token': csrf_token,
972 },
994 },
973 status=302)
995 status=302)
974 assert_session_flash(
996 assert_session_flash(
975 response, 'Successfully deleted file `vcs/nodes.py`')
997 response, 'Successfully deleted file `vcs/nodes.py`')
976
998
977
999
978 @pytest.mark.usefixtures("app")
1000 @pytest.mark.usefixtures("app")
979 class TestFilesViewOtherCases(object):
1001 class TestFilesViewOtherCases(object):
980
1002
981 def test_access_empty_repo_redirect_to_summary_with_alert_write_perms(
1003 def test_access_empty_repo_redirect_to_summary_with_alert_write_perms(
982 self, backend_stub, autologin_regular_user, user_regular,
1004 self, backend_stub, autologin_regular_user, user_regular,
983 user_util):
1005 user_util):
984
1006
985 repo = backend_stub.create_repo()
1007 repo = backend_stub.create_repo()
986 user_util.grant_user_permission_to_repo(
1008 user_util.grant_user_permission_to_repo(
987 repo, user_regular, 'repository.write')
1009 repo, user_regular, 'repository.write')
988 response = self.app.get(
1010 response = self.app.get(
989 route_path('repo_files',
1011 route_path('repo_files',
990 repo_name=repo.repo_name,
1012 repo_name=repo.repo_name,
991 commit_id='tip', f_path='/'))
1013 commit_id='tip', f_path='/'))
992
1014
993 repo_file_add_url = route_path(
1015 repo_file_add_url = route_path(
994 'repo_files_add_file',
1016 'repo_files_add_file',
995 repo_name=repo.repo_name,
1017 repo_name=repo.repo_name,
996 commit_id=0, f_path='')
1018 commit_id=0, f_path='')
997
1019
998 assert_session_flash(
1020 assert_session_flash(
999 response,
1021 response,
1000 'There are no files yet. <a class="alert-link" '
1022 'There are no files yet. <a class="alert-link" '
1001 'href="{}">Click here to add a new file.</a>'
1023 'href="{}">Click here to add a new file.</a>'
1002 .format(repo_file_add_url))
1024 .format(repo_file_add_url))
1003
1025
1004 def test_access_empty_repo_redirect_to_summary_with_alert_no_write_perms(
1026 def test_access_empty_repo_redirect_to_summary_with_alert_no_write_perms(
1005 self, backend_stub, autologin_regular_user):
1027 self, backend_stub, autologin_regular_user):
1006 repo = backend_stub.create_repo()
1028 repo = backend_stub.create_repo()
1007 # init session for anon user
1029 # init session for anon user
1008 route_path('repo_summary', repo_name=repo.repo_name)
1030 route_path('repo_summary', repo_name=repo.repo_name)
1009
1031
1010 repo_file_add_url = route_path(
1032 repo_file_add_url = route_path(
1011 'repo_files_add_file',
1033 'repo_files_add_file',
1012 repo_name=repo.repo_name,
1034 repo_name=repo.repo_name,
1013 commit_id=0, f_path='')
1035 commit_id=0, f_path='')
1014
1036
1015 response = self.app.get(
1037 response = self.app.get(
1016 route_path('repo_files',
1038 route_path('repo_files',
1017 repo_name=repo.repo_name,
1039 repo_name=repo.repo_name,
1018 commit_id='tip', f_path='/'))
1040 commit_id='tip', f_path='/'))
1019
1041
1020 assert_session_flash(response, no_=repo_file_add_url)
1042 assert_session_flash(response, no_=repo_file_add_url)
1021
1043
1022 @pytest.mark.parametrize('file_node', [
1044 @pytest.mark.parametrize('file_node', [
1023 'archive/file.zip',
1045 'archive/file.zip',
1024 'diff/my-file.txt',
1046 'diff/my-file.txt',
1025 'render.py',
1047 'render.py',
1026 'render',
1048 'render',
1027 'remove_file',
1049 'remove_file',
1028 'remove_file/to-delete.txt',
1050 'remove_file/to-delete.txt',
1029 ])
1051 ])
1030 def test_file_names_equal_to_routes_parts(self, backend, file_node):
1052 def test_file_names_equal_to_routes_parts(self, backend, file_node):
1031 backend.create_repo()
1053 backend.create_repo()
1032 backend.ensure_file(file_node)
1054 backend.ensure_file(file_node)
1033
1055
1034 self.app.get(
1056 self.app.get(
1035 route_path('repo_files',
1057 route_path('repo_files',
1036 repo_name=backend.repo_name,
1058 repo_name=backend.repo_name,
1037 commit_id='tip', f_path=file_node),
1059 commit_id='tip', f_path=file_node),
1038 status=200)
1060 status=200)
1039
1061
1040
1062
1041 class TestAdjustFilePathForSvn(object):
1063 class TestAdjustFilePathForSvn(object):
1042 """
1064 """
1043 SVN specific adjustments of node history in RepoFilesView.
1065 SVN specific adjustments of node history in RepoFilesView.
1044 """
1066 """
1045
1067
1046 def test_returns_path_relative_to_matched_reference(self):
1068 def test_returns_path_relative_to_matched_reference(self):
1047 repo = self._repo(branches=['trunk'])
1069 repo = self._repo(branches=['trunk'])
1048 self.assert_file_adjustment('trunk/file', 'file', repo)
1070 self.assert_file_adjustment('trunk/file', 'file', repo)
1049
1071
1050 def test_does_not_modify_file_if_no_reference_matches(self):
1072 def test_does_not_modify_file_if_no_reference_matches(self):
1051 repo = self._repo(branches=['trunk'])
1073 repo = self._repo(branches=['trunk'])
1052 self.assert_file_adjustment('notes/file', 'notes/file', repo)
1074 self.assert_file_adjustment('notes/file', 'notes/file', repo)
1053
1075
1054 def test_does_not_adjust_partial_directory_names(self):
1076 def test_does_not_adjust_partial_directory_names(self):
1055 repo = self._repo(branches=['trun'])
1077 repo = self._repo(branches=['trun'])
1056 self.assert_file_adjustment('trunk/file', 'trunk/file', repo)
1078 self.assert_file_adjustment('trunk/file', 'trunk/file', repo)
1057
1079
1058 def test_is_robust_to_patterns_which_prefix_other_patterns(self):
1080 def test_is_robust_to_patterns_which_prefix_other_patterns(self):
1059 repo = self._repo(branches=['trunk', 'trunk/new', 'trunk/old'])
1081 repo = self._repo(branches=['trunk', 'trunk/new', 'trunk/old'])
1060 self.assert_file_adjustment('trunk/new/file', 'file', repo)
1082 self.assert_file_adjustment('trunk/new/file', 'file', repo)
1061
1083
1062 def assert_file_adjustment(self, f_path, expected, repo):
1084 def assert_file_adjustment(self, f_path, expected, repo):
1063 result = RepoFilesView.adjust_file_path_for_svn(f_path, repo)
1085 result = RepoFilesView.adjust_file_path_for_svn(f_path, repo)
1064 assert result == expected
1086 assert result == expected
1065
1087
1066 def _repo(self, branches=None):
1088 def _repo(self, branches=None):
1067 repo = mock.Mock()
1089 repo = mock.Mock()
1068 repo.branches = OrderedDict((name, '0') for name in branches or [])
1090 repo.branches = OrderedDict((name, '0') for name in branches or [])
1069 repo.tags = {}
1091 repo.tags = {}
1070 return repo
1092 return repo
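The `TestAdjustFilePathForSvn` cases above fully pin down the expected behavior of `RepoFilesView.adjust_file_path_for_svn`: strip the longest branch or tag prefix that matches whole path components, and leave the path alone otherwise. As a minimal sketch satisfying those tests (the function name and signature follow the test suite, but this body is an assumption-based illustration, not RhodeCode's actual implementation):

```python
def adjust_file_path_for_svn(f_path, repo):
    """Strip the longest matching SVN ref prefix (branch or tag) from f_path.

    Only whole path components may match: a branch 'trunk' adjusts
    'trunk/file', but a branch named 'trun' must leave it untouched.
    NOTE: illustrative sketch derived from the tests above, not the real code.
    """
    ref_names = list(repo.branches) + list(repo.tags)
    # Try the longest refs first, so 'trunk/new' wins over plain 'trunk'.
    for ref in sorted(ref_names, key=len, reverse=True):
        if f_path == ref:
            return ''
        if f_path.startswith(ref + '/'):
            return f_path[len(ref) + 1:]
    return f_path
```

Sorting by length before matching is what makes the "patterns which prefix other patterns" case robust; matching against `ref + '/'` is what prevents the partial-directory-name false positive.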
@@ -1,358 +1,358 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


import logging

from pyramid.httpexceptions import HTTPNotFound, HTTPFound

from pyramid.renderers import render
from pyramid.response import Response

from rhodecode.apps._base import RepoAppView
import rhodecode.lib.helpers as h
from rhodecode.lib.auth import (
    LoginRequired, HasRepoPermissionAnyDecorator)

from rhodecode.lib.ext_json import json
from rhodecode.lib.graphmod import _colored, _dagwalker
from rhodecode.lib.helpers import RepoPage
from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode
from rhodecode.lib.vcs.exceptions import (
    RepositoryError, CommitDoesNotExistError,
    CommitError, NodeDoesNotExistError, EmptyRepositoryError)

log = logging.getLogger(__name__)

DEFAULT_CHANGELOG_SIZE = 20


class RepoChangelogView(RepoAppView):

    def _get_commit_or_redirect(self, commit_id, redirect_after=True):
        """
        This is a safe way to get a commit. If an error occurs, it redirects
        to the tip with a proper message.

        :param commit_id: id of commit to fetch
        :param redirect_after: toggle redirection
        """
        _ = self.request.translate

        try:
            return self.rhodecode_vcs_repo.get_commit(commit_id)
        except EmptyRepositoryError:
            if not redirect_after:
                return None

            h.flash(h.literal(
                _('There are no commits yet')), category='warning')
            raise HTTPFound(
                h.route_path('repo_summary', repo_name=self.db_repo_name))

        except (CommitDoesNotExistError, LookupError):
            msg = _('No such commit exists for this repository')
            h.flash(msg, category='error')
            raise HTTPNotFound()
        except RepositoryError as e:
            h.flash(safe_str(h.escape(e)), category='error')
            raise HTTPNotFound()

    def _graph(self, repo, commits, prev_data=None, next_data=None):
        """
        Generates a DAG graph for the repo.

        :param repo: repo instance
        :param commits: list of commits
        """
        if not commits:
            return json.dumps([]), json.dumps([])

        def serialize(commit, parents=True):
            data = dict(
                raw_id=commit.raw_id,
                idx=commit.idx,
                branch=None,
            )
            if parents:
                data['parents'] = [
                    serialize(x, parents=False) for x in commit.parents]
            return data

        prev_data = prev_data or []
        next_data = next_data or []

        current = [serialize(x) for x in commits]
        commits = prev_data + current + next_data

        dag = _dagwalker(repo, commits)

        data = [[commit_id, vtx, edges, branch]
                for commit_id, vtx, edges, branch in _colored(dag)]
        return json.dumps(data), json.dumps(current)

    def _check_if_valid_branch(self, branch_name, repo_name, f_path):
        if branch_name not in self.rhodecode_vcs_repo.branches_all:
            h.flash(u'Branch {} is not found.'.format(h.escape(safe_unicode(branch_name))),
                    category='warning')
            redirect_url = h.route_path(
                'repo_commits_file', repo_name=repo_name,
                commit_id=branch_name, f_path=f_path or '')
            raise HTTPFound(redirect_url)

    def _load_changelog_data(
            self, c, collection, page, chunk_size, branch_name=None,
            dynamic=False, f_path=None, commit_id=None):

        def url_generator(page_num):
            query_params = {
                'page': page_num
            }

            if branch_name:
                query_params.update({
                    'branch': branch_name
                })

            if f_path:
                # changelog for file
                return h.route_path(
                    'repo_commits_file',
                    repo_name=c.rhodecode_db_repo.repo_name,
                    commit_id=commit_id, f_path=f_path,
                    _query=query_params)
            else:
                return h.route_path(
                    'repo_commits',
                    repo_name=c.rhodecode_db_repo.repo_name, _query=query_params)

        c.total_cs = len(collection)
        c.showing_commits = min(chunk_size, c.total_cs)
        c.pagination = RepoPage(collection, page=page, item_count=c.total_cs,
                                items_per_page=chunk_size, url_maker=url_generator)

        c.next_page = c.pagination.next_page
        c.prev_page = c.pagination.previous_page

        if dynamic:
            if self.request.GET.get('chunk') != 'next':
                c.next_page = None
            if self.request.GET.get('chunk') != 'prev':
                c.prev_page = None

        page_commit_ids = [x.raw_id for x in c.pagination]
        c.comments = c.rhodecode_db_repo.get_comments(page_commit_ids)
        c.statuses = c.rhodecode_db_repo.statuses(page_commit_ids)

    def load_default_context(self):
        c = self._get_local_tmpl_context(include_app_defaults=True)

        c.rhodecode_repo = self.rhodecode_vcs_repo

        return c

    def _get_preload_attrs(self):
        pre_load = ['author', 'branch', 'date', 'message', 'parents',
                    'obsolete', 'phase', 'hidden']
        return pre_load

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_changelog(self):
        c = self.load_default_context()

        commit_id = self.request.matchdict.get('commit_id')
        f_path = self._get_f_path(self.request.matchdict)
        show_hidden = str2bool(self.request.GET.get('evolve'))

        chunk_size = 20

        c.branch_name = branch_name = self.request.GET.get('branch') or ''
        c.book_name = book_name = self.request.GET.get('bookmark') or ''
        c.f_path = f_path
        c.commit_id = commit_id
        c.show_hidden = show_hidden

        hist_limit = safe_int(self.request.GET.get('limit')) or None

        p = safe_int(self.request.GET.get('page', 1), 1)

        c.selected_name = branch_name or book_name
        if not commit_id and branch_name:
            self._check_if_valid_branch(branch_name, self.db_repo_name, f_path)

        c.changelog_for_path = f_path
        pre_load = self._get_preload_attrs()

        partial_xhr = self.request.environ.get('HTTP_X_PARTIAL_XHR')

        try:
            if f_path:
                log.debug('generating changelog for path %s', f_path)
                # get the history for the file
                base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)

                try:
                    collection = base_commit.get_path_history(
                        f_path, limit=hist_limit, pre_load=pre_load)
                    if collection and partial_xhr:
217 # for ajax call we remove first one since we're looking
217 # for ajax call we remove first one since we're looking
218 # at it right now in the context of a file commit
218 # at it right now in the context of a file commit
219 collection.pop(0)
219 collection.pop(0)
220 except (NodeDoesNotExistError, CommitError):
220 except (NodeDoesNotExistError, CommitError):
221 # this node is not present at tip!
221 # this node is not present at tip!
222 try:
222 try:
223 commit = self._get_commit_or_redirect(commit_id)
223 commit = self._get_commit_or_redirect(commit_id)
224 collection = commit.get_path_history(f_path)
224 collection = commit.get_path_history(f_path)
225 except RepositoryError as e:
225 except RepositoryError as e:
226 h.flash(safe_str(e), category='warning')
226 h.flash(safe_str(e), category='warning')
227 redirect_url = h.route_path(
227 redirect_url = h.route_path(
228 'repo_commits', repo_name=self.db_repo_name)
228 'repo_commits', repo_name=self.db_repo_name)
229 raise HTTPFound(redirect_url)
229 raise HTTPFound(redirect_url)
230 collection = list(reversed(collection))
230 collection = list(reversed(collection))
231 else:
231 else:
232 collection = self.rhodecode_vcs_repo.get_commits(
232 collection = self.rhodecode_vcs_repo.get_commits(
233 branch_name=branch_name, show_hidden=show_hidden,
233 branch_name=branch_name, show_hidden=show_hidden,
234 pre_load=pre_load, translate_tags=False)
234 pre_load=pre_load, translate_tags=False)
235
235
236 self._load_changelog_data(
236 self._load_changelog_data(
237 c, collection, p, chunk_size, c.branch_name,
237 c, collection, p, chunk_size, c.branch_name,
238 f_path=f_path, commit_id=commit_id)
238 f_path=f_path, commit_id=commit_id)
239
239
240 except EmptyRepositoryError as e:
240 except EmptyRepositoryError as e:
241 h.flash(safe_str(h.escape(e)), category='warning')
241 h.flash(safe_str(h.escape(e)), category='warning')
242 raise HTTPFound(
242 raise HTTPFound(
243 h.route_path('repo_summary', repo_name=self.db_repo_name))
243 h.route_path('repo_summary', repo_name=self.db_repo_name))
244 except HTTPFound:
244 except HTTPFound:
245 raise
245 raise
246 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
246 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
247 log.exception(safe_str(e))
247 log.exception(safe_str(e))
248 h.flash(safe_str(h.escape(e)), category='error')
248 h.flash(safe_str(h.escape(e)), category='error')
249
249
250 if commit_id:
250 if commit_id:
251 # from single commit page, we redirect to main commits
251 # from single commit page, we redirect to main commits
252 raise HTTPFound(
252 raise HTTPFound(
253 h.route_path('repo_commits', repo_name=self.db_repo_name))
253 h.route_path('repo_commits', repo_name=self.db_repo_name))
254 else:
254 else:
255 # otherwise we redirect to summary
255 # otherwise we redirect to summary
256 raise HTTPFound(
256 raise HTTPFound(
257 h.route_path('repo_summary', repo_name=self.db_repo_name))
257 h.route_path('repo_summary', repo_name=self.db_repo_name))
258
258
259 if partial_xhr or self.request.environ.get('HTTP_X_PJAX'):
259 if partial_xhr or self.request.environ.get('HTTP_X_PJAX'):
260 # case when loading dynamic file history in file view
260 # case when loading dynamic file history in file view
261 # loading from ajax, we don't want the first result, it's popped
261 # loading from ajax, we don't want the first result, it's popped
262 # in the code above
262 # in the code above
263 html = render(
263 html = render(
264 'rhodecode:templates/commits/changelog_file_history.mako',
264 'rhodecode:templates/commits/changelog_file_history.mako',
265 self._get_template_context(c), self.request)
265 self._get_template_context(c), self.request)
266 return Response(html)
266 return Response(html)
267
267
268 commit_ids = []
268 commit_ids = []
269 if not f_path:
269 if not f_path:
270 # only load graph data when not in file history mode
270 # only load graph data when not in file history mode
271 commit_ids = c.pagination
271 commit_ids = c.pagination
272
272
273 c.graph_data, c.graph_commits = self._graph(
273 c.graph_data, c.graph_commits = self._graph(
274 self.rhodecode_vcs_repo, commit_ids)
274 self.rhodecode_vcs_repo, commit_ids)
275
275
276 return self._get_template_context(c)
276 return self._get_template_context(c)
277
277
278 @LoginRequired()
278 @LoginRequired()
279 @HasRepoPermissionAnyDecorator(
279 @HasRepoPermissionAnyDecorator(
280 'repository.read', 'repository.write', 'repository.admin')
280 'repository.read', 'repository.write', 'repository.admin')
281 def repo_commits_elements(self):
281 def repo_commits_elements(self):
282 c = self.load_default_context()
282 c = self.load_default_context()
283 commit_id = self.request.matchdict.get('commit_id')
283 commit_id = self.request.matchdict.get('commit_id')
284 f_path = self._get_f_path(self.request.matchdict)
284 f_path = self._get_f_path(self.request.matchdict)
285 show_hidden = str2bool(self.request.GET.get('evolve'))
285 show_hidden = str2bool(self.request.GET.get('evolve'))
286
286
287 chunk_size = 20
287 chunk_size = 20
288 hist_limit = safe_int(self.request.GET.get('limit')) or None
288 hist_limit = safe_int(self.request.GET.get('limit')) or None
289
289
290 def wrap_for_error(err):
290 def wrap_for_error(err):
291 html = '<tr>' \
291 html = '<tr>' \
292 '<td colspan="9" class="alert alert-error">ERROR: {}</td>' \
292 '<td colspan="9" class="alert alert-error">ERROR: {}</td>' \
293 '</tr>'.format(err)
293 '</tr>'.format(err)
294 return Response(html)
294 return Response(html)
295
295
296 c.branch_name = branch_name = self.request.GET.get('branch') or ''
296 c.branch_name = branch_name = self.request.GET.get('branch') or ''
297 c.book_name = book_name = self.request.GET.get('bookmark') or ''
297 c.book_name = book_name = self.request.GET.get('bookmark') or ''
298 c.f_path = f_path
298 c.f_path = f_path
299 c.commit_id = commit_id
299 c.commit_id = commit_id
300 c.show_hidden = show_hidden
300 c.show_hidden = show_hidden
301
301
302 c.selected_name = branch_name or book_name
302 c.selected_name = branch_name or book_name
303 if branch_name and branch_name not in self.rhodecode_vcs_repo.branches_all:
303 if branch_name and branch_name not in self.rhodecode_vcs_repo.branches_all:
304 return wrap_for_error(
304 return wrap_for_error(
305 safe_str('Branch: {} is not valid'.format(branch_name)))
305 safe_str('Branch: {} is not valid'.format(branch_name)))
306
306
307 pre_load = self._get_preload_attrs()
307 pre_load = self._get_preload_attrs()
308
308
309 if f_path:
309 if f_path:
310 try:
310 try:
311 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
311 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
312 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
312 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
313 log.exception(safe_str(e))
313 log.exception(safe_str(e))
314 raise HTTPFound(
314 raise HTTPFound(
315 h.route_path('repo_commits', repo_name=self.db_repo_name))
315 h.route_path('repo_commits', repo_name=self.db_repo_name))
316
316
317 collection = base_commit.get_path_history(
317 collection = base_commit.get_path_history(
318 f_path, limit=hist_limit, pre_load=pre_load)
318 f_path, limit=hist_limit, pre_load=pre_load)
319 collection = list(reversed(collection))
319 collection = list(reversed(collection))
320 else:
320 else:
321 collection = self.rhodecode_vcs_repo.get_commits(
321 collection = self.rhodecode_vcs_repo.get_commits(
322 branch_name=branch_name, show_hidden=show_hidden, pre_load=pre_load,
322 branch_name=branch_name, show_hidden=show_hidden, pre_load=pre_load,
323 translate_tags=False)
323 translate_tags=False)
324
324
325 p = safe_int(self.request.GET.get('page', 1), 1)
325 p = safe_int(self.request.GET.get('page', 1), 1)
326 try:
326 try:
327 self._load_changelog_data(
327 self._load_changelog_data(
328 c, collection, p, chunk_size, dynamic=True,
328 c, collection, p, chunk_size, dynamic=True,
329 f_path=f_path, commit_id=commit_id)
329 f_path=f_path, commit_id=commit_id)
330 except EmptyRepositoryError as e:
330 except EmptyRepositoryError as e:
331 return wrap_for_error(safe_str(e))
331 return wrap_for_error(safe_str(e))
332 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
332 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
333 log.exception('Failed to fetch commits')
333 log.exception('Failed to fetch commits')
334 return wrap_for_error(safe_str(e))
334 return wrap_for_error(safe_str(e))
335
335
336 prev_data = None
336 prev_data = None
337 next_data = None
337 next_data = None
338
338
339 try:
339 try:
340 prev_graph = json.loads(self.request.POST.get('graph') or '{}')
340 prev_graph = json.loads(self.request.POST.get('graph') or '{}')
341 except json.JSONDecodeError:
341 except json.JSONDecodeError:
342 prev_graph = {}
342 prev_graph = {}
343
343
344 if self.request.GET.get('chunk') == 'prev':
344 if self.request.GET.get('chunk') == 'prev':
345 next_data = prev_graph
345 next_data = prev_graph
346 elif self.request.GET.get('chunk') == 'next':
346 elif self.request.GET.get('chunk') == 'next':
347 prev_data = prev_graph
347 prev_data = prev_graph
348
348
349 commit_ids = []
349 commit_ids = []
350 if not f_path:
350 if not f_path:
351 # only load graph data when not in file history mode
351 # only load graph data when not in file history mode
352 commit_ids = c.pagination
352 commit_ids = c.pagination
353
353
354 c.graph_data, c.graph_commits = self._graph(
354 c.graph_data, c.graph_commits = self._graph(
355 self.rhodecode_vcs_repo, commit_ids,
355 self.rhodecode_vcs_repo, commit_ids,
356 prev_data=prev_data, next_data=next_data)
356 prev_data=prev_data, next_data=next_data)
357
357
358 return self._get_template_context(c)
358 return self._get_template_context(c)
@@ -1,809 +1,813 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2010-2020 RhodeCode GmbH
3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import logging
21 import logging
22 import collections
22 import collections
23
23
24 from pyramid.httpexceptions import (
24 from pyramid.httpexceptions import (
25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
26 from pyramid.renderers import render
26 from pyramid.renderers import render
27 from pyramid.response import Response
27 from pyramid.response import Response
28
28
29 from rhodecode.apps._base import RepoAppView
29 from rhodecode.apps._base import RepoAppView
30 from rhodecode.apps.file_store import utils as store_utils
30 from rhodecode.apps.file_store import utils as store_utils
31 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
31 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
32
32
33 from rhodecode.lib import diffs, codeblocks, channelstream
33 from rhodecode.lib import diffs, codeblocks, channelstream
34 from rhodecode.lib.auth import (
34 from rhodecode.lib.auth import (
35 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
35 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
36 from rhodecode.lib.ext_json import json
36 from rhodecode.lib.ext_json import json
37 from rhodecode.lib.compat import OrderedDict
37 from rhodecode.lib.compat import OrderedDict
38 from rhodecode.lib.diffs import (
38 from rhodecode.lib.diffs import (
39 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
39 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
40 get_diff_whitespace_flag)
40 get_diff_whitespace_flag)
41 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
41 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
42 import rhodecode.lib.helpers as h
42 import rhodecode.lib.helpers as h
43 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict, safe_str
43 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict, safe_str
44 from rhodecode.lib.vcs.backends.base import EmptyCommit
44 from rhodecode.lib.vcs.backends.base import EmptyCommit
45 from rhodecode.lib.vcs.exceptions import (
45 from rhodecode.lib.vcs.exceptions import (
46 RepositoryError, CommitDoesNotExistError)
46 RepositoryError, CommitDoesNotExistError)
47 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
47 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
48 ChangesetCommentHistory
48 ChangesetCommentHistory
49 from rhodecode.model.changeset_status import ChangesetStatusModel
49 from rhodecode.model.changeset_status import ChangesetStatusModel
50 from rhodecode.model.comment import CommentsModel
50 from rhodecode.model.comment import CommentsModel
51 from rhodecode.model.meta import Session
51 from rhodecode.model.meta import Session
52 from rhodecode.model.settings import VcsSettingsModel
52 from rhodecode.model.settings import VcsSettingsModel
53
53
54 log = logging.getLogger(__name__)
54 log = logging.getLogger(__name__)
55
55
56
56
57 def _update_with_GET(params, request):
57 def _update_with_GET(params, request):
58 for k in ['diff1', 'diff2', 'diff']:
58 for k in ['diff1', 'diff2', 'diff']:
59 params[k] += request.GET.getall(k)
59 params[k] += request.GET.getall(k)
60
60
61
61
62 class RepoCommitsView(RepoAppView):
62 class RepoCommitsView(RepoAppView):
63 def load_default_context(self):
63 def load_default_context(self):
64 c = self._get_local_tmpl_context(include_app_defaults=True)
64 c = self._get_local_tmpl_context(include_app_defaults=True)
65 c.rhodecode_repo = self.rhodecode_vcs_repo
65 c.rhodecode_repo = self.rhodecode_vcs_repo
66
66
67 return c
67 return c
68
68
69 def _is_diff_cache_enabled(self, target_repo):
69 def _is_diff_cache_enabled(self, target_repo):
70 caching_enabled = self._get_general_setting(
70 caching_enabled = self._get_general_setting(
71 target_repo, 'rhodecode_diff_cache')
71 target_repo, 'rhodecode_diff_cache')
72 log.debug('Diff caching enabled: %s', caching_enabled)
72 log.debug('Diff caching enabled: %s', caching_enabled)
73 return caching_enabled
73 return caching_enabled
74
74
75 def _commit(self, commit_id_range, method):
75 def _commit(self, commit_id_range, method):
76 _ = self.request.translate
76 _ = self.request.translate
77 c = self.load_default_context()
77 c = self.load_default_context()
78 c.fulldiff = self.request.GET.get('fulldiff')
78 c.fulldiff = self.request.GET.get('fulldiff')
79 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
79 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
80
80
81 # fetch global flags of ignore ws or context lines
81 # fetch global flags of ignore ws or context lines
82 diff_context = get_diff_context(self.request)
82 diff_context = get_diff_context(self.request)
83 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
83 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
84
84
85 # diff_limit will cut off the whole diff if the limit is applied
85 # diff_limit will cut off the whole diff if the limit is applied
86 # otherwise it will just hide the big files from the front-end
86 # otherwise it will just hide the big files from the front-end
87 diff_limit = c.visual.cut_off_limit_diff
87 diff_limit = c.visual.cut_off_limit_diff
88 file_limit = c.visual.cut_off_limit_file
88 file_limit = c.visual.cut_off_limit_file
89
89
90 # get ranges of commit ids if preset
90 # get ranges of commit ids if preset
91 commit_range = commit_id_range.split('...')[:2]
91 commit_range = commit_id_range.split('...')[:2]
92
92
93 try:
93 try:
94 pre_load = ['affected_files', 'author', 'branch', 'date',
94 pre_load = ['affected_files', 'author', 'branch', 'date',
95 'message', 'parents']
95 'message', 'parents']
96 if self.rhodecode_vcs_repo.alias == 'hg':
96 if self.rhodecode_vcs_repo.alias == 'hg':
97 pre_load += ['hidden', 'obsolete', 'phase']
97 pre_load += ['hidden', 'obsolete', 'phase']
98
98
99 if len(commit_range) == 2:
99 if len(commit_range) == 2:
100 commits = self.rhodecode_vcs_repo.get_commits(
100 commits = self.rhodecode_vcs_repo.get_commits(
101 start_id=commit_range[0], end_id=commit_range[1],
101 start_id=commit_range[0], end_id=commit_range[1],
102 pre_load=pre_load, translate_tags=False)
102 pre_load=pre_load, translate_tags=False)
103 commits = list(commits)
103 commits = list(commits)
104 else:
104 else:
105 commits = [self.rhodecode_vcs_repo.get_commit(
105 commits = [self.rhodecode_vcs_repo.get_commit(
106 commit_id=commit_id_range, pre_load=pre_load)]
106 commit_id=commit_id_range, pre_load=pre_load)]
107
107
108 c.commit_ranges = commits
108 c.commit_ranges = commits
109 if not c.commit_ranges:
109 if not c.commit_ranges:
110 raise RepositoryError('The commit range returned an empty result')
110 raise RepositoryError('The commit range returned an empty result')
111 except CommitDoesNotExistError as e:
111 except CommitDoesNotExistError as e:
112 msg = _('No such commit exists. Org exception: `{}`').format(safe_str(e))
112 msg = _('No such commit exists. Org exception: `{}`').format(safe_str(e))
113 h.flash(msg, category='error')
113 h.flash(msg, category='error')
114 raise HTTPNotFound()
114 raise HTTPNotFound()
115 except Exception:
115 except Exception:
116 log.exception("General failure")
116 log.exception("General failure")
117 raise HTTPNotFound()
117 raise HTTPNotFound()
118 single_commit = len(c.commit_ranges) == 1
118 single_commit = len(c.commit_ranges) == 1
119
119
120 if redirect_to_combined and not single_commit:
120 if redirect_to_combined and not single_commit:
121 source_ref = getattr(c.commit_ranges[0].parents[0]
121 source_ref = getattr(c.commit_ranges[0].parents[0]
122 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
122 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
123 target_ref = c.commit_ranges[-1].raw_id
123 target_ref = c.commit_ranges[-1].raw_id
124 next_url = h.route_path(
124 next_url = h.route_path(
125 'repo_compare',
125 'repo_compare',
126 repo_name=c.repo_name,
126 repo_name=c.repo_name,
127 source_ref_type='rev',
127 source_ref_type='rev',
128 source_ref=source_ref,
128 source_ref=source_ref,
129 target_ref_type='rev',
129 target_ref_type='rev',
130 target_ref=target_ref)
130 target_ref=target_ref)
131 raise HTTPFound(next_url)
131 raise HTTPFound(next_url)
132
132
133 c.changes = OrderedDict()
133 c.changes = OrderedDict()
134 c.lines_added = 0
134 c.lines_added = 0
135 c.lines_deleted = 0
135 c.lines_deleted = 0
136
136
137 # auto collapse if we have more than limit
137 # auto collapse if we have more than limit
138 collapse_limit = diffs.DiffProcessor._collapse_commits_over
138 collapse_limit = diffs.DiffProcessor._collapse_commits_over
139 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
139 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
140
140
141 c.commit_statuses = ChangesetStatus.STATUSES
141 c.commit_statuses = ChangesetStatus.STATUSES
142 c.inline_comments = []
142 c.inline_comments = []
143 c.files = []
143 c.files = []
144
144
145 c.comments = []
145 c.comments = []
146 c.unresolved_comments = []
146 c.unresolved_comments = []
147 c.resolved_comments = []
147 c.resolved_comments = []
148
148
149 # Single commit
149 # Single commit
150 if single_commit:
150 if single_commit:
151 commit = c.commit_ranges[0]
151 commit = c.commit_ranges[0]
152 c.comments = CommentsModel().get_comments(
152 c.comments = CommentsModel().get_comments(
153 self.db_repo.repo_id,
153 self.db_repo.repo_id,
154 revision=commit.raw_id)
154 revision=commit.raw_id)
155
155
156 # comments from PR
156 # comments from PR
157 statuses = ChangesetStatusModel().get_statuses(
157 statuses = ChangesetStatusModel().get_statuses(
158 self.db_repo.repo_id, commit.raw_id,
158 self.db_repo.repo_id, commit.raw_id,
159 with_revisions=True)
159 with_revisions=True)
160
160
161 prs = set()
161 prs = set()
162 reviewers = list()
162 reviewers = list()
163 reviewers_duplicates = set() # to not have duplicates from multiple votes
163 reviewers_duplicates = set() # to not have duplicates from multiple votes
164 for c_status in statuses:
164 for c_status in statuses:
165
165
166 # extract associated pull-requests from votes
166 # extract associated pull-requests from votes
167 if c_status.pull_request:
167 if c_status.pull_request:
168 prs.add(c_status.pull_request)
168 prs.add(c_status.pull_request)
169
169
170 # extract reviewers
170 # extract reviewers
171 _user_id = c_status.author.user_id
171 _user_id = c_status.author.user_id
172 if _user_id not in reviewers_duplicates:
172 if _user_id not in reviewers_duplicates:
173 reviewers.append(
173 reviewers.append(
174 StrictAttributeDict({
174 StrictAttributeDict({
175 'user': c_status.author,
175 'user': c_status.author,
176
176
177 # fake attributed for commit, page that we don't have
177 # fake attributed for commit, page that we don't have
178 # but we share the display with PR page
178 # but we share the display with PR page
179 'mandatory': False,
179 'mandatory': False,
180 'reasons': [],
180 'reasons': [],
181 'rule_user_group_data': lambda: None
181 'rule_user_group_data': lambda: None
182 })
182 })
183 )
183 )
184 reviewers_duplicates.add(_user_id)
184 reviewers_duplicates.add(_user_id)
185
185
186 c.reviewers_count = len(reviewers)
186 c.reviewers_count = len(reviewers)
187 c.observers_count = 0
187 c.observers_count = 0
188
188
189 # from associated statuses, check the pull requests, and
189 # from associated statuses, check the pull requests, and
190 # show comments from them
190 # show comments from them
191 for pr in prs:
191 for pr in prs:
192 c.comments.extend(pr.comments)
192 c.comments.extend(pr.comments)
193
193
194 c.unresolved_comments = CommentsModel()\
194 c.unresolved_comments = CommentsModel()\
195 .get_commit_unresolved_todos(commit.raw_id)
195 .get_commit_unresolved_todos(commit.raw_id)
196 c.resolved_comments = CommentsModel()\
196 c.resolved_comments = CommentsModel()\
197 .get_commit_resolved_todos(commit.raw_id)
197 .get_commit_resolved_todos(commit.raw_id)
198
198
199 c.inline_comments_flat = CommentsModel()\
199 c.inline_comments_flat = CommentsModel()\
200 .get_commit_inline_comments(commit.raw_id)
200 .get_commit_inline_comments(commit.raw_id)
201
201
202 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
202 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
203 statuses, reviewers)
203 statuses, reviewers)
204
204
205 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
205 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
206
206
207 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
207 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
208
208
209 for review_obj, member, reasons, mandatory, status in review_statuses:
209 for review_obj, member, reasons, mandatory, status in review_statuses:
210 member_reviewer = h.reviewer_as_json(
210 member_reviewer = h.reviewer_as_json(
211 member, reasons=reasons, mandatory=mandatory, role=None,
211 member, reasons=reasons, mandatory=mandatory, role=None,
212 user_group=None
212 user_group=None
213 )
213 )
214
214
215 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
215 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
216 member_reviewer['review_status'] = current_review_status
216 member_reviewer['review_status'] = current_review_status
217 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
217 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
218 member_reviewer['allowed_to_update'] = False
218 member_reviewer['allowed_to_update'] = False
219 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
219 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
220
220
221 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
221 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
222
222
223 # NOTE(marcink): this uses the same voting logic as in pull-requests
223 # NOTE(marcink): this uses the same voting logic as in pull-requests
224 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
224 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
225 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
225 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
226
226
227 diff = None
227 diff = None
228 # Iterate over ranges (default commit view is always one commit)
228 # Iterate over ranges (default commit view is always one commit)
229 for commit in c.commit_ranges:
229 for commit in c.commit_ranges:
230 c.changes[commit.raw_id] = []
230 c.changes[commit.raw_id] = []
231
231
232 commit2 = commit
232 commit2 = commit
233 commit1 = commit.first_parent
233 commit1 = commit.first_parent
234
234
235 if method == 'show':
235 if method == 'show':
236 inline_comments = CommentsModel().get_inline_comments(
236 inline_comments = CommentsModel().get_inline_comments(
237 self.db_repo.repo_id, revision=commit.raw_id)
237 self.db_repo.repo_id, revision=commit.raw_id)
238 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
238 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
239 inline_comments))
239 inline_comments))
240 c.inline_comments = inline_comments
240 c.inline_comments = inline_comments
241
241
242 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
242 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
243 self.db_repo)
243 self.db_repo)
244 cache_file_path = diff_cache_exist(
244 cache_file_path = diff_cache_exist(
245 cache_path, 'diff', commit.raw_id,
245 cache_path, 'diff', commit.raw_id,
246 hide_whitespace_changes, diff_context, c.fulldiff)
                    hide_whitespace_changes, diff_context, c.fulldiff)

                caching_enabled = self._is_diff_cache_enabled(self.db_repo)
                force_recache = str2bool(self.request.GET.get('force_recache'))

                cached_diff = None
                if caching_enabled:
                    cached_diff = load_cached_diff(cache_file_path)

                has_proper_diff_cache = cached_diff and cached_diff.get('diff')
                if not force_recache and has_proper_diff_cache:
                    diffset = cached_diff['diff']
                else:
                    vcs_diff = self.rhodecode_vcs_repo.get_diff(
                        commit1, commit2,
                        ignore_whitespace=hide_whitespace_changes,
                        context=diff_context)

                    diff_processor = diffs.DiffProcessor(
                        vcs_diff, format='newdiff', diff_limit=diff_limit,
                        file_limit=file_limit, show_full_diff=c.fulldiff)

                    _parsed = diff_processor.prepare()

                    diffset = codeblocks.DiffSet(
                        repo_name=self.db_repo_name,
                        source_node_getter=codeblocks.diffset_node_getter(commit1),
                        target_node_getter=codeblocks.diffset_node_getter(commit2))

                    diffset = self.path_filter.render_patchset_filtered(
                        diffset, _parsed, commit1.raw_id, commit2.raw_id)

                    # save cached diff
                    if caching_enabled:
                        cache_diff(cache_file_path, diffset, None)

                c.limited_diff = diffset.limited_diff
                c.changes[commit.raw_id] = diffset
            else:
                # TODO(marcink): no cache usage here...
                _diff = self.rhodecode_vcs_repo.get_diff(
                    commit1, commit2,
                    ignore_whitespace=hide_whitespace_changes, context=diff_context)
                diff_processor = diffs.DiffProcessor(
                    _diff, format='newdiff', diff_limit=diff_limit,
                    file_limit=file_limit, show_full_diff=c.fulldiff)
                # for downloads/raw we only need the RAW diff, nothing else
                diff = self.path_filter.get_raw_patch(diff_processor)
                c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]

        # sort comments by how they were generated
        c.comments = sorted(c.comments, key=lambda x: x.comment_id)
        c.at_version_num = None

        if len(c.commit_ranges) == 1:
            c.commit = c.commit_ranges[0]
            c.parent_tmpl = ''.join(
                '# Parent %s\n' % x.raw_id for x in c.commit.parents)

        if method == 'download':
            response = Response(diff)
            response.content_type = 'text/plain'
            response.content_disposition = (
                'attachment; filename=%s.diff' % commit_id_range[:12])
            return response
        elif method == 'patch':
            c.diff = safe_unicode(diff)
            patch = render(
                'rhodecode:templates/changeset/patch_changeset.mako',
                self._get_template_context(c), self.request)
            response = Response(patch)
            response.content_type = 'text/plain'
            return response
        elif method == 'raw':
            response = Response(diff)
            response.content_type = 'text/plain'
            return response
        elif method == 'show':
            if len(c.commit_ranges) == 1:
                html = render(
                    'rhodecode:templates/changeset/changeset.mako',
                    self._get_template_context(c), self.request)
                return Response(html)
            else:
                c.ancestor = None
                c.target_repo = self.db_repo
                html = render(
                    'rhodecode:templates/changeset/changeset_range.mako',
                    self._get_template_context(c), self.request)
                return Response(html)

        raise HTTPBadRequest()

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_commit_show(self):
        commit_id = self.request.matchdict['commit_id']
        return self._commit(commit_id, method='show')

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_commit_raw(self):
        commit_id = self.request.matchdict['commit_id']
        return self._commit(commit_id, method='raw')

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_commit_patch(self):
        commit_id = self.request.matchdict['commit_id']
        return self._commit(commit_id, method='patch')

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_commit_download(self):
        commit_id = self.request.matchdict['commit_id']
        return self._commit(commit_id, method='download')

    def _commit_comments_create(self, commit_id, comments):
        _ = self.request.translate
        data = {}
        if not comments:
            return

        commit = self.db_repo.get_commit(commit_id)

        all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
        for entry in comments:
            c = self.load_default_context()
            comment_type = entry['comment_type']
            text = entry['text']
            status = entry['status']
            is_draft = str2bool(entry['is_draft'])
            resolves_comment_id = entry['resolves_comment_id']
            f_path = entry['f_path']
            line_no = entry['line']
            target_elem_id = 'file-{}'.format(h.safeid(h.safe_unicode(f_path)))

            if status:
                text = text or (_('Status change %(transition_icon)s %(status)s')
                                % {'transition_icon': '>',
                                   'status': ChangesetStatus.get_status_lbl(status)})

            comment = CommentsModel().create(
                text=text,
                repo=self.db_repo.repo_id,
                user=self._rhodecode_db_user.user_id,
                commit_id=commit_id,
                f_path=f_path,
                line_no=line_no,
                status_change=(ChangesetStatus.get_status_lbl(status)
                               if status else None),
                status_change_type=status,
                comment_type=comment_type,
                is_draft=is_draft,
                resolves_comment_id=resolves_comment_id,
                auth_user=self._rhodecode_user,
                send_email=not is_draft,  # skip notification for draft comments
            )
            is_inline = comment.is_inline

            # get status if set!
            if status:
                # `dont_allow_on_closed_pull_request = True` means that
                # if the latest status was set from a pull request and that
                # pull request is closed, changing the status is disallowed

                try:
                    ChangesetStatusModel().set_status(
                        self.db_repo.repo_id,
                        status,
                        self._rhodecode_db_user.user_id,
                        comment,
                        revision=commit_id,
                        dont_allow_on_closed_pull_request=True
                    )
                except StatusChangeOnClosedPullRequestError:
                    msg = _('Changing the status of a commit associated with '
                            'a closed pull request is not allowed')
                    log.exception(msg)
                    h.flash(msg, category='warning')
                    raise HTTPFound(h.route_path(
                        'repo_commit', repo_name=self.db_repo_name,
                        commit_id=commit_id))

            Session().flush()
            # the refresh is required to get access to relationships
            # loaded on the comment
            Session().refresh(comment)

            # skip notifications for drafts
            if not is_draft:
                CommentsModel().trigger_commit_comment_hook(
                    self.db_repo, self._rhodecode_user, 'create',
                    data={'comment': comment, 'commit': commit})

            comment_id = comment.comment_id
            data[comment_id] = {
                'target_id': target_elem_id
            }
            Session().flush()

            c.co = comment
            c.at_version_num = 0
            c.is_new = True
            rendered_comment = render(
                'rhodecode:templates/changeset/changeset_comment_block.mako',
                self._get_template_context(c), self.request)

            data[comment_id].update(comment.get_dict())
            data[comment_id].update({'rendered_text': rendered_comment})

        # finalize, commit and redirect
        Session().commit()

        # skip channelstream for draft comments
        if not all_drafts:
            comment_broadcast_channel = channelstream.comment_channel(
                self.db_repo_name, commit_obj=commit)

            comment_data = data
            posted_comment_type = 'inline' if is_inline else 'general'
            if len(data) == 1:
                msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
            else:
                msg = _('posted {} new {} comments').format(len(data), posted_comment_type)

            channelstream.comment_channelstream_push(
                self.request, comment_broadcast_channel, self._rhodecode_user, msg,
                comment_data=comment_data)

        return data

    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_commit_comment_create(self):
        _ = self.request.translate
        commit_id = self.request.matchdict['commit_id']

        multi_commit_ids = []
        for _commit_id in self.request.POST.get('commit_ids', '').split(','):
            if _commit_id not in ['', None, EmptyCommit.raw_id]:
                if _commit_id not in multi_commit_ids:
                    multi_commit_ids.append(_commit_id)

        commit_ids = multi_commit_ids or [commit_id]

        data = []
        # multiple comments for each passed commit id
        for current_id in filter(None, commit_ids):
            comment_data = {
                'comment_type': self.request.POST.get('comment_type'),
                'text': self.request.POST.get('text'),
                'status': self.request.POST.get('changeset_status', None),
                'is_draft': self.request.POST.get('draft'),
                'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
                'close_pull_request': self.request.POST.get('close_pull_request'),
                'f_path': self.request.POST.get('f_path'),
                'line': self.request.POST.get('line'),
            }
            comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
            data.append(comment)

        return data if len(data) > 1 else data[0]
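The commit-id collection loop above splits the POSTed `commit_ids` string, drops empty entries and the empty-commit sentinel, deduplicates while preserving order, and falls back to the commit id from the URL. A small illustrative sketch of that step (the `empty_id` default is an assumption standing in for `EmptyCommit.raw_id`):

```python
def collect_commit_ids(posted, url_commit_id, empty_id='0' * 40):
    # deduplicate while preserving the order the ids were posted in
    multi = []
    for cid in posted.split(','):
        if cid not in ('', None, empty_id) and cid not in multi:
            multi.append(cid)
    # fall back to the commit id from the URL when nothing usable was posted
    return multi or [url_commit_id]
```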

    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_commit_comment_preview(self):
        # Technically a CSRF token is not needed as no state changes with
        # this call. However, as this is a POST, it is better to have it, so
        # automated tools don't flag it as a potential CSRF.
        # POST is also required because the payload could be bigger than the
        # maximum allowed by GET.

        text = self.request.POST.get('text')
        renderer = self.request.POST.get('renderer') or 'rst'
        if text:
            return h.render(text, renderer=renderer, mentions=True,
                            repo_name=self.db_repo_name)
        return ''

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_commit_comment_history_view(self):
        c = self.load_default_context()

        comment_history_id = self.request.matchdict['comment_history_id']
        comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
        is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id

        if is_repo_comment:
            c.comment_history = comment_history

            rendered_comment = render(
                'rhodecode:templates/changeset/comment_history.mako',
                self._get_template_context(c), self.request)
            return rendered_comment
        else:
            log.warning('No permissions for user %s to show comment_history_id: %s',
                        self._rhodecode_db_user, comment_history_id)
            raise HTTPNotFound()

    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_commit_comment_attachment_upload(self):
        c = self.load_default_context()
        upload_key = 'attachment'

        file_obj = self.request.POST.get(upload_key)

        if file_obj is None:
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': '{} data field is missing'.format(upload_key)}

        if not hasattr(file_obj, 'filename'):
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'filename cannot be read from the data field'}

        filename = file_obj.filename
        file_display_name = filename

        metadata = {
            'user_uploaded': {'username': self._rhodecode_user.username,
                              'user_id': self._rhodecode_user.user_id,
                              'ip': self._rhodecode_user.ip_addr}}

        # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
        allowed_extensions = [
            '.gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
            '.pptx', '.txt', '.xlsx', '.zip']
        max_file_size = 10 * 1024 * 1024  # 10MB, also validated via dropzone.js

        try:
            storage = store_utils.get_file_storage(self.request.registry.settings)
            store_uid, metadata = storage.save_file(
                file_obj.file, filename, extra_metadata=metadata,
                extensions=allowed_extensions, max_filesize=max_file_size)
        except FileNotAllowedException:
            self.request.response.status = 400
            permitted_extensions = ', '.join(allowed_extensions)
            error_msg = 'File `{}` is not allowed. ' \
                        'Only the following extensions are permitted: {}'.format(
                            filename, permitted_extensions)
            return {'store_fid': None,
                    'access_path': None,
                    'error': error_msg}
        except FileOverSizeException:
            self.request.response.status = 400
            limit_mb = h.format_byte_size_binary(max_file_size)
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'File {} exceeds the allowed limit of {}.'.format(
                        filename, limit_mb)}

        try:
            entry = FileStore.create(
                file_uid=store_uid, filename=metadata["filename"],
                file_hash=metadata["sha256"], file_size=metadata["size"],
                file_display_name=file_display_name,
                file_description=u'comment attachment `{}`'.format(safe_unicode(filename)),
                hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
                scope_repo_id=self.db_repo.repo_id
            )
            Session().add(entry)
            Session().commit()
            log.debug('Stored upload in DB as %s', entry)
        except Exception:
            log.exception('Failed to store file %s', filename)
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'File {} failed to store in DB.'.format(filename)}

        Session().commit()

        return {
            'store_fid': store_uid,
            'access_path': h.route_path(
                'download_file', fid=store_uid),
            'fqn_access_path': h.route_url(
                'download_file', fid=store_uid),
            'repo_access_path': h.route_path(
                'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
            'repo_fqn_access_path': h.route_url(
                'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
        }
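The upload handler above validates attachments in two ways, by extension whitelist and by size limit, turning storage-layer exceptions into 400 responses. A self-contained sketch of that validation step (the exception class names here are illustrative stand-ins, not the storage backend's real `FileNotAllowedException`/`FileOverSizeException`):

```python
import os


class FileNotAllowed(Exception):
    pass


class FileOverSize(Exception):
    pass


ALLOWED_EXTENSIONS = ['.gif', '.jpeg', '.jpg', '.png', '.pdf', '.txt', '.zip']
MAX_FILE_SIZE = 10 * 1024 * 1024  # 10MB


def validate_upload(filename, size):
    # reject files whose extension is not whitelisted
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise FileNotAllowed('File `{}` is not allowed'.format(filename))
    # reject files over the configured size limit
    if size > MAX_FILE_SIZE:
        raise FileOverSize('File {} exceeds the allowed limit'.format(filename))
    return True
```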

    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_commit_comment_delete(self):
        commit_id = self.request.matchdict['commit_id']
        comment_id = self.request.matchdict['comment_id']

        comment = ChangesetComment.get_or_404(comment_id)
        if not comment:
            log.debug('Comment with id:%s not found, skipping', comment_id)
            # comment was probably already deleted in another call
            return True

        if comment.immutable:
            # don't allow deleting comments that are immutable
            raise HTTPForbidden()

        is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
        super_admin = h.HasPermissionAny('hg.admin')()
        comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
        is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
        comment_repo_admin = is_repo_admin and is_repo_comment

        if comment.draft and not comment_owner:
            # draft comments may only be deleted by their owner
            raise HTTPNotFound()

        if super_admin or comment_owner or comment_repo_admin:
            CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
            Session().commit()
            return True
        else:
            log.warning('No permissions for user %s to delete comment_id: %s',
                        self._rhodecode_db_user, comment_id)
            raise HTTPNotFound()
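The deletion rules above compose three checks in a fixed order: immutable comments are never deletable, draft comments are deletable only by their owner (regardless of admin rights), and otherwise deletion is allowed for the owner, a repo admin of the comment's own repo, or a super admin. A hedged sketch of that decision, with plain dicts standing in for the ORM objects:

```python
def can_delete_comment(comment, user, is_repo_admin=False, is_super_admin=False):
    # immutable comments can never be deleted
    if comment['immutable']:
        return False
    is_owner = comment['author_id'] == user['user_id']
    # draft comments are private: only their owner may delete them,
    # which is why this check comes before the admin checks
    if comment['draft'] and not is_owner:
        return False
    return is_super_admin or is_owner or is_repo_admin
```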
685
689
686 @LoginRequired()
690 @LoginRequired()
687 @NotAnonymous()
691 @NotAnonymous()
688 @HasRepoPermissionAnyDecorator(
692 @HasRepoPermissionAnyDecorator(
689 'repository.read', 'repository.write', 'repository.admin')
693 'repository.read', 'repository.write', 'repository.admin')
690 @CSRFRequired()
694 @CSRFRequired()
691 def repo_commit_comment_edit(self):
695 def repo_commit_comment_edit(self):
692 self.load_default_context()
696 self.load_default_context()
693
697
694 commit_id = self.request.matchdict['commit_id']
698 commit_id = self.request.matchdict['commit_id']
695 comment_id = self.request.matchdict['comment_id']
699 comment_id = self.request.matchdict['comment_id']
696 comment = ChangesetComment.get_or_404(comment_id)
700 comment = ChangesetComment.get_or_404(comment_id)
697
701
698 if comment.immutable:
702 if comment.immutable:
699 # don't allow deleting comments that are immutable
703 # don't allow deleting comments that are immutable
700 raise HTTPForbidden()
704 raise HTTPForbidden()
701
705
702 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
706 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
703 super_admin = h.HasPermissionAny('hg.admin')()
707 super_admin = h.HasPermissionAny('hg.admin')()
704 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
708 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
705 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
709 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
706 comment_repo_admin = is_repo_admin and is_repo_comment
710 comment_repo_admin = is_repo_admin and is_repo_comment
707
711
708 if super_admin or comment_owner or comment_repo_admin:
712 if super_admin or comment_owner or comment_repo_admin:
709 text = self.request.POST.get('text')
713 text = self.request.POST.get('text')
710 version = self.request.POST.get('version')
714 version = self.request.POST.get('version')
711 if text == comment.text:
715 if text == comment.text:
712 log.warning(
716 log.warning(
713 'Comment(repo): '
717 'Comment(repo): '
714 'Trying to create new version '
718 'Trying to create new version '
715 'with the same comment body {}'.format(
719 'with the same comment body {}'.format(
716 comment_id,
720 comment_id,
717 )
721 )
718 )
722 )
719 raise HTTPNotFound()
723 raise HTTPNotFound()
720
724
721 if version.isdigit():
725 if version.isdigit():
722 version = int(version)
726 version = int(version)
723 else:
727 else:
724 log.warning(
728 log.warning(
725 'Comment(repo): Wrong version type {} {} '
729 'Comment(repo): Wrong version type {} {} '
726 'for comment {}'.format(
730 'for comment {}'.format(
727 version,
731 version,
728 type(version),
732 type(version),
729 comment_id,
733 comment_id,
730 )
734 )
731 )
735 )
732 raise HTTPNotFound()
736 raise HTTPNotFound()
733
737
734 try:
738 try:
735 comment_history = CommentsModel().edit(
739 comment_history = CommentsModel().edit(
736 comment_id=comment_id,
740 comment_id=comment_id,
737 text=text,
741 text=text,
738 auth_user=self._rhodecode_user,
742 auth_user=self._rhodecode_user,
739 version=version,
743 version=version,
740 )
744 )
741 except CommentVersionMismatch:
745 except CommentVersionMismatch:
742 raise HTTPConflict()
746 raise HTTPConflict()
743
747
744 if not comment_history:
748 if not comment_history:
745 raise HTTPNotFound()
749 raise HTTPNotFound()
746
750
747 if not comment.draft:
751 if not comment.draft:
748 commit = self.db_repo.get_commit(commit_id)
752 commit = self.db_repo.get_commit(commit_id)
749 CommentsModel().trigger_commit_comment_hook(
753 CommentsModel().trigger_commit_comment_hook(
750 self.db_repo, self._rhodecode_user, 'edit',
754 self.db_repo, self._rhodecode_user, 'edit',
751 data={'comment': comment, 'commit': commit})
755 data={'comment': comment, 'commit': commit})
752
756
753 Session().commit()
757 Session().commit()
754 return {
758 return {
755 'comment_history_id': comment_history.comment_history_id,
759 'comment_history_id': comment_history.comment_history_id,
756 'comment_id': comment.comment_id,
760 'comment_id': comment.comment_id,
757 'comment_version': comment_history.version,
761 'comment_version': comment_history.version,
758 'comment_author_username': comment_history.author.username,
762 'comment_author_username': comment_history.author.username,
759 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16),
763 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16),
760 'comment_created_on': h.age_component(comment_history.created_on,
764 'comment_created_on': h.age_component(comment_history.created_on,
761 time_is_local=True),
765 time_is_local=True),
762 }
766 }
763 else:
767 else:
764 log.warning('No permissions for user %s to edit comment_id: %s',
768 log.warning('No permissions for user %s to edit comment_id: %s',
765 self._rhodecode_db_user, comment_id)
769 self._rhodecode_db_user, comment_id)
766 raise HTTPNotFound()
770 raise HTTPNotFound()
767
771
768 @LoginRequired()
772 @LoginRequired()
769 @HasRepoPermissionAnyDecorator(
773 @HasRepoPermissionAnyDecorator(
770 'repository.read', 'repository.write', 'repository.admin')
774 'repository.read', 'repository.write', 'repository.admin')
771 def repo_commit_data(self):
775 def repo_commit_data(self):
772 commit_id = self.request.matchdict['commit_id']
776 commit_id = self.request.matchdict['commit_id']
773 self.load_default_context()
777 self.load_default_context()
774
778
775 try:
779 try:
776 return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
780 return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
777 except CommitDoesNotExistError as e:
781 except CommitDoesNotExistError as e:
778 return EmptyCommit(message=str(e))
782 return EmptyCommit(message=str(e))
779
783
780 @LoginRequired()
784 @LoginRequired()
781 @HasRepoPermissionAnyDecorator(
785 @HasRepoPermissionAnyDecorator(
782 'repository.read', 'repository.write', 'repository.admin')
786 'repository.read', 'repository.write', 'repository.admin')
783 def repo_commit_children(self):
787 def repo_commit_children(self):
784 commit_id = self.request.matchdict['commit_id']
788 commit_id = self.request.matchdict['commit_id']
785 self.load_default_context()
789 self.load_default_context()
786
790
787 try:
791 try:
788 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
792 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
789 children = commit.children
793 children = commit.children
790 except CommitDoesNotExistError:
794 except CommitDoesNotExistError:
791 children = []
795 children = []
792
796
793 result = {"results": children}
797 result = {"results": children}
794 return result
798 return result
795
799
796 @LoginRequired()
800 @LoginRequired()
797 @HasRepoPermissionAnyDecorator(
801 @HasRepoPermissionAnyDecorator(
798 'repository.read', 'repository.write', 'repository.admin')
802 'repository.read', 'repository.write', 'repository.admin')
799 def repo_commit_parents(self):
803 def repo_commit_parents(self):
800 commit_id = self.request.matchdict['commit_id']
804 commit_id = self.request.matchdict['commit_id']
801 self.load_default_context()
805 self.load_default_context()
802
806
803 try:
807 try:
804 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
808 commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
805 parents = commit.parents
809 parents = commit.parents
806 except CommitDoesNotExistError:
810 except CommitDoesNotExistError:
807 parents = []
811 parents = []
808 result = {"results": parents}
812 result = {"results": parents}
809 return result
813 return result
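The comment-edit branch above only accepts a `version` that is a plain digit string, converting it to `int` and otherwise rejecting the request with `HTTPNotFound`. A minimal standalone sketch of that guard (the helper name `parse_comment_version` is ours, not part of RhodeCode's API):

```python
def parse_comment_version(version):
    """Return the submitted version as int, or None when it is not
    a plain digit string (mirroring the isdigit() check above)."""
    if isinstance(version, str) and version.isdigit():
        return int(version)
    return None
```

In the view itself, a `None` result corresponds to the `log.warning(...)` plus `raise HTTPNotFound()` path.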
@@ -1,206 +1,210 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2017-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
import pytz
import logging

from pyramid.response import Response

from rhodecode.apps._base import RepoAppView
from rhodecode.lib.feedgenerator import Rss201rev2Feed, Atom1Feed
from rhodecode.lib import audit_logger
from rhodecode.lib import rc_cache
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import (
    LoginRequired, HasRepoPermissionAnyDecorator)
from rhodecode.lib.diffs import DiffProcessor, LimitedDiffContainer
from rhodecode.lib.utils2 import str2bool, safe_int, md5_safe
from rhodecode.model.db import UserApiKeys, CacheKey

log = logging.getLogger(__name__)


class RepoFeedView(RepoAppView):
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        self._load_defaults()
        return c

    def _get_config(self):
        import rhodecode
        config = rhodecode.CONFIG

        return {
            'language': 'en-us',
            'feed_ttl': '5',  # TTL of feed,
            'feed_include_diff':
                str2bool(config.get('rss_include_diff', False)),
            'feed_items_per_page':
                safe_int(config.get('rss_items_per_page', 20)),
            'feed_diff_limit':
                # we need to protect from parsing huge diffs here other way
                # we can kill the server
                safe_int(config.get('rss_cut_off_limit', 32 * 1024)),
        }

    def _load_defaults(self):
        _ = self.request.translate
        config = self._get_config()
        # common values for feeds
        self.description = _('Changes on %s repository')
        self.title = _('%s %s feed') % (self.db_repo_name, '%s')
        self.language = config["language"]
        self.ttl = config["feed_ttl"]
        self.feed_include_diff = config['feed_include_diff']
        self.feed_diff_limit = config['feed_diff_limit']
        self.feed_items_per_page = config['feed_items_per_page']

    def _changes(self, commit):
        diff_processor = DiffProcessor(
            commit.diff(), diff_limit=self.feed_diff_limit)
        _parsed = diff_processor.prepare(inline_diff=False)
        limited_diff = isinstance(_parsed, LimitedDiffContainer)

        return diff_processor, _parsed, limited_diff

    def _get_title(self, commit):
        return h.chop_at_smart(commit.message, '\n', suffix_if_chopped='...')

    def _get_description(self, commit):
        _renderer = self.request.get_partial_renderer(
            'rhodecode:templates/feed/atom_feed_entry.mako')
        diff_processor, parsed_diff, limited_diff = self._changes(commit)
        filtered_parsed_diff, has_hidden_changes = self.path_filter.filter_patchset(parsed_diff)
        return _renderer(
            'body',
            commit=commit,
            parsed_diff=filtered_parsed_diff,
            limited_diff=limited_diff,
            feed_include_diff=self.feed_include_diff,
            diff_processor=diff_processor,
            has_hidden_changes=has_hidden_changes
        )

    def _set_timezone(self, date, tzinfo=pytz.utc):
        if not getattr(date, "tzinfo", None):
            date.replace(tzinfo=tzinfo)
        return date

    def _get_commits(self):
        pre_load = ['author', 'branch', 'date', 'message', 'parents']
        if self.rhodecode_vcs_repo.is_empty():
            return []

        collection = self.rhodecode_vcs_repo.get_commits(
            branch_name=None, show_hidden=False, pre_load=pre_load,
            translate_tags=False)

        return list(collection[-self.feed_items_per_page:])

    def uid(self, repo_id, commit_id):
        return '{}:{}'.format(md5_safe(repo_id), md5_safe(commit_id))

    @LoginRequired(auth_token_access=[UserApiKeys.ROLE_FEED])
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def atom(self):
        """
        Produce an atom-1.0 feed via feedgenerator module
        """
        self.load_default_context()
        force_recache = self.get_recache_flag()

        cache_namespace_uid = 'cache_repo_feed.{}'.format(self.db_repo.repo_id)
        condition = not (self.path_filter.is_enabled or force_recache)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid,
                                               condition=condition)
        def generate_atom_feed(repo_id, _repo_name, _commit_id, _feed_type):
            feed = Atom1Feed(
                title=self.title % 'atom',
                link=h.route_url('repo_summary', repo_name=_repo_name),
                description=self.description % _repo_name,
                language=self.language,
                ttl=self.ttl
            )

            for commit in reversed(self._get_commits()):
                date = self._set_timezone(commit.date)
                feed.add_item(
                    unique_id=self.uid(repo_id, commit.raw_id),
                    title=self._get_title(commit),
                    author_name=commit.author,
                    description=self._get_description(commit),
                    link=h.route_url(
                        'repo_commit', repo_name=_repo_name,
                        commit_id=commit.raw_id),
                    pubdate=date,)

            return feed.content_type, feed.writeString('utf-8')

        commit_id = self.db_repo.changeset_cache.get('raw_id')
        content_type, feed = generate_atom_feed(
            self.db_repo.repo_id, self.db_repo.repo_name, commit_id, 'atom')

        response = Response(feed)
        response.content_type = content_type
        return response

    @LoginRequired(auth_token_access=[UserApiKeys.ROLE_FEED])
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def rss(self):
        """
        Produce an rss2 feed via feedgenerator module
        """
        self.load_default_context()
        force_recache = self.get_recache_flag()

        cache_namespace_uid = 'cache_repo_feed.{}'.format(self.db_repo.repo_id)
        condition = not (self.path_filter.is_enabled or force_recache)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid,
                                               condition=condition)
        def generate_rss_feed(repo_id, _repo_name, _commit_id, _feed_type):
            feed = Rss201rev2Feed(
                title=self.title % 'rss',
                link=h.route_url('repo_summary', repo_name=_repo_name),
                description=self.description % _repo_name,
                language=self.language,
                ttl=self.ttl
            )

            for commit in reversed(self._get_commits()):
                date = self._set_timezone(commit.date)
                feed.add_item(
                    unique_id=self.uid(repo_id, commit.raw_id),
                    title=self._get_title(commit),
                    author_name=commit.author,
                    description=self._get_description(commit),
                    link=h.route_url(
                        'repo_commit', repo_name=_repo_name,
                        commit_id=commit.raw_id),
                    pubdate=date,)
            return feed.content_type, feed.writeString('utf-8')

        commit_id = self.db_repo.changeset_cache.get('raw_id')
        content_type, feed = generate_rss_feed(
            self.db_repo.repo_id, self.db_repo.repo_name, commit_id, 'rss')

        response = Response(feed)
        response.content_type = content_type
        return response
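Each feed entry's `unique_id` is built by `RepoFeedView.uid`, which joins two `md5_safe` hashes with a colon. A minimal standalone sketch of that scheme, assuming `md5_safe` behaves like a hex MD5 digest of the stringified value (the helper names `md5_hex` and `feed_uid` are ours):

```python
import hashlib


def md5_hex(value):
    # Stand-in for rhodecode's md5_safe: hex MD5 of the stringified value.
    return hashlib.md5(str(value).encode('utf-8')).hexdigest()


def feed_uid(repo_id, commit_id):
    # Mirrors RepoFeedView.uid: two 32-char hex digests joined by ':'.
    return '{}:{}'.format(md5_hex(repo_id), md5_hex(commit_id))
```

Hashing both parts keeps the id opaque and fixed-length regardless of repository name or commit hash length, which is convenient for feed readers that treat the id as an opaque key.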
@@ -1,1574 +1,1581 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2011-2020 RhodeCode GmbH
3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import itertools
21 import itertools
22 import logging
22 import logging
23 import os
23 import os
24 import shutil
24 import shutil
25 import tempfile
25 import tempfile
26 import collections
26 import collections
27 import urllib
27 import urllib
28 import pathlib2
28 import pathlib2
29
29
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
31
31
32 from pyramid.renderers import render
32 from pyramid.renderers import render
33 from pyramid.response import Response
33 from pyramid.response import Response
34
34
35 import rhodecode
35 import rhodecode
36 from rhodecode.apps._base import RepoAppView
36 from rhodecode.apps._base import RepoAppView
37
37
38
38
39 from rhodecode.lib import diffs, helpers as h, rc_cache
39 from rhodecode.lib import diffs, helpers as h, rc_cache
40 from rhodecode.lib import audit_logger
40 from rhodecode.lib import audit_logger
41 from rhodecode.lib.view_utils import parse_path_ref
41 from rhodecode.lib.view_utils import parse_path_ref
42 from rhodecode.lib.exceptions import NonRelativePathError
42 from rhodecode.lib.exceptions import NonRelativePathError
43 from rhodecode.lib.codeblocks import (
43 from rhodecode.lib.codeblocks import (
44 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
44 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
45 from rhodecode.lib.utils2 import (
45 from rhodecode.lib.utils2 import (
46 convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode)
46 convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode)
47 from rhodecode.lib.auth import (
47 from rhodecode.lib.auth import (
48 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
48 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
49 from rhodecode.lib.vcs import path as vcspath
49 from rhodecode.lib.vcs import path as vcspath
50 from rhodecode.lib.vcs.backends.base import EmptyCommit
50 from rhodecode.lib.vcs.backends.base import EmptyCommit
51 from rhodecode.lib.vcs.conf import settings
51 from rhodecode.lib.vcs.conf import settings
52 from rhodecode.lib.vcs.nodes import FileNode
52 from rhodecode.lib.vcs.nodes import FileNode
53 from rhodecode.lib.vcs.exceptions import (
53 from rhodecode.lib.vcs.exceptions import (
54 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
54 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
55 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
55 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
56 NodeDoesNotExistError, CommitError, NodeError)
56 NodeDoesNotExistError, CommitError, NodeError)
57
57
58 from rhodecode.model.scm import ScmModel
58 from rhodecode.model.scm import ScmModel
59 from rhodecode.model.db import Repository
59 from rhodecode.model.db import Repository
60
60
61 log = logging.getLogger(__name__)
61 log = logging.getLogger(__name__)
62
62
63
63
64 class RepoFilesView(RepoAppView):
64 class RepoFilesView(RepoAppView):
65
65
66 @staticmethod
66 @staticmethod
67 def adjust_file_path_for_svn(f_path, repo):
67 def adjust_file_path_for_svn(f_path, repo):
68 """
68 """
69 Computes the relative path of `f_path`.
69 Computes the relative path of `f_path`.
70
70
71 This is mainly based on prefix matching of the recognized tags and
71 This is mainly based on prefix matching of the recognized tags and
72 branches in the underlying repository.
72 branches in the underlying repository.
73 """
73 """
74 tags_and_branches = itertools.chain(
74 tags_and_branches = itertools.chain(
75 repo.branches.iterkeys(),
75 repo.branches.iterkeys(),
76 repo.tags.iterkeys())
76 repo.tags.iterkeys())
77 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
77 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
78
78
79 for name in tags_and_branches:
79 for name in tags_and_branches:
80 if f_path.startswith('{}/'.format(name)):
80 if f_path.startswith('{}/'.format(name)):
81 f_path = vcspath.relpath(f_path, name)
81 f_path = vcspath.relpath(f_path, name)
82 break
82 break
83 return f_path
83 return f_path
84
84
85 def load_default_context(self):
85 def load_default_context(self):
86 c = self._get_local_tmpl_context(include_app_defaults=True)
86 c = self._get_local_tmpl_context(include_app_defaults=True)
87 c.rhodecode_repo = self.rhodecode_vcs_repo
87 c.rhodecode_repo = self.rhodecode_vcs_repo
88 c.enable_downloads = self.db_repo.enable_downloads
88 c.enable_downloads = self.db_repo.enable_downloads
89 return c
89 return c
90
90
91 def _ensure_not_locked(self, commit_id='tip'):
91 def _ensure_not_locked(self, commit_id='tip'):
92 _ = self.request.translate
92 _ = self.request.translate
93
93
94 repo = self.db_repo
94 repo = self.db_repo
95 if repo.enable_locking and repo.locked[0]:
95 if repo.enable_locking and repo.locked[0]:
96 h.flash(_('This repository has been locked by %s on %s')
96 h.flash(_('This repository has been locked by %s on %s')
97 % (h.person_by_id(repo.locked[0]),
97 % (h.person_by_id(repo.locked[0]),
98 h.format_date(h.time_to_datetime(repo.locked[1]))),
98 h.format_date(h.time_to_datetime(repo.locked[1]))),
99 'warning')
99 'warning')
100 files_url = h.route_path(
100 files_url = h.route_path(
101 'repo_files:default_path',
101 'repo_files:default_path',
102 repo_name=self.db_repo_name, commit_id=commit_id)
102 repo_name=self.db_repo_name, commit_id=commit_id)
103 raise HTTPFound(files_url)
103 raise HTTPFound(files_url)
104
104
105 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
105 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
106 _ = self.request.translate
106 _ = self.request.translate
107
107
108 if not is_head:
108 if not is_head:
109 message = _('Cannot modify file. '
109 message = _('Cannot modify file. '
110 'Given commit `{}` is not head of a branch.').format(commit_id)
110 'Given commit `{}` is not head of a branch.').format(commit_id)
111 h.flash(message, category='warning')
111 h.flash(message, category='warning')
112
112
113 if json_mode:
113 if json_mode:
114 return message
114 return message
115
115
116 files_url = h.route_path(
116 files_url = h.route_path(
117 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
117 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
118 f_path=f_path)
118 f_path=f_path)
119 raise HTTPFound(files_url)
119 raise HTTPFound(files_url)
120
120
121 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
121 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
122 _ = self.request.translate
122 _ = self.request.translate
123
123
124 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
124 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
125 self.db_repo_name, branch_name)
125 self.db_repo_name, branch_name)
126 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
126 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
127 message = _('Branch `{}` changes forbidden by rule {}.').format(
127 message = _('Branch `{}` changes forbidden by rule {}.').format(
128 h.escape(branch_name), h.escape(rule))
128 h.escape(branch_name), h.escape(rule))
129 h.flash(message, 'warning')
129 h.flash(message, 'warning')
130
130
131 if json_mode:
131 if json_mode:
132 return message
132 return message
133
133
134 files_url = h.route_path(
134 files_url = h.route_path(
135 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
135 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
136
136
137 raise HTTPFound(files_url)
137 raise HTTPFound(files_url)
138
138
139 def _get_commit_and_path(self):
139 def _get_commit_and_path(self):
140 default_commit_id = self.db_repo.landing_ref_name
140 default_commit_id = self.db_repo.landing_ref_name
141 default_f_path = '/'
141 default_f_path = '/'
142
142
143 commit_id = self.request.matchdict.get(
143 commit_id = self.request.matchdict.get(
144 'commit_id', default_commit_id)
144 'commit_id', default_commit_id)
145 f_path = self._get_f_path(self.request.matchdict, default_f_path)
145 f_path = self._get_f_path(self.request.matchdict, default_f_path)
146 return commit_id, f_path
        return commit_id, f_path

    def _get_default_encoding(self, c):
        enc_list = getattr(c, 'default_encodings', [])
        return enc_list[0] if enc_list else 'UTF-8'

    def _get_commit_or_redirect(self, commit_id, redirect_after=True):
        """
        This is a safe way to get a commit. If an error occurs, it redirects
        to tip with a proper message.

        :param commit_id: id of commit to fetch
        :param redirect_after: toggle redirection
        """
        _ = self.request.translate

        try:
            return self.rhodecode_vcs_repo.get_commit(commit_id)
        except EmptyRepositoryError:
            if not redirect_after:
                return None

            _url = h.route_path(
                'repo_files_add_file',
                repo_name=self.db_repo_name, commit_id=0, f_path='')

            if h.HasRepoPermissionAny(
                    'repository.write', 'repository.admin')(self.db_repo_name):
                add_new = h.link_to(
                    _('Click here to add a new file.'), _url, class_="alert-link")
            else:
                add_new = ""

            h.flash(h.literal(
                _('There are no files yet. %s') % add_new), category='warning')
            raise HTTPFound(
                h.route_path('repo_summary', repo_name=self.db_repo_name))

        except (CommitDoesNotExistError, LookupError):
            msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
            h.flash(msg, category='error')
            raise HTTPNotFound()
        except RepositoryError as e:
            h.flash(safe_str(h.escape(e)), category='error')
            raise HTTPNotFound()

    def _get_filenode_or_redirect(self, commit_obj, path):
        """
        Returns the file node. If an error occurs or the given path is a
        directory, it redirects to the top-level path.
        """
        _ = self.request.translate

        try:
            file_node = commit_obj.get_node(path)
            if file_node.is_dir():
                raise RepositoryError('The given path is a directory')
        except CommitDoesNotExistError:
            log.exception('No such commit exists for this repository')
            h.flash(_('No such commit exists for this repository'), category='error')
            raise HTTPNotFound()
        except RepositoryError as e:
            log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
            h.flash(safe_str(h.escape(e)), category='error')
            raise HTTPNotFound()

        return file_node

    def _is_valid_head(self, commit_id, repo, landing_ref):
        branch_name = sha_commit_id = ''
        is_head = False
        log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)

        for _branch_name, branch_commit_id in repo.branches.items():
            # simple case: we passed in a branch name, it's a HEAD
            if commit_id == _branch_name:
                is_head = True
                branch_name = _branch_name
                sha_commit_id = branch_commit_id
                break
            # case when we passed in a full sha commit_id, which is a head
            elif commit_id == branch_commit_id:
                is_head = True
                branch_name = _branch_name
                sha_commit_id = branch_commit_id
                break

        if h.is_svn(repo) and not repo.is_empty():
            # Note: Subversion only has one head.
            if commit_id == repo.get_commit(commit_idx=-1).raw_id:
                is_head = True
            return branch_name, sha_commit_id, is_head

        # branches were checked, so we only need to get the branch/commit_sha
        if repo.is_empty():
            is_head = True
            branch_name = landing_ref
            sha_commit_id = EmptyCommit().raw_id
        else:
            commit = repo.get_commit(commit_id=commit_id)
            if commit:
                branch_name = commit.branch
                sha_commit_id = commit.raw_id

        return branch_name, sha_commit_id, is_head

    def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):

        repo_id = self.db_repo.repo_id
        force_recache = self.get_recache_flag()

        cache_seconds = safe_int(
            rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
        cache_on = not force_recache and cache_seconds > 0
        log.debug(
            'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, commit_id, f_path, cache_on, cache_seconds or 0))

        cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
        def compute_file_tree(ver, _name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
            log.debug('Generating cached file tree at ver:%s for repo_id: %s, %s, %s',
                      ver, _repo_id, _commit_id, _f_path)

            c.full_load = _full_load
            return render(
                'rhodecode:templates/files/files_browser_tree.mako',
                self._get_template_context(c), self.request, _at_rev)

        return compute_file_tree(
            rc_cache.FILE_TREE_CACHE_VER, self.db_repo.repo_name_hash,
            self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)

    def _get_archive_spec(self, fname):
        log.debug('Detecting archive spec for: `%s`', fname)

        fileformat = None
        ext = None
        content_type = None
        for a_type, content_type, extension in settings.ARCHIVE_SPECS:

            if fname.endswith(extension):
                fileformat = a_type
                log.debug('archive is of type: %s', fileformat)
                ext = extension
                break

        if not fileformat:
            raise ValueError()

        # the leftover part of the whole fname is the commit
        commit_id = fname[:-len(ext)]

        return commit_id, ext, fileformat, content_type

    def create_pure_path(self, *parts):
        # Split paths and sanitize them, removing any ../ etc.
        sanitized_path = [
            x for x in pathlib2.PurePath(*parts).parts
            if x not in ['.', '..']]

        pure_path = pathlib2.PurePath(*sanitized_path)
        return pure_path

    def _is_lf_enabled(self, target_repo):
        lf_enabled = False

        lf_key_for_vcs_map = {
            'hg': 'extensions_largefiles',
            'git': 'vcs_git_lfs_enabled'
        }

        lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)

        if lf_key_for_vcs:
            lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)

        return lf_enabled

    def _get_archive_name(self, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
        # original backward-compat name of the archive
        clean_name = safe_str(db_repo_name.replace('/', '_'))

        # e.g. vcsserver.zip
        # e.g. vcsserver-abcdefgh.zip
        # e.g. vcsserver-abcdefgh-defghijk.zip
        archive_name = '{}{}{}{}{}{}'.format(
            clean_name,
            '-sub' if subrepos else '',
            commit_sha,
            '-{}'.format('plain') if not with_hash else '',
            '-{}'.format(path_sha) if path_sha else '',
            ext)
        return archive_name

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_archivefile(self):
        # archive cache config
        from rhodecode import CONFIG
        _ = self.request.translate
        self.load_default_context()
        default_at_path = '/'
        fname = self.request.matchdict['fname']
        subrepos = self.request.GET.get('subrepos') == 'true'
        with_hash = str2bool(self.request.GET.get('with_hash', '1'))
        at_path = self.request.GET.get('at_path') or default_at_path

        if not self.db_repo.enable_downloads:
            return Response(_('Downloads disabled'))

        try:
            commit_id, ext, fileformat, content_type = \
                self._get_archive_spec(fname)
        except ValueError:
            return Response(_('Unknown archive type for: `{}`').format(
                h.escape(fname)))

        try:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id)
        except CommitDoesNotExistError:
            return Response(_('Unknown commit_id {}').format(
                h.escape(commit_id)))
        except EmptyRepositoryError:
            return Response(_('Empty repository'))

        # we used a ref, or a shorter version; redirect the client to use the explicit hash
        if commit_id != commit.raw_id:
            fname = '{}{}'.format(commit.raw_id, ext)
            raise HTTPFound(self.request.current_route_path(fname=fname))

        try:
            at_path = commit.get_node(at_path).path or default_at_path
        except Exception:
            return Response(_('No node at path {} for this repository').format(at_path))

        # path sha is part of subdir
        path_sha = ''
        if at_path != default_at_path:
            path_sha = sha1(at_path)[:8]
        short_sha = '-{}'.format(safe_str(commit.short_id))
        # used for cache etc.
        archive_name = self._get_archive_name(
            self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
            path_sha=path_sha, with_hash=with_hash)

        if not with_hash:
            short_sha = ''
            path_sha = ''

        # what the end client gets served
        response_archive_name = self._get_archive_name(
            self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
            path_sha=path_sha, with_hash=with_hash)
        # remove extension from our archive directory name
        archive_dir_name = response_archive_name[:-len(ext)]

        use_cached_archive = False
        archive_cache_dir = CONFIG.get('archive_cache_dir')
        archive_cache_enabled = archive_cache_dir and not self.request.GET.get('no_cache')
        cached_archive_path = None

        if archive_cache_enabled:
            # check if it's ok to write, and re-create the archive cache dir
            if not os.path.isdir(CONFIG['archive_cache_dir']):
                os.makedirs(CONFIG['archive_cache_dir'])

            cached_archive_path = os.path.join(
                CONFIG['archive_cache_dir'], archive_name)
            if os.path.isfile(cached_archive_path):
                log.debug('Found cached archive in %s', cached_archive_path)
                fd, archive = None, cached_archive_path
                use_cached_archive = True
            else:
                log.debug('Archive %s is not yet cached', archive_name)

        # generate a new archive, as the previous one was not found in the cache
        if not use_cached_archive:
            _dir = os.path.abspath(archive_cache_dir) if archive_cache_dir else None
            fd, archive = tempfile.mkstemp(dir=_dir)
            log.debug('Creating new temp archive in %s', archive)
            try:
                commit.archive_repo(archive, archive_dir_name=archive_dir_name,
                                    kind=fileformat, subrepos=subrepos,
                                    archive_at_path=at_path)
            except ImproperArchiveTypeError:
                return Response(_('Unknown archive type'))
            if archive_cache_enabled:
                # if we generated the archive and have the cache enabled,
                # let's use it for future requests
                log.debug('Storing new archive in %s', cached_archive_path)
                shutil.move(archive, cached_archive_path)
                archive = cached_archive_path

        # store download action
        audit_logger.store_web(
            'repo.archive.download', action_data={
                'user_agent': self.request.user_agent,
                'archive_name': archive_name,
                'archive_spec': fname,
                'archive_cached': use_cached_archive},
            user=self._rhodecode_user,
            repo=self.db_repo,
            commit=True
        )

        def get_chunked_archive(archive_path):
            with open(archive_path, 'rb') as stream:
                while True:
                    data = stream.read(16 * 1024)
                    if not data:
                        if fd:  # fd means we used a temporary file
                            os.close(fd)
                        if not archive_cache_enabled:
                            log.debug('Destroying temp archive %s', archive_path)
                            os.remove(archive_path)
                        break
                    yield data

        response = Response(app_iter=get_chunked_archive(archive))
        response.content_disposition = str('attachment; filename=%s' % response_archive_name)
        response.content_type = str(content_type)

        return response

    def _get_file_node(self, commit_id, f_path):
        if commit_id not in ['', None, 'None', '0' * 12, '0' * 40]:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
            try:
                node = commit.get_node(f_path)
                if node.is_dir():
                    raise NodeError('%s path is a %s not a file'
                                    % (node, type(node)))
            except NodeDoesNotExistError:
                commit = EmptyCommit(
                    commit_id=commit_id,
                    idx=commit.idx,
                    repo=commit.repository,
                    alias=commit.repository.alias,
                    message=commit.message,
                    author=commit.author,
                    date=commit.date)
                node = FileNode(f_path, '', commit=commit)
        else:
            commit = EmptyCommit(
                repo=self.rhodecode_vcs_repo,
                alias=self.rhodecode_vcs_repo.alias)
            node = FileNode(f_path, '', commit=commit)
        return node

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff(self):
        c = self.load_default_context()
        f_path = self._get_f_path(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        path1, diff1 = parse_path_ref(diff1, default_path=f_path)

        ignore_whitespace = str2bool(self.request.GET.get('ignorews'))
        line_context = self.request.GET.get('context', 3)

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        c.action = self.request.GET.get('diff')
        if c.action not in ['download', 'raw']:
            compare_url = h.route_path(
                'repo_compare',
                repo_name=self.db_repo_name,
                source_ref_type='rev',
                source_ref=diff1,
                target_repo=self.db_repo_name,
                target_ref_type='rev',
                target_ref=diff2,
                _query=dict(f_path=f_path))
            # redirect to new view if we render diff
            raise HTTPFound(compare_url)

        try:
            node1 = self._get_file_node(diff1, path1)
            node2 = self._get_file_node(diff2, f_path)
        except (RepositoryError, NodeError):
            log.exception("Exception while trying to get node from repository")
            raise HTTPFound(
                h.route_path('repo_files', repo_name=self.db_repo_name,
                             commit_id='tip', f_path=f_path))

        if all(isinstance(node.commit, EmptyCommit)
               for node in (node1, node2)):
            raise HTTPNotFound()

        c.commit_1 = node1.commit
        c.commit_2 = node2.commit

        if c.action == 'download':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            diff = diffs.DiffProcessor(_diff, format='gitdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            response.content_disposition = (
                'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2)
            )
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        elif c.action == 'raw':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            diff = diffs.DiffProcessor(_diff, format='gitdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        # in case we ever end up here
        raise HTTPNotFound()

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff_2way_redirect(self):
        """
        Kept only to make OLD links work.
        """
        f_path = self._get_f_path_unchecked(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        compare_url = h.route_path(
            'repo_compare',
            repo_name=self.db_repo_name,
            source_ref_type='rev',
            source_ref=diff1,
            target_ref_type='rev',
            target_ref=diff2,
            _query=dict(f_path=f_path, diffmode='sideside',
                        target_repo=self.db_repo_name,))
        raise HTTPFound(compare_url)

    @LoginRequired()
    def repo_files_default_commit_redirect(self):
        """
        Special page that redirects to the files landing page based on the
        default commit of the repository.
        """
        c = self.load_default_context()
        ref_name = c.rhodecode_db_repo.landing_ref_name
        landing_url = h.repo_files_by_ref_url(
            c.rhodecode_db_repo.repo_name,
            c.rhodecode_db_repo.repo_type,
            f_path='',
            ref_name=ref_name,
            commit_id='tip',
            query=dict(at=ref_name)
        )

        raise HTTPFound(landing_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files(self):
        c = self.load_default_context()

        view_name = getattr(self.request.matched_route, 'name', None)

        c.annotate = view_name == 'repo_files:annotated'
        # default is false, but .rst/.md files are auto-rendered later; we can
        # override auto rendering by setting this GET flag
        c.renderer = view_name == 'repo_files:rendered' or \
            not self.request.GET.get('no-render', False)

        commit_id, f_path = self._get_commit_and_path()

        c.commit = self._get_commit_or_redirect(commit_id)
        c.branch = self.request.GET.get('branch', None)
        c.f_path = f_path
        at_rev = self.request.GET.get('at')

        # prev link
        try:
            prev_commit = c.commit.prev(c.branch)
            c.prev_commit = prev_commit
            c.url_prev = h.route_path(
                'repo_files', repo_name=self.db_repo_name,
                commit_id=prev_commit.raw_id, f_path=f_path)
            if c.branch:
                c.url_prev += '?branch=%s' % c.branch
        except (CommitDoesNotExistError, VCSError):
            c.url_prev = '#'
            c.prev_commit = EmptyCommit()

        # next link
        try:
            next_commit = c.commit.next(c.branch)
            c.next_commit = next_commit
            c.url_next = h.route_path(
                'repo_files', repo_name=self.db_repo_name,
                commit_id=next_commit.raw_id, f_path=f_path)
            if c.branch:
                c.url_next += '?branch=%s' % c.branch
        except (CommitDoesNotExistError, VCSError):
            c.url_next = '#'
            c.next_commit = EmptyCommit()

669 # files or dirs
676 # files or dirs
670 try:
677 try:
671 c.file = c.commit.get_node(f_path)
678 c.file = c.commit.get_node(f_path)
672 c.file_author = True
679 c.file_author = True
673 c.file_tree = ''
680 c.file_tree = ''
674
681
675 # load file content
682 # load file content
676 if c.file.is_file():
683 if c.file.is_file():
677 c.lf_node = {}
684 c.lf_node = {}
678
685
679 has_lf_enabled = self._is_lf_enabled(self.db_repo)
686 has_lf_enabled = self._is_lf_enabled(self.db_repo)
680 if has_lf_enabled:
687 if has_lf_enabled:
681 c.lf_node = c.file.get_largefile_node()
688 c.lf_node = c.file.get_largefile_node()
682
689
683 c.file_source_page = 'true'
690 c.file_source_page = 'true'
684 c.file_last_commit = c.file.last_commit
691 c.file_last_commit = c.file.last_commit
685
692
686 c.file_size_too_big = c.file.size > c.visual.cut_off_limit_file
693 c.file_size_too_big = c.file.size > c.visual.cut_off_limit_file
687
694
688 if not (c.file_size_too_big or c.file.is_binary):
695 if not (c.file_size_too_big or c.file.is_binary):
689 if c.annotate: # annotation has precedence over renderer
696 if c.annotate: # annotation has precedence over renderer
690 c.annotated_lines = filenode_as_annotated_lines_tokens(
697 c.annotated_lines = filenode_as_annotated_lines_tokens(
691 c.file
698 c.file
692 )
699 )
693 else:
700 else:
694 c.renderer = (
701 c.renderer = (
695 c.renderer and h.renderer_from_filename(c.file.path)
702 c.renderer and h.renderer_from_filename(c.file.path)
696 )
703 )
697 if not c.renderer:
704 if not c.renderer:
698 c.lines = filenode_as_lines_tokens(c.file)
705 c.lines = filenode_as_lines_tokens(c.file)
699
706
700 _branch_name, _sha_commit_id, is_head = \
707 _branch_name, _sha_commit_id, is_head = \
701 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
708 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
702 landing_ref=self.db_repo.landing_ref_name)
709 landing_ref=self.db_repo.landing_ref_name)
703 c.on_branch_head = is_head
710 c.on_branch_head = is_head
704
711
705 branch = c.commit.branch if (
712 branch = c.commit.branch if (
706 c.commit.branch and '/' not in c.commit.branch) else None
713 c.commit.branch and '/' not in c.commit.branch) else None
707 c.branch_or_raw_id = branch or c.commit.raw_id
714 c.branch_or_raw_id = branch or c.commit.raw_id
708 c.branch_name = c.commit.branch or h.short_id(c.commit.raw_id)
715 c.branch_name = c.commit.branch or h.short_id(c.commit.raw_id)
709
716
710 author = c.file_last_commit.author
717 author = c.file_last_commit.author
711 c.authors = [[
718 c.authors = [[
712 h.email(author),
719 h.email(author),
713 h.person(author, 'username_or_name_or_email'),
720 h.person(author, 'username_or_name_or_email'),
714 1
721 1
715 ]]
722 ]]
716
723
717 else: # load tree content at path
724 else: # load tree content at path
718 c.file_source_page = 'false'
725 c.file_source_page = 'false'
719 c.authors = []
726 c.authors = []
720 # this loads a simple tree without metadata to speed things up
727 # this loads a simple tree without metadata to speed things up
721 # later via ajax we call repo_nodetree_full and fetch whole
728 # later via ajax we call repo_nodetree_full and fetch whole
722 c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path, at_rev=at_rev)
729 c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path, at_rev=at_rev)
723
730
724 c.readme_data, c.readme_file = \
731 c.readme_data, c.readme_file = \
725 self._get_readme_data(self.db_repo, c.visual.default_renderer,
732 self._get_readme_data(self.db_repo, c.visual.default_renderer,
726 c.commit.raw_id, f_path)
733 c.commit.raw_id, f_path)
727
734
728 except RepositoryError as e:
735 except RepositoryError as e:
729 h.flash(safe_str(h.escape(e)), category='error')
736 h.flash(safe_str(h.escape(e)), category='error')
730 raise HTTPNotFound()
737 raise HTTPNotFound()
731
738
732 if self.request.environ.get('HTTP_X_PJAX'):
739 if self.request.environ.get('HTTP_X_PJAX'):
733 html = render('rhodecode:templates/files/files_pjax.mako',
740 html = render('rhodecode:templates/files/files_pjax.mako',
734 self._get_template_context(c), self.request)
741 self._get_template_context(c), self.request)
735 else:
742 else:
736 html = render('rhodecode:templates/files/files.mako',
743 html = render('rhodecode:templates/files/files.mako',
737 self._get_template_context(c), self.request)
744 self._get_template_context(c), self.request)
738 return Response(html)
745 return Response(html)
739
746
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_annotated_previous(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        prev_commit_id = commit.raw_id
        line_anchor = self.request.GET.get('line_anchor')
        is_file = False
        try:
            _file = commit.get_node(f_path)
            is_file = _file.is_file()
        except (NodeDoesNotExistError, CommitDoesNotExistError, VCSError):
            pass

        if is_file:
            history = commit.get_path_history(f_path)
            prev_commit_id = history[1].raw_id \
                if len(history) > 1 else prev_commit_id
        prev_url = h.route_path(
            'repo_files:annotated', repo_name=self.db_repo_name,
            commit_id=prev_commit_id, f_path=f_path,
            _anchor='L{}'.format(line_anchor))

        raise HTTPFound(prev_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodetree_full(self):
        """
        Returns rendered HTML of the file tree, including commit date,
        author and commit_id, for the specified combination of
        repo, commit_id and file path.
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        try:
            dir_node = commit.get_node(f_path)
        except RepositoryError as e:
            return Response('error: {}'.format(h.escape(safe_str(e))))

        if dir_node.is_file():
            return Response('')

        c.file = dir_node
        c.commit = commit
        at_rev = self.request.GET.get('at')

        html = self._get_tree_at_commit(
            c, commit.raw_id, dir_node.path, full_load=True, at_rev=at_rev)

        return Response(html)

    def _get_attachement_headers(self, f_path):
        f_name = safe_str(f_path.split(Repository.NAME_SEP)[-1])
        safe_path = f_name.replace('"', '\\"')
        encoded_path = urllib.quote(f_name)

        return "attachment; " \
               "filename=\"{}\"; " \
               "filename*=UTF-8\'\'{}".format(safe_path, encoded_path)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_raw(self):
        """
        Action for "show as raw"; some mimetypes are "rendered",
        those include images and icons.
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        raw_mimetype_mapping = {
            # map original mimetype to a mimetype used for "show as raw"
            # you can also provide a content-disposition to override the
            # default "attachment" disposition.
            # orig_type: (new_type, new_dispo)

            # show images inline:
            # Do not re-add SVG: it is unsafe and permits XSS attacks. One can
            # for example render an SVG with javascript inside or even render
            # HTML.
            'image/x-icon': ('image/x-icon', 'inline'),
            'image/png': ('image/png', 'inline'),
            'image/gif': ('image/gif', 'inline'),
            'image/jpeg': ('image/jpeg', 'inline'),
            'application/pdf': ('application/pdf', 'inline'),
        }

        mimetype = file_node.mimetype
        try:
            mimetype, disposition = raw_mimetype_mapping[mimetype]
        except KeyError:
            # we don't know anything special about this, handle it safely
            if file_node.is_binary:
                # do same as download raw for binary files
                mimetype, disposition = 'application/octet-stream', 'attachment'
            else:
                # do not just use the original mimetype, but force text/plain,
                # otherwise it would serve text/html and that might be unsafe.
                # Note: the underlying vcs library fakes a text/plain mimetype
                # if the mimetype cannot be determined and it thinks the file
                # is not binary. This might lead to erroneous text display in
                # some cases, but helps in other cases, like with text files
                # without extension.
                mimetype, disposition = 'text/plain', 'inline'

        if disposition == 'attachment':
            disposition = self._get_attachement_headers(f_path)

        stream_content = file_node.stream_bytes()

        response = Response(app_iter=stream_content)
        response.content_disposition = disposition
        response.content_type = mimetype

        charset = self._get_default_encoding(c)
        if charset:
            response.charset = charset

        return response

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_download(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if self.request.GET.get('lf'):
            # only if the 'lf' GET flag is passed do we download this file
            # as an LFS/largefile
            lf_node = file_node.get_largefile_node()
            if lf_node:
                # overwrite our pointer with the REAL large-file
                file_node = lf_node

        disposition = self._get_attachement_headers(f_path)

        stream_content = file_node.stream_bytes()

        response = Response(app_iter=stream_content)
        response.content_disposition = disposition
        response.content_type = file_node.mimetype

        charset = self._get_default_encoding(c)
        if charset:
            response.charset = charset

        return response

    def _get_nodelist_at_commit(self, repo_name, repo_id, commit_id, f_path):

        cache_seconds = safe_int(
            rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
        cache_on = cache_seconds > 0
        log.debug(
            'Computing FILE SEARCH for repo_id %s commit_id `%s` and path `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, commit_id, f_path, cache_on, cache_seconds or 0))

        cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
        def compute_file_search(_name_hash, _repo_id, _commit_id, _f_path):
            log.debug('Generating cached nodelist for repo_id:%s, %s, %s',
                      _repo_id, commit_id, f_path)
            try:
                _d, _f = ScmModel().get_quick_filter_nodes(repo_name, _commit_id, _f_path)
            except (RepositoryError, CommitDoesNotExistError, Exception) as e:
                log.exception(safe_str(e))
                h.flash(safe_str(h.escape(e)), category='error')
                raise HTTPFound(h.route_path(
                    'repo_files', repo_name=self.db_repo_name,
                    commit_id='tip', f_path='/'))

            return _d + _f

        result = compute_file_search(self.db_repo.repo_name_hash, self.db_repo.repo_id,
                                     commit_id, f_path)
        return filter(lambda n: self.path_filter.path_access_allowed(n['name']), result)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodelist(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)

        metadata = self._get_nodelist_at_commit(
            self.db_repo_name, self.db_repo.repo_id, commit.raw_id, f_path)
        return {'nodes': metadata}

    def _create_references(self, branches_or_tags, symbolic_reference, f_path, ref_type):
        items = []
        for name, commit_id in branches_or_tags.items():
            sym_ref = symbolic_reference(commit_id, name, f_path, ref_type)
            items.append((sym_ref, name, ref_type))
        return items

    def _symbolic_reference(self, commit_id, name, f_path, ref_type):
        return commit_id

    def _symbolic_reference_svn(self, commit_id, name, f_path, ref_type):
        return commit_id

        # NOTE(dan): old code we used in "diff" mode compare
        new_f_path = vcspath.join(name, f_path)
        return u'%s@%s' % (new_f_path, commit_id)

    def _get_node_history(self, commit_obj, f_path, commits=None):
        """
        get commit history for given node

        :param commit_obj: commit to calculate history
        :param f_path: path for node to calculate history for
        :param commits: if passed, don't calculate history and take
            commits defined in this list
        """
        _ = self.request.translate

        # calculate history based on tip
        tip = self.rhodecode_vcs_repo.get_commit()
        if commits is None:
            pre_load = ["author", "branch"]
            try:
                commits = tip.get_path_history(f_path, pre_load=pre_load)
            except (NodeDoesNotExistError, CommitError):
                # this node is not present at tip!
                commits = commit_obj.get_path_history(f_path, pre_load=pre_load)

        history = []
        commits_group = ([], _("Changesets"))
        for commit in commits:
            branch = ' (%s)' % commit.branch if commit.branch else ''
            n_desc = 'r%s:%s%s' % (commit.idx, commit.short_id, branch)
            commits_group[0].append((commit.raw_id, n_desc, 'sha'))
        history.append(commits_group)

        symbolic_reference = self._symbolic_reference

        if self.rhodecode_vcs_repo.alias == 'svn':
            adjusted_f_path = RepoFilesView.adjust_file_path_for_svn(
                f_path, self.rhodecode_vcs_repo)
            if adjusted_f_path != f_path:
                log.debug(
                    'Recognized svn tag or branch in file "%s", using svn '
                    'specific symbolic references', f_path)
                f_path = adjusted_f_path
                symbolic_reference = self._symbolic_reference_svn

        branches = self._create_references(
            self.rhodecode_vcs_repo.branches, symbolic_reference, f_path, 'branch')
        branches_group = (branches, _("Branches"))

        tags = self._create_references(
            self.rhodecode_vcs_repo.tags, symbolic_reference, f_path, 'tag')
        tags_group = (tags, _("Tags"))

        history.append(branches_group)
        history.append(tags_group)

        return history, commits

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_history(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if file_node.is_file():
            file_history, _hist = self._get_node_history(commit, f_path)

            res = []
            for section_items, section in file_history:
                items = []
                for obj_id, obj_text, obj_type in section_items:
                    at_rev = ''
                    if obj_type in ['branch', 'bookmark', 'tag']:
                        at_rev = obj_text
                    entry = {
                        'id': obj_id,
                        'text': obj_text,
                        'type': obj_type,
                        'at_rev': at_rev
                    }

                    items.append(entry)

                res.append({
                    'text': section,
                    'children': items
                })

            data = {
                'more': False,
                'results': res
            }
            return data

        log.warning('Cannot fetch history for directory')
        raise HTTPBadRequest()

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_authors(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if not file_node.is_file():
            raise HTTPBadRequest()

        c.file_last_commit = file_node.last_commit
        if self.request.GET.get('annotate') == '1':
            # use _hist from annotation if annotation mode is on
            commit_ids = set(x[1] for x in file_node.annotate)
            _hist = (
                self.rhodecode_vcs_repo.get_commit(commit_id)
                for commit_id in commit_ids)
        else:
            _f_history, _hist = self._get_node_history(commit, f_path)
        c.file_author = False

        unique = collections.OrderedDict()
        for commit in _hist:
            author = commit.author
            if author not in unique:
                unique[commit.author] = [
                    h.email(author),
                    h.person(author, 'username_or_name_or_email'),
                    1  # counter
                ]
            else:
                # increase counter
                unique[commit.author][2] += 1

        c.authors = [val for val in unique.values()]

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_check_head(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        new_path = self.request.POST.get('path')
        operation = self.request.POST.get('operation')
        path_exist = ''

        if new_path and operation in ['create', 'upload']:
            new_f_path = os.path.join(f_path.lstrip('/'), new_path)
            try:
                commit_obj = self.rhodecode_vcs_repo.get_commit(commit_id)
                # NOTE(dan): construct whole path without leading /
                file_node = commit_obj.get_node(new_f_path)
                if file_node is not None:
                    path_exist = new_f_path
            except EmptyRepositoryError:
                pass
            except Exception:
                pass

        return {
            'branch': _branch_name,
            'sha': _sha_commit_id,
            'is_head': is_head,
            'path_exists': path_exist
        }

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_remove_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path)
        self.check_branch_permission(_branch_name)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        c.default_message = _(
            'Deleted file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        return self._get_template_context(c)

1161 @LoginRequired()
1168 @LoginRequired()
1162 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1169 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1163 @CSRFRequired()
1170 @CSRFRequired()
1164 def repo_files_delete_file(self):
1171 def repo_files_delete_file(self):
1165 _ = self.request.translate
1172 _ = self.request.translate
1166
1173
1167 c = self.load_default_context()
1174 c = self.load_default_context()
1168 commit_id, f_path = self._get_commit_and_path()
1175 commit_id, f_path = self._get_commit_and_path()
1169
1176
1170 self._ensure_not_locked()
1177 self._ensure_not_locked()
1171 _branch_name, _sha_commit_id, is_head = \
1178 _branch_name, _sha_commit_id, is_head = \
1172 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1179 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1173 landing_ref=self.db_repo.landing_ref_name)
1180 landing_ref=self.db_repo.landing_ref_name)
1174
1181
1175 self.forbid_non_head(is_head, f_path)
1182 self.forbid_non_head(is_head, f_path)
1176 self.check_branch_permission(_branch_name)
1183 self.check_branch_permission(_branch_name)
1177
1184
1178 c.commit = self._get_commit_or_redirect(commit_id)
1185 c.commit = self._get_commit_or_redirect(commit_id)
1179 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1186 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1180
1187
1181 c.default_message = _(
1188 c.default_message = _(
1182 'Deleted file {} via RhodeCode Enterprise').format(f_path)
1189 'Deleted file {} via RhodeCode Enterprise').format(f_path)
1183 c.f_path = f_path
1190 c.f_path = f_path
1184 node_path = f_path
1191 node_path = f_path
1185 author = self._rhodecode_db_user.full_contact
1192 author = self._rhodecode_db_user.full_contact
1186 message = self.request.POST.get('message') or c.default_message
1193 message = self.request.POST.get('message') or c.default_message
1187 try:
1194 try:
1188 nodes = {
1195 nodes = {
1189 node_path: {
1196 node_path: {
1190 'content': ''
1197 'content': ''
1191 }
1198 }
1192 }
1199 }
1193 ScmModel().delete_nodes(
1200 ScmModel().delete_nodes(
1194 user=self._rhodecode_db_user.user_id, repo=self.db_repo,
1201 user=self._rhodecode_db_user.user_id, repo=self.db_repo,
1195 message=message,
1202 message=message,
1196 nodes=nodes,
1203 nodes=nodes,
1197 parent_commit=c.commit,
1204 parent_commit=c.commit,
1198 author=author,
1205 author=author,
1199 )
1206 )
1200
1207
1201 h.flash(
1208 h.flash(
1202 _('Successfully deleted file `{}`').format(
1209 _('Successfully deleted file `{}`').format(
1203 h.escape(f_path)), category='success')
1210 h.escape(f_path)), category='success')
1204 except Exception:
1211 except Exception:
1205 log.exception('Error during commit operation')
1212 log.exception('Error during commit operation')
1206 h.flash(_('Error occurred during commit'), category='error')
1213 h.flash(_('Error occurred during commit'), category='error')
1207 raise HTTPFound(
1214 raise HTTPFound(
1208 h.route_path('repo_commit', repo_name=self.db_repo_name,
1215 h.route_path('repo_commit', repo_name=self.db_repo_name,
1209 commit_id='tip'))
1216 commit_id='tip'))
1210
1217
1211 @LoginRequired()
1218 @LoginRequired()
1212 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1219 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1213 def repo_files_edit_file(self):
1220 def repo_files_edit_file(self):
1214 _ = self.request.translate
1221 _ = self.request.translate
1215 c = self.load_default_context()
1222 c = self.load_default_context()
1216 commit_id, f_path = self._get_commit_and_path()
1223 commit_id, f_path = self._get_commit_and_path()
1217
1224
1218 self._ensure_not_locked()
1225 self._ensure_not_locked()
1219 _branch_name, _sha_commit_id, is_head = \
1226 _branch_name, _sha_commit_id, is_head = \
1220 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1227 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1221 landing_ref=self.db_repo.landing_ref_name)
1228 landing_ref=self.db_repo.landing_ref_name)
1222
1229
1223 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1230 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1224 self.check_branch_permission(_branch_name, commit_id=commit_id)
1231 self.check_branch_permission(_branch_name, commit_id=commit_id)
1225
1232
1226 c.commit = self._get_commit_or_redirect(commit_id)
1233 c.commit = self._get_commit_or_redirect(commit_id)
1227 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1234 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1228
1235
1229 if c.file.is_binary:
1236 if c.file.is_binary:
1230 files_url = h.route_path(
1237 files_url = h.route_path(
1231 'repo_files',
1238 'repo_files',
1232 repo_name=self.db_repo_name,
1239 repo_name=self.db_repo_name,
1233 commit_id=c.commit.raw_id, f_path=f_path)
1240 commit_id=c.commit.raw_id, f_path=f_path)
1234 raise HTTPFound(files_url)
1241 raise HTTPFound(files_url)
1235
1242
1236 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1243 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1237 c.f_path = f_path
1244 c.f_path = f_path
1238
1245
1239 return self._get_template_context(c)
1246 return self._get_template_context(c)
1240
1247
1241 @LoginRequired()
1248 @LoginRequired()
1242 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1249 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1243 @CSRFRequired()
1250 @CSRFRequired()
1244 def repo_files_update_file(self):
1251 def repo_files_update_file(self):
1245 _ = self.request.translate
1252 _ = self.request.translate
1246 c = self.load_default_context()
1253 c = self.load_default_context()
1247 commit_id, f_path = self._get_commit_and_path()
1254 commit_id, f_path = self._get_commit_and_path()
1248
1255
1249 self._ensure_not_locked()
1256 self._ensure_not_locked()
1250
1257
1251 c.commit = self._get_commit_or_redirect(commit_id)
1258 c.commit = self._get_commit_or_redirect(commit_id)
1252 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1259 c.file = self._get_filenode_or_redirect(c.commit, f_path)
1253
1260
1254 if c.file.is_binary:
1261 if c.file.is_binary:
1255 raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name,
1262 raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name,
1256 commit_id=c.commit.raw_id, f_path=f_path))
1263 commit_id=c.commit.raw_id, f_path=f_path))
1257
1264
1258 _branch_name, _sha_commit_id, is_head = \
1265 _branch_name, _sha_commit_id, is_head = \
1259 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1266 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1260 landing_ref=self.db_repo.landing_ref_name)
1267 landing_ref=self.db_repo.landing_ref_name)
1261
1268
1262 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1269 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1263 self.check_branch_permission(_branch_name, commit_id=commit_id)
1270 self.check_branch_permission(_branch_name, commit_id=commit_id)
1264
1271
1265 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1272 c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
1266 c.f_path = f_path
1273 c.f_path = f_path
1267
1274
1268 old_content = c.file.content
1275 old_content = c.file.content
1269 sl = old_content.splitlines(1)
1276 sl = old_content.splitlines(1)
1270 first_line = sl[0] if sl else ''
1277 first_line = sl[0] if sl else ''
1271
1278
1272 r_post = self.request.POST
1279 r_post = self.request.POST
1273 # line endings: 0 - Unix, 1 - Mac, 2 - DOS
1280 # line endings: 0 - Unix, 1 - Mac, 2 - DOS
1274 line_ending_mode = detect_mode(first_line, 0)
1281 line_ending_mode = detect_mode(first_line, 0)
1275 content = convert_line_endings(r_post.get('content', ''), line_ending_mode)
1282 content = convert_line_endings(r_post.get('content', ''), line_ending_mode)
1276
1283
1277 message = r_post.get('message') or c.default_message
1284 message = r_post.get('message') or c.default_message
1278 org_node_path = c.file.unicode_path
1285 org_node_path = c.file.unicode_path
1279 filename = r_post['filename']
1286 filename = r_post['filename']
1280
1287
1281 root_path = c.file.dir_path
1288 root_path = c.file.dir_path
1282 pure_path = self.create_pure_path(root_path, filename)
1289 pure_path = self.create_pure_path(root_path, filename)
1283 node_path = safe_unicode(bytes(pure_path))
1290 node_path = safe_unicode(bytes(pure_path))
1284
1291
1285 default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name,
1292 default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name,
1286 commit_id=commit_id)
1293 commit_id=commit_id)
1287 if content == old_content and node_path == org_node_path:
1294 if content == old_content and node_path == org_node_path:
1288 h.flash(_('No changes detected on {}').format(h.escape(org_node_path)),
1295 h.flash(_('No changes detected on {}').format(h.escape(org_node_path)),
1289 category='warning')
1296 category='warning')
1290 raise HTTPFound(default_redirect_url)
1297 raise HTTPFound(default_redirect_url)
1291
1298
1292 try:
1299 try:
1293 mapping = {
1300 mapping = {
1294 org_node_path: {
1301 org_node_path: {
1295 'org_filename': org_node_path,
1302 'org_filename': org_node_path,
1296 'filename': node_path,
1303 'filename': node_path,
1297 'content': content,
1304 'content': content,
1298 'lexer': '',
1305 'lexer': '',
1299 'op': 'mod',
1306 'op': 'mod',
1300 'mode': c.file.mode
1307 'mode': c.file.mode
1301 }
1308 }
1302 }
1309 }
1303
1310
1304 commit = ScmModel().update_nodes(
1311 commit = ScmModel().update_nodes(
1305 user=self._rhodecode_db_user.user_id,
1312 user=self._rhodecode_db_user.user_id,
1306 repo=self.db_repo,
1313 repo=self.db_repo,
1307 message=message,
1314 message=message,
1308 nodes=mapping,
1315 nodes=mapping,
1309 parent_commit=c.commit,
1316 parent_commit=c.commit,
1310 )
1317 )
1311
1318
1312 h.flash(_('Successfully committed changes to file `{}`').format(
1319 h.flash(_('Successfully committed changes to file `{}`').format(
1313 h.escape(f_path)), category='success')
1320 h.escape(f_path)), category='success')
1314 default_redirect_url = h.route_path(
1321 default_redirect_url = h.route_path(
1315 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1322 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1316
1323
1317 except Exception:
1324 except Exception:
1318 log.exception('Error occurred during commit')
1325 log.exception('Error occurred during commit')
1319 h.flash(_('Error occurred during commit'), category='error')
1326 h.flash(_('Error occurred during commit'), category='error')
1320
1327
1321 raise HTTPFound(default_redirect_url)
1328 raise HTTPFound(default_redirect_url)
1322
1329
1323 @LoginRequired()
1330 @LoginRequired()
1324 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1331 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1325 def repo_files_add_file(self):
1332 def repo_files_add_file(self):
1326 _ = self.request.translate
1333 _ = self.request.translate
1327 c = self.load_default_context()
1334 c = self.load_default_context()
1328 commit_id, f_path = self._get_commit_and_path()
1335 commit_id, f_path = self._get_commit_and_path()
1329
1336
1330 self._ensure_not_locked()
1337 self._ensure_not_locked()
1331
1338
1332 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1339 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1333 if c.commit is None:
1340 if c.commit is None:
1334 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1341 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1335
1342
1336 if self.rhodecode_vcs_repo.is_empty():
1343 if self.rhodecode_vcs_repo.is_empty():
1337 # for empty repository we cannot check for current branch, we rely on
1344 # for empty repository we cannot check for current branch, we rely on
1338 # c.commit.branch instead
1345 # c.commit.branch instead
1339 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1346 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1340 else:
1347 else:
1341 _branch_name, _sha_commit_id, is_head = \
1348 _branch_name, _sha_commit_id, is_head = \
1342 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1349 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1343 landing_ref=self.db_repo.landing_ref_name)
1350 landing_ref=self.db_repo.landing_ref_name)
1344
1351
1345 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1352 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1346 self.check_branch_permission(_branch_name, commit_id=commit_id)
1353 self.check_branch_permission(_branch_name, commit_id=commit_id)
1347
1354
1348 c.default_message = (_('Added file via RhodeCode Enterprise'))
1355 c.default_message = (_('Added file via RhodeCode Enterprise'))
1349 c.f_path = f_path.lstrip('/') # ensure not relative path
1356 c.f_path = f_path.lstrip('/') # ensure not relative path
1350
1357
1351 return self._get_template_context(c)
1358 return self._get_template_context(c)
1352
1359
1353 @LoginRequired()
1360 @LoginRequired()
1354 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1361 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1355 @CSRFRequired()
1362 @CSRFRequired()
1356 def repo_files_create_file(self):
1363 def repo_files_create_file(self):
1357 _ = self.request.translate
1364 _ = self.request.translate
1358 c = self.load_default_context()
1365 c = self.load_default_context()
1359 commit_id, f_path = self._get_commit_and_path()
1366 commit_id, f_path = self._get_commit_and_path()
1360
1367
1361 self._ensure_not_locked()
1368 self._ensure_not_locked()
1362
1369
1363 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1370 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1364 if c.commit is None:
1371 if c.commit is None:
1365 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1372 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1366
1373
1367 # calculate redirect URL
1374 # calculate redirect URL
1368 if self.rhodecode_vcs_repo.is_empty():
1375 if self.rhodecode_vcs_repo.is_empty():
1369 default_redirect_url = h.route_path(
1376 default_redirect_url = h.route_path(
1370 'repo_summary', repo_name=self.db_repo_name)
1377 'repo_summary', repo_name=self.db_repo_name)
1371 else:
1378 else:
1372 default_redirect_url = h.route_path(
1379 default_redirect_url = h.route_path(
1373 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1380 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1374
1381
1375 if self.rhodecode_vcs_repo.is_empty():
1382 if self.rhodecode_vcs_repo.is_empty():
1376 # for empty repository we cannot check for current branch, we rely on
1383 # for empty repository we cannot check for current branch, we rely on
1377 # c.commit.branch instead
1384 # c.commit.branch instead
1378 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1385 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1379 else:
1386 else:
1380 _branch_name, _sha_commit_id, is_head = \
1387 _branch_name, _sha_commit_id, is_head = \
1381 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1388 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1382 landing_ref=self.db_repo.landing_ref_name)
1389 landing_ref=self.db_repo.landing_ref_name)
1383
1390
1384 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1391 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1385 self.check_branch_permission(_branch_name, commit_id=commit_id)
1392 self.check_branch_permission(_branch_name, commit_id=commit_id)
1386
1393
1387 c.default_message = (_('Added file via RhodeCode Enterprise'))
1394 c.default_message = (_('Added file via RhodeCode Enterprise'))
1388 c.f_path = f_path
1395 c.f_path = f_path
1389
1396
1390 r_post = self.request.POST
1397 r_post = self.request.POST
1391 message = r_post.get('message') or c.default_message
1398 message = r_post.get('message') or c.default_message
1392 filename = r_post.get('filename')
1399 filename = r_post.get('filename')
1393 unix_mode = 0
1400 unix_mode = 0
1394 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1401 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1395
1402
1396 if not filename:
1403 if not filename:
1397 # If there's no commit, redirect to repo summary
1404 # If there's no commit, redirect to repo summary
1398 if type(c.commit) is EmptyCommit:
1405 if type(c.commit) is EmptyCommit:
1399 redirect_url = h.route_path(
1406 redirect_url = h.route_path(
1400 'repo_summary', repo_name=self.db_repo_name)
1407 'repo_summary', repo_name=self.db_repo_name)
1401 else:
1408 else:
1402 redirect_url = default_redirect_url
1409 redirect_url = default_redirect_url
1403 h.flash(_('No filename specified'), category='warning')
1410 h.flash(_('No filename specified'), category='warning')
1404 raise HTTPFound(redirect_url)
1411 raise HTTPFound(redirect_url)
1405
1412
1406 root_path = f_path
1413 root_path = f_path
1407 pure_path = self.create_pure_path(root_path, filename)
1414 pure_path = self.create_pure_path(root_path, filename)
1408 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1415 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1409
1416
1410 author = self._rhodecode_db_user.full_contact
1417 author = self._rhodecode_db_user.full_contact
1411 nodes = {
1418 nodes = {
1412 node_path: {
1419 node_path: {
1413 'content': content
1420 'content': content
1414 }
1421 }
1415 }
1422 }
1416
1423
1417 try:
1424 try:
1418
1425
1419 commit = ScmModel().create_nodes(
1426 commit = ScmModel().create_nodes(
1420 user=self._rhodecode_db_user.user_id,
1427 user=self._rhodecode_db_user.user_id,
1421 repo=self.db_repo,
1428 repo=self.db_repo,
1422 message=message,
1429 message=message,
1423 nodes=nodes,
1430 nodes=nodes,
1424 parent_commit=c.commit,
1431 parent_commit=c.commit,
1425 author=author,
1432 author=author,
1426 )
1433 )
1427
1434
1428 h.flash(_('Successfully committed new file `{}`').format(
1435 h.flash(_('Successfully committed new file `{}`').format(
1429 h.escape(node_path)), category='success')
1436 h.escape(node_path)), category='success')
1430
1437
1431 default_redirect_url = h.route_path(
1438 default_redirect_url = h.route_path(
1432 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1439 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1433
1440
1434 except NonRelativePathError:
1441 except NonRelativePathError:
1435 log.exception('Non Relative path found')
1442 log.exception('Non Relative path found')
1436 h.flash(_('The location specified must be a relative path and must not '
1443 h.flash(_('The location specified must be a relative path and must not '
1437 'contain .. in the path'), category='warning')
1444 'contain .. in the path'), category='warning')
1438 raise HTTPFound(default_redirect_url)
1445 raise HTTPFound(default_redirect_url)
1439 except (NodeError, NodeAlreadyExistsError) as e:
1446 except (NodeError, NodeAlreadyExistsError) as e:
1440 h.flash(_(h.escape(e)), category='error')
1447 h.flash(_(h.escape(e)), category='error')
1441 except Exception:
1448 except Exception:
1442 log.exception('Error occurred during commit')
1449 log.exception('Error occurred during commit')
1443 h.flash(_('Error occurred during commit'), category='error')
1450 h.flash(_('Error occurred during commit'), category='error')
1444
1451
1445 raise HTTPFound(default_redirect_url)
1452 raise HTTPFound(default_redirect_url)
1446
1453
1447 @LoginRequired()
1454 @LoginRequired()
1448 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1455 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1449 @CSRFRequired()
1456 @CSRFRequired()
1450 def repo_files_upload_file(self):
1457 def repo_files_upload_file(self):
1451 _ = self.request.translate
1458 _ = self.request.translate
1452 c = self.load_default_context()
1459 c = self.load_default_context()
1453 commit_id, f_path = self._get_commit_and_path()
1460 commit_id, f_path = self._get_commit_and_path()
1454
1461
1455 self._ensure_not_locked()
1462 self._ensure_not_locked()
1456
1463
1457 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1464 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1458 if c.commit is None:
1465 if c.commit is None:
1459 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1466 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1460
1467
1461 # calculate redirect URL
1468 # calculate redirect URL
1462 if self.rhodecode_vcs_repo.is_empty():
1469 if self.rhodecode_vcs_repo.is_empty():
1463 default_redirect_url = h.route_path(
1470 default_redirect_url = h.route_path(
1464 'repo_summary', repo_name=self.db_repo_name)
1471 'repo_summary', repo_name=self.db_repo_name)
1465 else:
1472 else:
1466 default_redirect_url = h.route_path(
1473 default_redirect_url = h.route_path(
1467 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1474 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1468
1475
1469 if self.rhodecode_vcs_repo.is_empty():
1476 if self.rhodecode_vcs_repo.is_empty():
1470 # for empty repository we cannot check for current branch, we rely on
1477 # for empty repository we cannot check for current branch, we rely on
1471 # c.commit.branch instead
1478 # c.commit.branch instead
1472 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1479 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1473 else:
1480 else:
1474 _branch_name, _sha_commit_id, is_head = \
1481 _branch_name, _sha_commit_id, is_head = \
1475 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1482 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1476 landing_ref=self.db_repo.landing_ref_name)
1483 landing_ref=self.db_repo.landing_ref_name)
1477
1484
1478 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1485 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1479 if error:
1486 if error:
1480 return {
1487 return {
1481 'error': error,
1488 'error': error,
1482 'redirect_url': default_redirect_url
1489 'redirect_url': default_redirect_url
1483 }
1490 }
1484 error = self.check_branch_permission(_branch_name, json_mode=True)
1491 error = self.check_branch_permission(_branch_name, json_mode=True)
1485 if error:
1492 if error:
1486 return {
1493 return {
1487 'error': error,
1494 'error': error,
1488 'redirect_url': default_redirect_url
1495 'redirect_url': default_redirect_url
1489 }
1496 }
1490
1497
1491 c.default_message = (_('Uploaded file via RhodeCode Enterprise'))
1498 c.default_message = (_('Uploaded file via RhodeCode Enterprise'))
1492 c.f_path = f_path
1499 c.f_path = f_path
1493
1500
1494 r_post = self.request.POST
1501 r_post = self.request.POST
1495
1502
1496 message = c.default_message
1503 message = c.default_message
1497 user_message = r_post.getall('message')
1504 user_message = r_post.getall('message')
1498 if isinstance(user_message, list) and user_message:
1505 if isinstance(user_message, list) and user_message:
1499 # we take the first from duplicated results if it's not empty
1506 # we take the first from duplicated results if it's not empty
1500 message = user_message[0] if user_message[0] else message
1507 message = user_message[0] if user_message[0] else message
1501
1508
1502 nodes = {}
1509 nodes = {}
1503
1510
1504 for file_obj in r_post.getall('files_upload') or []:
1511 for file_obj in r_post.getall('files_upload') or []:
1505 content = file_obj.file
1512 content = file_obj.file
1506 filename = file_obj.filename
1513 filename = file_obj.filename
1507
1514
1508 root_path = f_path
1515 root_path = f_path
1509 pure_path = self.create_pure_path(root_path, filename)
1516 pure_path = self.create_pure_path(root_path, filename)
1510 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1517 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1511
1518
1512 nodes[node_path] = {
1519 nodes[node_path] = {
1513 'content': content
1520 'content': content
1514 }
1521 }
1515
1522
1516 if not nodes:
1523 if not nodes:
1517 error = 'missing files'
1524 error = 'missing files'
1518 return {
1525 return {
1519 'error': error,
1526 'error': error,
1520 'redirect_url': default_redirect_url
1527 'redirect_url': default_redirect_url
1521 }
1528 }
1522
1529
1523 author = self._rhodecode_db_user.full_contact
1530 author = self._rhodecode_db_user.full_contact
1524
1531
1525 try:
1532 try:
1526 commit = ScmModel().create_nodes(
1533 commit = ScmModel().create_nodes(
1527 user=self._rhodecode_db_user.user_id,
1534 user=self._rhodecode_db_user.user_id,
1528 repo=self.db_repo,
1535 repo=self.db_repo,
1529 message=message,
1536 message=message,
1530 nodes=nodes,
1537 nodes=nodes,
1531 parent_commit=c.commit,
1538 parent_commit=c.commit,
1532 author=author,
1539 author=author,
1533 )
1540 )
            # singular/plural messages were swapped in the original: use the
            # singular form for exactly one file, plural otherwise
            if len(nodes) == 1:
                flash_message = _('Successfully committed 1 new file')
            else:
                flash_message = _('Successfully committed {} new files').format(len(nodes))

            h.flash(flash_message, category='success')

            default_redirect_url = h.route_path(
                'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)

        except NonRelativePathError:
            log.exception('Non Relative path found')
            error = _('The location specified must be a relative path and must not '
                      'contain .. in the path')
            h.flash(error, category='warning')

            return {
                'error': error,
                'redirect_url': default_redirect_url
            }
        except (NodeError, NodeAlreadyExistsError) as e:
            error = h.escape(e)
            h.flash(error, category='error')

            return {
                'error': error,
                'redirect_url': default_redirect_url
            }
        except Exception:
            log.exception('Error occurred during commit')
            error = _('Error occurred during commit')
            h.flash(error, category='error')
            return {
                'error': error,
                'redirect_url': default_redirect_url
            }

        return {
            'error': None,
            'redirect_url': default_redirect_url
        }
@@ -1,254 +1,254 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import datetime
import formencode
import formencode.htmlfill

from pyramid.httpexceptions import HTTPFound

from pyramid.renderers import render
from pyramid.response import Response

from rhodecode import events
from rhodecode.apps._base import RepoAppView, DataGridAppView
from rhodecode.lib.auth import (
    LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous,
    HasRepoPermissionAny, HasPermissionAnyDecorator, CSRFRequired)
import rhodecode.lib.helpers as h
from rhodecode.lib.celerylib.utils import get_task_id
from rhodecode.model.db import coalesce, or_, Repository, RepoGroup
from rhodecode.model.permission import PermissionModel
from rhodecode.model.repo import RepoModel
from rhodecode.model.forms import RepoForkForm
from rhodecode.model.scm import ScmModel, RepoGroupList
from rhodecode.lib.utils2 import safe_int, safe_unicode

log = logging.getLogger(__name__)


class RepoForksView(RepoAppView, DataGridAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context(include_app_defaults=True)
        c.rhodecode_repo = self.rhodecode_vcs_repo

        acl_groups = RepoGroupList(
            RepoGroup.query().all(),
            perm_set=['group.write', 'group.admin'])
        c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
        c.repo_groups_choices = map(lambda k: safe_unicode(k[0]), c.repo_groups)

        c.personal_repo_group = c.rhodecode_user.personal_repo_group
61
61
62 return c
62 return c
63
63
64 @LoginRequired()
64 @LoginRequired()
65 @HasRepoPermissionAnyDecorator(
65 @HasRepoPermissionAnyDecorator(
66 'repository.read', 'repository.write', 'repository.admin')
66 'repository.read', 'repository.write', 'repository.admin')
67 def repo_forks_show_all(self):
67 def repo_forks_show_all(self):
68 c = self.load_default_context()
68 c = self.load_default_context()
69 return self._get_template_context(c)
69 return self._get_template_context(c)
70
70
71 @LoginRequired()
71 @LoginRequired()
72 @HasRepoPermissionAnyDecorator(
72 @HasRepoPermissionAnyDecorator(
73 'repository.read', 'repository.write', 'repository.admin')
73 'repository.read', 'repository.write', 'repository.admin')
74 def repo_forks_data(self):
74 def repo_forks_data(self):
75 _ = self.request.translate
75 _ = self.request.translate
76 self.load_default_context()
76 self.load_default_context()
77 column_map = {
77 column_map = {
78 'fork_name': 'repo_name',
78 'fork_name': 'repo_name',
79 'fork_date': 'created_on',
79 'fork_date': 'created_on',
80 'last_activity': 'updated_on'
80 'last_activity': 'updated_on'
81 }
81 }
82 draw, start, limit = self._extract_chunk(self.request)
82 draw, start, limit = self._extract_chunk(self.request)
83 search_q, order_by, order_dir = self._extract_ordering(
83 search_q, order_by, order_dir = self._extract_ordering(
84 self.request, column_map=column_map)
84 self.request, column_map=column_map)
85
85
86 acl_check = HasRepoPermissionAny(
86 acl_check = HasRepoPermissionAny(
87 'repository.read', 'repository.write', 'repository.admin')
87 'repository.read', 'repository.write', 'repository.admin')
88 repo_id = self.db_repo.repo_id
88 repo_id = self.db_repo.repo_id
89 allowed_ids = [-1]
89 allowed_ids = [-1]
90 for f in Repository.query().filter(Repository.fork_id == repo_id):
90 for f in Repository.query().filter(Repository.fork_id == repo_id):
91 if acl_check(f.repo_name, 'get forks check'):
91 if acl_check(f.repo_name, 'get forks check'):
92 allowed_ids.append(f.repo_id)
92 allowed_ids.append(f.repo_id)
93
93
94 forks_data_total_count = Repository.query()\
94 forks_data_total_count = Repository.query()\
95 .filter(Repository.fork_id == repo_id)\
95 .filter(Repository.fork_id == repo_id)\
96 .filter(Repository.repo_id.in_(allowed_ids))\
96 .filter(Repository.repo_id.in_(allowed_ids))\
97 .count()
97 .count()
98
98
99 # json generate
99 # json generate
100 base_q = Repository.query()\
100 base_q = Repository.query()\
101 .filter(Repository.fork_id == repo_id)\
101 .filter(Repository.fork_id == repo_id)\
102 .filter(Repository.repo_id.in_(allowed_ids))\
102 .filter(Repository.repo_id.in_(allowed_ids))\
103
103
104 if search_q:
104 if search_q:
105 like_expression = u'%{}%'.format(safe_unicode(search_q))
105 like_expression = u'%{}%'.format(safe_unicode(search_q))
106 base_q = base_q.filter(or_(
106 base_q = base_q.filter(or_(
107 Repository.repo_name.ilike(like_expression),
107 Repository.repo_name.ilike(like_expression),
108 Repository.description.ilike(like_expression),
108 Repository.description.ilike(like_expression),
109 ))
109 ))
110
110
111 forks_data_total_filtered_count = base_q.count()
111 forks_data_total_filtered_count = base_q.count()
112
112
113 sort_col = getattr(Repository, order_by, None)
113 sort_col = getattr(Repository, order_by, None)
114 if sort_col:
114 if sort_col:
115 if order_dir == 'asc':
115 if order_dir == 'asc':
116 # handle null values properly to order by NULL last
116 # handle null values properly to order by NULL last
117 if order_by in ['last_activity']:
117 if order_by in ['last_activity']:
118 sort_col = coalesce(sort_col, datetime.date.max)
118 sort_col = coalesce(sort_col, datetime.date.max)
119 sort_col = sort_col.asc()
119 sort_col = sort_col.asc()
120 else:
120 else:
121 # handle null values properly to order by NULL last
121 # handle null values properly to order by NULL last
122 if order_by in ['last_activity']:
122 if order_by in ['last_activity']:
123 sort_col = coalesce(sort_col, datetime.date.min)
123 sort_col = coalesce(sort_col, datetime.date.min)
124 sort_col = sort_col.desc()
124 sort_col = sort_col.desc()
125
125
126 base_q = base_q.order_by(sort_col)
126 base_q = base_q.order_by(sort_col)
127 base_q = base_q.offset(start).limit(limit)
127 base_q = base_q.offset(start).limit(limit)
128
128
129 fork_list = base_q.all()
129 fork_list = base_q.all()
130
130
131 def fork_actions(fork):
131 def fork_actions(fork):
132 url_link = h.route_path(
132 url_link = h.route_path(
133 'repo_compare',
133 'repo_compare',
134 repo_name=fork.repo_name,
134 repo_name=fork.repo_name,
135 source_ref_type=self.db_repo.landing_ref_type,
135 source_ref_type=self.db_repo.landing_ref_type,
136 source_ref=self.db_repo.landing_ref_name,
136 source_ref=self.db_repo.landing_ref_name,
137 target_ref_type=self.db_repo.landing_ref_type,
137 target_ref_type=self.db_repo.landing_ref_type,
138 target_ref=self.db_repo.landing_ref_name,
138 target_ref=self.db_repo.landing_ref_name,
139 _query=dict(merge=1, target_repo=f.repo_name))
139 _query=dict(merge=1, target_repo=f.repo_name))
140 return h.link_to(_('Compare fork'), url_link, class_='btn-link')
140 return h.link_to(_('Compare fork'), url_link, class_='btn-link')
141
141
142 def fork_name(fork):
142 def fork_name(fork):
143 return h.link_to(fork.repo_name,
143 return h.link_to(fork.repo_name,
144 h.route_path('repo_summary', repo_name=fork.repo_name))
144 h.route_path('repo_summary', repo_name=fork.repo_name))
145
145
146 forks_data = []
146 forks_data = []
147 for fork in fork_list:
147 for fork in fork_list:
148 forks_data.append({
148 forks_data.append({
149 "username": h.gravatar_with_user(self.request, fork.user.username),
149 "username": h.gravatar_with_user(self.request, fork.user.username),
150 "fork_name": fork_name(fork),
150 "fork_name": fork_name(fork),
151 "description": fork.description_safe,
151 "description": fork.description_safe,
152 "fork_date": h.age_component(fork.created_on, time_is_local=True),
152 "fork_date": h.age_component(fork.created_on, time_is_local=True),
153 "last_activity": h.format_date(fork.updated_on),
153 "last_activity": h.format_date(fork.updated_on),
154 "action": fork_actions(fork),
154 "action": fork_actions(fork),
155 })
155 })
156
156
157 data = ({
157 data = ({
158 'draw': draw,
158 'draw': draw,
159 'data': forks_data,
159 'data': forks_data,
160 'recordsTotal': forks_data_total_count,
160 'recordsTotal': forks_data_total_count,
161 'recordsFiltered': forks_data_total_filtered_count,
161 'recordsFiltered': forks_data_total_filtered_count,
162 })
162 })
163
163
164 return data
164 return data
165
165
166 @LoginRequired()
166 @LoginRequired()
167 @NotAnonymous()
167 @NotAnonymous()
168 @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
168 @HasPermissionAnyDecorator('hg.admin', PermissionModel.FORKING_ENABLED)
169 @HasRepoPermissionAnyDecorator(
169 @HasRepoPermissionAnyDecorator(
170 'repository.read', 'repository.write', 'repository.admin')
170 'repository.read', 'repository.write', 'repository.admin')
171 def repo_fork_new(self):
171 def repo_fork_new(self):
172 c = self.load_default_context()
172 c = self.load_default_context()
173
173
174 defaults = RepoModel()._get_defaults(self.db_repo_name)
174 defaults = RepoModel()._get_defaults(self.db_repo_name)
175 # alter the description to indicate a fork
175 # alter the description to indicate a fork
176 defaults['description'] = (
176 defaults['description'] = (
177 'fork of repository: %s \n%s' % (
177 'fork of repository: %s \n%s' % (
178 defaults['repo_name'], defaults['description']))
178 defaults['repo_name'], defaults['description']))
179 # add suffix to fork
179 # add suffix to fork
180 defaults['repo_name'] = '%s-fork' % defaults['repo_name']
180 defaults['repo_name'] = '%s-fork' % defaults['repo_name']
181
181
182 data = render('rhodecode:templates/forks/fork.mako',
182 data = render('rhodecode:templates/forks/fork.mako',
183 self._get_template_context(c), self.request)
183 self._get_template_context(c), self.request)
184 html = formencode.htmlfill.render(
184 html = formencode.htmlfill.render(
185 data,
185 data,
186 defaults=defaults,
186 defaults=defaults,
187 encoding="UTF-8",
187 encoding="UTF-8",
188 force_defaults=False
188 force_defaults=False
189 )
189 )
190 return Response(html)
190 return Response(html)
191
191
192 @LoginRequired()
192 @LoginRequired()
193 @NotAnonymous()
193 @NotAnonymous()
194 @HasPermissionAnyDecorator('hg.admin', 'hg.fork.repository')
194 @HasPermissionAnyDecorator('hg.admin', PermissionModel.FORKING_ENABLED)
195 @HasRepoPermissionAnyDecorator(
195 @HasRepoPermissionAnyDecorator(
196 'repository.read', 'repository.write', 'repository.admin')
196 'repository.read', 'repository.write', 'repository.admin')
197 @CSRFRequired()
197 @CSRFRequired()
198 def repo_fork_create(self):
198 def repo_fork_create(self):
199 _ = self.request.translate
199 _ = self.request.translate
200 c = self.load_default_context()
200 c = self.load_default_context()
201
201
202 _form = RepoForkForm(self.request.translate,
202 _form = RepoForkForm(self.request.translate,
203 old_data={'repo_type': self.db_repo.repo_type},
203 old_data={'repo_type': self.db_repo.repo_type},
204 repo_groups=c.repo_groups_choices)()
204 repo_groups=c.repo_groups_choices)()
205 post_data = dict(self.request.POST)
205 post_data = dict(self.request.POST)
206
206
207 # forbid injecting other repo by forging a request
207 # forbid injecting other repo by forging a request
208 post_data['fork_parent_id'] = self.db_repo.repo_id
208 post_data['fork_parent_id'] = self.db_repo.repo_id
209 post_data['landing_rev'] = self.db_repo._landing_revision
209 post_data['landing_rev'] = self.db_repo._landing_revision
210
210
211 form_result = {}
211 form_result = {}
212 task_id = None
212 task_id = None
213 try:
213 try:
214 form_result = _form.to_python(post_data)
214 form_result = _form.to_python(post_data)
215 copy_permissions = form_result.get('copy_permissions')
215 copy_permissions = form_result.get('copy_permissions')
216 # create fork is done sometimes async on celery, db transaction
216 # create fork is done sometimes async on celery, db transaction
217 # management is handled there.
217 # management is handled there.
218 task = RepoModel().create_fork(
218 task = RepoModel().create_fork(
219 form_result, c.rhodecode_user.user_id)
219 form_result, c.rhodecode_user.user_id)
220
220
221 task_id = get_task_id(task)
221 task_id = get_task_id(task)
222 except formencode.Invalid as errors:
222 except formencode.Invalid as errors:
223 c.rhodecode_db_repo = self.db_repo
223 c.rhodecode_db_repo = self.db_repo
224
224
225 data = render('rhodecode:templates/forks/fork.mako',
225 data = render('rhodecode:templates/forks/fork.mako',
226 self._get_template_context(c), self.request)
226 self._get_template_context(c), self.request)
227 html = formencode.htmlfill.render(
227 html = formencode.htmlfill.render(
228 data,
228 data,
229 defaults=errors.value,
229 defaults=errors.value,
230 errors=errors.error_dict or {},
230 errors=errors.error_dict or {},
231 prefix_error=False,
231 prefix_error=False,
232 encoding="UTF-8",
232 encoding="UTF-8",
233 force_defaults=False
233 force_defaults=False
234 )
234 )
235 return Response(html)
235 return Response(html)
236 except Exception:
236 except Exception:
237 log.exception(
237 log.exception(
238 u'Exception while trying to fork the repository %s', self.db_repo_name)
238 u'Exception while trying to fork the repository %s', self.db_repo_name)
239 msg = _('An error occurred during repository forking %s') % (self.db_repo_name, )
239 msg = _('An error occurred during repository forking %s') % (self.db_repo_name, )
240 h.flash(msg, category='error')
240 h.flash(msg, category='error')
241 raise HTTPFound(h.route_path('home'))
241 raise HTTPFound(h.route_path('home'))
242
242
243 repo_name = form_result.get('repo_name_full', self.db_repo_name)
243 repo_name = form_result.get('repo_name_full', self.db_repo_name)
244
244
245 affected_user_ids = [self._rhodecode_user.user_id]
245 affected_user_ids = [self._rhodecode_user.user_id]
246 if copy_permissions:
246 if copy_permissions:
247 # permission flush is done in repo creating
247 # permission flush is done in repo creating
248 pass
248 pass
249
249
250 PermissionModel().trigger_permission_flush(affected_user_ids)
250 PermissionModel().trigger_permission_flush(affected_user_ids)
251
251
252 raise HTTPFound(
252 raise HTTPFound(
253 h.route_path('repo_creating', repo_name=repo_name,
253 h.route_path('repo_creating', repo_name=repo_name,
254 _query=dict(task_id=task_id)))
254 _query=dict(task_id=task_id)))
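
The NULL-handling in the `repo_forks_data()` ordering above can be illustrated with a small, dependency-free sketch (hypothetical data; a plain-Python stand-in for SQLAlchemy's `coalesce`): missing activity timestamps are substituted with `datetime.max` when sorting ascending and `datetime.min` when sorting descending, so forks without recorded activity always sort last.

```python
import datetime

# Hypothetical fork rows; 'b-fork' has no recorded activity (NULL in the DB).
forks = [
    {'repo_name': 'a-fork', 'updated_on': datetime.datetime(2021, 3, 1)},
    {'repo_name': 'b-fork', 'updated_on': None},
    {'repo_name': 'c-fork', 'updated_on': datetime.datetime(2021, 1, 15)},
]

def sort_forks(rows, order_dir='asc'):
    # Mirror coalesce(sort_col, datetime.date.max / datetime.date.min):
    # pick the fallback so that None values land at the end in both directions.
    if order_dir == 'asc':
        fallback, reverse = datetime.datetime.max, False
    else:
        fallback, reverse = datetime.datetime.min, True
    return sorted(rows, key=lambda r: r['updated_on'] or fallback, reverse=reverse)

print([r['repo_name'] for r in sort_forks(forks)])          # NULL row last (asc)
print([r['repo_name'] for r in sort_forks(forks, 'desc')])  # NULL row last (desc)
```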
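
The request-forging guard in `repo_fork_create()` — overwriting `fork_parent_id` and `landing_rev` server-side before form validation — can be sketched in isolation. `sanitize_fork_post` and its arguments are hypothetical names for illustration, not part of the RhodeCode API:

```python
def sanitize_fork_post(post_data, db_repo_id, landing_revision):
    """Copy client POST data, then force server-derived fields so a forged
    form cannot fork a different repository than the one in the URL."""
    data = dict(post_data)                  # never mutate the raw request data
    data['fork_parent_id'] = db_repo_id     # bind the fork to the URL's repo
    data['landing_rev'] = landing_revision  # ignore any client-sent landing rev
    return data

# A forged request claiming fork_parent_id=999 is silently corrected to 42.
clean = sanitize_fork_post(
    {'repo_name': 'x-fork', 'fork_parent_id': 999},
    db_repo_id=42, landing_revision='branch:default')
print(clean)
```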