release: Merge default into stable for release preparation
milka -
r4566:091769de merge stable


@@ -0,0 +1,89 b''
1 |RCE| 4.23.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2020-11-20
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Comments: introduced new draft comments.
14
15 * drafts are private to the author
16 * they don't trigger any notifications
17 * the sidebar doesn't display draft comments
18 * they act as placeholders for a longer review.
19
20 - Comments: when channelstream is enabled, comments are pushed live, so there's no
21 need to refresh the page to see other participants' comments.
22 New comments are marked in the sidebar.
23
24 - Comments: multiple changes on comments navigation/display logic.
25
26 * the toggle icon is smarter, opening/hiding windows according to actions, e.g. commenting opens threads
27 * toggles are more explicit
28 * it's possible to hide/show single threads using the toggle icon.
29 * new UI for showing thread comments
30
31 - Reviewers: new logic for author/commit-author rules.
32 It's now possible to define whether the author or commit author should be excluded from, or always included in, a review.
33 - Reviewers: a PR with no reviewers can now be merged, unless review rules require some.
34 The use case is that a PR can be created without a review needed, e.g. just for sharing or CI checks.
35 - Pull requests: permanently save the sorting state of columns for pull-request grids.
36 - Commit ranges: enable combined diff compare directly from the range selector.
37
38
39 General
40 ^^^^^^^
41
42 - Authentication: enable custom names for auth plugins. It's now possible to name the
43 authentication buttons for SAML plugins.
44 - Login: optimized UI for login/register/password reset windows.
45 - Repo mapper: make it more resilient to errors; it's better to execute and skip certain
46 repositories than to crash the whole mapper.
47 - Markdown: improved styling, and fixed the nl2br extension to only add br on new elements, not inline.
48 - Pull requests: show the PR version in the my-account and repo PR listing grids.
49 - Archives: allow obtaining archives without the commit short id in the name, for
50 better automation of obtained artifacts.
51 A new URL flag, `?with_hash=1`, controls this.
52 - Error document: update info about stored exception retrieval.
53 - Range diff: enable hovercards for commits in range-diff.
54
55
56 Security
57 ^^^^^^^^
58
59
60
61 Performance
62 ^^^^^^^^^^^
63
64 - Improved repo archive logic; running the archiver is now much faster because the
65 extra VCSServer communication was removed and the job is delegated to VCSServer itself.
66 - Improved VCSServer startup times.
67 - Notifications: skip double rendering just to generate the email title/description.
68 Those are now re-used for better performance when creating notifications.
69 - App: improve logging, and remove DB calls on app startup.
70
71
72 Fixes
73 ^^^^^
74
75 - Login/register: fixed a header width problem on mobile devices.
76 - Exception tracker: don't fail on an empty request, e.g. in the context of the celery app.
77 - Exceptions: improved reporting of unhandled vcsserver exceptions.
78 - Sidebar: fixed refresh of TODOs url.
79 - Remap-rescan: fixes #5636 initial rescan problem.
80 - API: fixed SVN raw diff export. The API method was inconsistent and used different logic;
81 now it shares the same code as raw-diff from the web UI.
82
83
84 Upgrade notes
85 ^^^^^^^^^^^^^
86
87 - Scheduled feature release.
88 Please note that the reviewers logic has changed a bit: it's now possible to create a pull request
89 without any reviewers initially, and such a pull request doesn't need approval for merging.
1 NO CONTENT: new file 100644
1 NO CONTENT: new file 100644
@@ -1,5 +1,5 b''
1 1 [bumpversion]
2 current_version = 4.22.0
2 current_version = 4.23.0
3 3 message = release: Bump version {current_version} to {new_version}
4 4
5 5 [bumpversion:file:rhodecode/VERSION]
@@ -1,33 +1,28 b''
1 1 [DEFAULT]
2 2 done = false
3 3
4 4 [task:bump_version]
5 5 done = true
6 6
7 7 [task:rc_tools_pinned]
8 done = true
9 8
10 9 [task:fixes_on_stable]
11 done = true
12 10
13 11 [task:pip2nix_generated]
14 done = true
15 12
16 13 [task:changelog_updated]
17 done = true
18 14
19 15 [task:generate_api_docs]
20 done = true
16
17 [task:updated_translation]
21 18
22 19 [release]
23 state = prepared
24 version = 4.22.0
25
26 [task:updated_translation]
20 state = in_progress
21 version = 4.23.0
27 22
28 23 [task:generate_js_routes]
29 24
30 25 [task:updated_trial_license]
31 26
32 27 [task:generate_oss_licenses]
33 28
@@ -1,148 +1,149 b''
1 1 .. _rhodecode-release-notes-ref:
2 2
3 3 Release Notes
4 4 =============
5 5
6 6 |RCE| 4.x Versions
7 7 ------------------
8 8
9 9 .. toctree::
10 10 :maxdepth: 1
11 11
12 release-notes-4.23.0.rst
12 13 release-notes-4.22.0.rst
13 14 release-notes-4.21.0.rst
14 15 release-notes-4.20.1.rst
15 16 release-notes-4.20.0.rst
16 17 release-notes-4.19.3.rst
17 18 release-notes-4.19.2.rst
18 19 release-notes-4.19.1.rst
19 20 release-notes-4.19.0.rst
20 21 release-notes-4.18.3.rst
21 22 release-notes-4.18.2.rst
22 23 release-notes-4.18.1.rst
23 24 release-notes-4.18.0.rst
24 25 release-notes-4.17.4.rst
25 26 release-notes-4.17.3.rst
26 27 release-notes-4.17.2.rst
27 28 release-notes-4.17.1.rst
28 29 release-notes-4.17.0.rst
29 30 release-notes-4.16.2.rst
30 31 release-notes-4.16.1.rst
31 32 release-notes-4.16.0.rst
32 33 release-notes-4.15.2.rst
33 34 release-notes-4.15.1.rst
34 35 release-notes-4.15.0.rst
35 36 release-notes-4.14.1.rst
36 37 release-notes-4.14.0.rst
37 38 release-notes-4.13.3.rst
38 39 release-notes-4.13.2.rst
39 40 release-notes-4.13.1.rst
40 41 release-notes-4.13.0.rst
41 42 release-notes-4.12.4.rst
42 43 release-notes-4.12.3.rst
43 44 release-notes-4.12.2.rst
44 45 release-notes-4.12.1.rst
45 46 release-notes-4.12.0.rst
46 47 release-notes-4.11.6.rst
47 48 release-notes-4.11.5.rst
48 49 release-notes-4.11.4.rst
49 50 release-notes-4.11.3.rst
50 51 release-notes-4.11.2.rst
51 52 release-notes-4.11.1.rst
52 53 release-notes-4.11.0.rst
53 54 release-notes-4.10.6.rst
54 55 release-notes-4.10.5.rst
55 56 release-notes-4.10.4.rst
56 57 release-notes-4.10.3.rst
57 58 release-notes-4.10.2.rst
58 59 release-notes-4.10.1.rst
59 60 release-notes-4.10.0.rst
60 61 release-notes-4.9.1.rst
61 62 release-notes-4.9.0.rst
62 63 release-notes-4.8.0.rst
63 64 release-notes-4.7.2.rst
64 65 release-notes-4.7.1.rst
65 66 release-notes-4.7.0.rst
66 67 release-notes-4.6.1.rst
67 68 release-notes-4.6.0.rst
68 69 release-notes-4.5.2.rst
69 70 release-notes-4.5.1.rst
70 71 release-notes-4.5.0.rst
71 72 release-notes-4.4.2.rst
72 73 release-notes-4.4.1.rst
73 74 release-notes-4.4.0.rst
74 75 release-notes-4.3.1.rst
75 76 release-notes-4.3.0.rst
76 77 release-notes-4.2.1.rst
77 78 release-notes-4.2.0.rst
78 79 release-notes-4.1.2.rst
79 80 release-notes-4.1.1.rst
80 81 release-notes-4.1.0.rst
81 82 release-notes-4.0.1.rst
82 83 release-notes-4.0.0.rst
83 84
84 85 |RCE| 3.x Versions
85 86 ------------------
86 87
87 88 .. toctree::
88 89 :maxdepth: 1
89 90
90 91 release-notes-3.8.4.rst
91 92 release-notes-3.8.3.rst
92 93 release-notes-3.8.2.rst
93 94 release-notes-3.8.1.rst
94 95 release-notes-3.8.0.rst
95 96 release-notes-3.7.1.rst
96 97 release-notes-3.7.0.rst
97 98 release-notes-3.6.1.rst
98 99 release-notes-3.6.0.rst
99 100 release-notes-3.5.2.rst
100 101 release-notes-3.5.1.rst
101 102 release-notes-3.5.0.rst
102 103 release-notes-3.4.1.rst
103 104 release-notes-3.4.0.rst
104 105 release-notes-3.3.4.rst
105 106 release-notes-3.3.3.rst
106 107 release-notes-3.3.2.rst
107 108 release-notes-3.3.1.rst
108 109 release-notes-3.3.0.rst
109 110 release-notes-3.2.3.rst
110 111 release-notes-3.2.2.rst
111 112 release-notes-3.2.1.rst
112 113 release-notes-3.2.0.rst
113 114 release-notes-3.1.1.rst
114 115 release-notes-3.1.0.rst
115 116 release-notes-3.0.2.rst
116 117 release-notes-3.0.1.rst
117 118 release-notes-3.0.0.rst
118 119
119 120 |RCE| 2.x Versions
120 121 ------------------
121 122
122 123 .. toctree::
123 124 :maxdepth: 1
124 125
125 126 release-notes-2.2.8.rst
126 127 release-notes-2.2.7.rst
127 128 release-notes-2.2.6.rst
128 129 release-notes-2.2.5.rst
129 130 release-notes-2.2.4.rst
130 131 release-notes-2.2.3.rst
131 132 release-notes-2.2.2.rst
132 133 release-notes-2.2.1.rst
133 134 release-notes-2.2.0.rst
134 135 release-notes-2.1.0.rst
135 136 release-notes-2.0.2.rst
136 137 release-notes-2.0.1.rst
137 138 release-notes-2.0.0.rst
138 139
139 140 |RCE| 1.x Versions
140 141 ------------------
141 142
142 143 .. toctree::
143 144 :maxdepth: 1
144 145
145 146 release-notes-1.7.2.rst
146 147 release-notes-1.7.1.rst
147 148 release-notes-1.7.0.rst
148 149 release-notes-1.6.0.rst
@@ -1,2509 +1,2509 b''
1 1 # Generated by pip2nix 0.8.0.dev1
2 2 # See https://github.com/johbo/pip2nix
3 3
4 4 { pkgs, fetchurl, fetchgit, fetchhg }:
5 5
6 6 self: super: {
7 7 "alembic" = super.buildPythonPackage {
8 8 name = "alembic-1.4.2";
9 9 doCheck = false;
10 10 propagatedBuildInputs = [
11 11 self."sqlalchemy"
12 12 self."mako"
13 13 self."python-editor"
14 14 self."python-dateutil"
15 15 ];
16 16 src = fetchurl {
17 17 url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
18 18 sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
19 19 };
20 20 meta = {
21 21 license = [ pkgs.lib.licenses.mit ];
22 22 };
23 23 };
24 24 "amqp" = super.buildPythonPackage {
25 25 name = "amqp-2.5.2";
26 26 doCheck = false;
27 27 propagatedBuildInputs = [
28 28 self."vine"
29 29 ];
30 30 src = fetchurl {
31 31 url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
32 32 sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
33 33 };
34 34 meta = {
35 35 license = [ pkgs.lib.licenses.bsdOriginal ];
36 36 };
37 37 };
38 38 "apispec" = super.buildPythonPackage {
39 39 name = "apispec-1.0.0";
40 40 doCheck = false;
41 41 propagatedBuildInputs = [
42 42 self."PyYAML"
43 43 ];
44 44 src = fetchurl {
45 45 url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
46 46 sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
47 47 };
48 48 meta = {
49 49 license = [ pkgs.lib.licenses.mit ];
50 50 };
51 51 };
52 52 "appenlight-client" = super.buildPythonPackage {
53 53 name = "appenlight-client-0.6.26";
54 54 doCheck = false;
55 55 propagatedBuildInputs = [
56 56 self."webob"
57 57 self."requests"
58 58 self."six"
59 59 ];
60 60 src = fetchurl {
61 61 url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
62 62 sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
63 63 };
64 64 meta = {
65 65 license = [ pkgs.lib.licenses.bsdOriginal ];
66 66 };
67 67 };
68 68 "asn1crypto" = super.buildPythonPackage {
69 69 name = "asn1crypto-0.24.0";
70 70 doCheck = false;
71 71 src = fetchurl {
72 72 url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
73 73 sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
74 74 };
75 75 meta = {
76 76 license = [ pkgs.lib.licenses.mit ];
77 77 };
78 78 };
79 79 "atomicwrites" = super.buildPythonPackage {
80 80 name = "atomicwrites-1.3.0";
81 81 doCheck = false;
82 82 src = fetchurl {
83 83 url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
84 84 sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
85 85 };
86 86 meta = {
87 87 license = [ pkgs.lib.licenses.mit ];
88 88 };
89 89 };
90 90 "attrs" = super.buildPythonPackage {
91 91 name = "attrs-19.3.0";
92 92 doCheck = false;
93 93 src = fetchurl {
94 94 url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
95 95 sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
96 96 };
97 97 meta = {
98 98 license = [ pkgs.lib.licenses.mit ];
99 99 };
100 100 };
101 101 "babel" = super.buildPythonPackage {
102 102 name = "babel-1.3";
103 103 doCheck = false;
104 104 propagatedBuildInputs = [
105 105 self."pytz"
106 106 ];
107 107 src = fetchurl {
108 108 url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
109 109 sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
110 110 };
111 111 meta = {
112 112 license = [ pkgs.lib.licenses.bsdOriginal ];
113 113 };
114 114 };
115 115 "backports.shutil-get-terminal-size" = super.buildPythonPackage {
116 116 name = "backports.shutil-get-terminal-size-1.0.0";
117 117 doCheck = false;
118 118 src = fetchurl {
119 119 url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
120 120 sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
121 121 };
122 122 meta = {
123 123 license = [ pkgs.lib.licenses.mit ];
124 124 };
125 125 };
126 126 "beaker" = super.buildPythonPackage {
127 127 name = "beaker-1.9.1";
128 128 doCheck = false;
129 129 propagatedBuildInputs = [
130 130 self."funcsigs"
131 131 ];
132 132 src = fetchurl {
133 133 url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
134 134 sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
135 135 };
136 136 meta = {
137 137 license = [ pkgs.lib.licenses.bsdOriginal ];
138 138 };
139 139 };
140 140 "beautifulsoup4" = super.buildPythonPackage {
141 141 name = "beautifulsoup4-4.6.3";
142 142 doCheck = false;
143 143 src = fetchurl {
144 144 url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
145 145 sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
146 146 };
147 147 meta = {
148 148 license = [ pkgs.lib.licenses.mit ];
149 149 };
150 150 };
151 151 "billiard" = super.buildPythonPackage {
152 152 name = "billiard-3.6.1.0";
153 153 doCheck = false;
154 154 src = fetchurl {
155 155 url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
156 156 sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
157 157 };
158 158 meta = {
159 159 license = [ pkgs.lib.licenses.bsdOriginal ];
160 160 };
161 161 };
162 162 "bleach" = super.buildPythonPackage {
163 163 name = "bleach-3.1.3";
164 164 doCheck = false;
165 165 propagatedBuildInputs = [
166 166 self."six"
167 167 self."webencodings"
168 168 ];
169 169 src = fetchurl {
170 170 url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
171 171 sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
172 172 };
173 173 meta = {
174 174 license = [ pkgs.lib.licenses.asl20 ];
175 175 };
176 176 };
177 177 "bumpversion" = super.buildPythonPackage {
178 178 name = "bumpversion-0.5.3";
179 179 doCheck = false;
180 180 src = fetchurl {
181 181 url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
182 182 sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
183 183 };
184 184 meta = {
185 185 license = [ pkgs.lib.licenses.mit ];
186 186 };
187 187 };
188 188 "cachetools" = super.buildPythonPackage {
189 189 name = "cachetools-3.1.1";
190 190 doCheck = false;
191 191 src = fetchurl {
192 192 url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
193 193 sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
194 194 };
195 195 meta = {
196 196 license = [ pkgs.lib.licenses.mit ];
197 197 };
198 198 };
199 199 "celery" = super.buildPythonPackage {
200 200 name = "celery-4.3.0";
201 201 doCheck = false;
202 202 propagatedBuildInputs = [
203 203 self."pytz"
204 204 self."billiard"
205 205 self."kombu"
206 206 self."vine"
207 207 ];
208 208 src = fetchurl {
209 209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
210 210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
211 211 };
212 212 meta = {
213 213 license = [ pkgs.lib.licenses.bsdOriginal ];
214 214 };
215 215 };
216 216 "certifi" = super.buildPythonPackage {
217 217 name = "certifi-2020.4.5.1";
218 218 doCheck = false;
219 219 src = fetchurl {
220 220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
221 221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
222 222 };
223 223 meta = {
224 224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
225 225 };
226 226 };
227 227 "cffi" = super.buildPythonPackage {
228 228 name = "cffi-1.12.3";
229 229 doCheck = false;
230 230 propagatedBuildInputs = [
231 231 self."pycparser"
232 232 ];
233 233 src = fetchurl {
234 234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
235 235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
236 236 };
237 237 meta = {
238 238 license = [ pkgs.lib.licenses.mit ];
239 239 };
240 240 };
241 241 "chameleon" = super.buildPythonPackage {
242 242 name = "chameleon-2.24";
243 243 doCheck = false;
244 244 src = fetchurl {
245 245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
246 246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
247 247 };
248 248 meta = {
249 249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
250 250 };
251 251 };
252 252 "channelstream" = super.buildPythonPackage {
253 253 name = "channelstream-0.6.14";
254 254 doCheck = false;
255 255 propagatedBuildInputs = [
256 256 self."gevent"
257 257 self."ws4py"
258 258 self."marshmallow"
259 259 self."python-dateutil"
260 260 self."pyramid"
261 261 self."pyramid-jinja2"
262 262 self."pyramid-apispec"
263 263 self."itsdangerous"
264 264 self."requests"
265 265 self."six"
266 266 ];
267 267 src = fetchurl {
268 268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
269 269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
270 270 };
271 271 meta = {
272 272 license = [ pkgs.lib.licenses.bsdOriginal ];
273 273 };
274 274 };
275 275 "chardet" = super.buildPythonPackage {
276 276 name = "chardet-3.0.4";
277 277 doCheck = false;
278 278 src = fetchurl {
279 279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
280 280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
281 281 };
282 282 meta = {
283 283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
284 284 };
285 285 };
286 286 "click" = super.buildPythonPackage {
287 287 name = "click-7.0";
288 288 doCheck = false;
289 289 src = fetchurl {
290 290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
291 291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
292 292 };
293 293 meta = {
294 294 license = [ pkgs.lib.licenses.bsdOriginal ];
295 295 };
296 296 };
297 297 "colander" = super.buildPythonPackage {
298 298 name = "colander-1.7.0";
299 299 doCheck = false;
300 300 propagatedBuildInputs = [
301 301 self."translationstring"
302 302 self."iso8601"
303 303 self."enum34"
304 304 ];
305 305 src = fetchurl {
306 306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
307 307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
308 308 };
309 309 meta = {
310 310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 311 };
312 312 };
313 313 "configobj" = super.buildPythonPackage {
314 314 name = "configobj-5.0.6";
315 315 doCheck = false;
316 316 propagatedBuildInputs = [
317 317 self."six"
318 318 ];
319 319 src = fetchurl {
320 320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 322 };
323 323 meta = {
324 324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 325 };
326 326 };
327 327 "configparser" = super.buildPythonPackage {
328 328 name = "configparser-4.0.2";
329 329 doCheck = false;
330 330 src = fetchurl {
331 331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 333 };
334 334 meta = {
335 335 license = [ pkgs.lib.licenses.mit ];
336 336 };
337 337 };
338 338 "contextlib2" = super.buildPythonPackage {
339 339 name = "contextlib2-0.6.0.post1";
340 340 doCheck = false;
341 341 src = fetchurl {
342 342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 344 };
345 345 meta = {
346 346 license = [ pkgs.lib.licenses.psfl ];
347 347 };
348 348 };
349 349 "cov-core" = super.buildPythonPackage {
350 350 name = "cov-core-1.15.0";
351 351 doCheck = false;
352 352 propagatedBuildInputs = [
353 353 self."coverage"
354 354 ];
355 355 src = fetchurl {
356 356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 358 };
359 359 meta = {
360 360 license = [ pkgs.lib.licenses.mit ];
361 361 };
362 362 };
363 363 "coverage" = super.buildPythonPackage {
364 364 name = "coverage-4.5.4";
365 365 doCheck = false;
366 366 src = fetchurl {
367 367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 369 };
370 370 meta = {
371 371 license = [ pkgs.lib.licenses.asl20 ];
372 372 };
373 373 };
374 374 "cryptography" = super.buildPythonPackage {
375 375 name = "cryptography-2.6.1";
376 376 doCheck = false;
377 377 propagatedBuildInputs = [
378 378 self."asn1crypto"
379 379 self."six"
380 380 self."cffi"
381 381 self."enum34"
382 382 self."ipaddress"
383 383 ];
384 384 src = fetchurl {
385 385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 387 };
388 388 meta = {
389 389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 390 };
391 391 };
392 392 "cssselect" = super.buildPythonPackage {
393 393 name = "cssselect-1.0.3";
394 394 doCheck = false;
395 395 src = fetchurl {
396 396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 398 };
399 399 meta = {
400 400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 401 };
402 402 };
403 403 "cssutils" = super.buildPythonPackage {
404 404 name = "cssutils-1.0.2";
405 405 doCheck = false;
406 406 src = fetchurl {
407 407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 409 };
410 410 meta = {
411 411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 412 };
413 413 };
414 414 "decorator" = super.buildPythonPackage {
415 415 name = "decorator-4.1.2";
416 416 doCheck = false;
417 417 src = fetchurl {
418 418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 420 };
421 421 meta = {
422 422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 423 };
424 424 };
425 425 "deform" = super.buildPythonPackage {
426 426 name = "deform-2.0.8";
427 427 doCheck = false;
428 428 propagatedBuildInputs = [
429 429 self."chameleon"
430 430 self."colander"
431 431 self."iso8601"
432 432 self."peppercorn"
433 433 self."translationstring"
434 434 self."zope.deprecation"
435 435 ];
436 436 src = fetchurl {
437 437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 439 };
440 440 meta = {
441 441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 442 };
443 443 };
444 444 "defusedxml" = super.buildPythonPackage {
445 445 name = "defusedxml-0.6.0";
446 446 doCheck = false;
447 447 src = fetchurl {
448 448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 450 };
451 451 meta = {
452 452 license = [ pkgs.lib.licenses.psfl ];
453 453 };
454 454 };
455 455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 456 name = "dm.xmlsec.binding-1.3.7";
457 457 doCheck = false;
458 458 propagatedBuildInputs = [
459 459 self."setuptools"
460 460 self."lxml"
461 461 ];
462 462 src = fetchurl {
463 463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 465 };
466 466 meta = {
467 467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 468 };
469 469 };
470 470 "docutils" = super.buildPythonPackage {
471 471 name = "docutils-0.16";
472 472 doCheck = false;
473 473 src = fetchurl {
474 474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 476 };
477 477 meta = {
478 478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 479 };
480 480 };
481 481 "dogpile.cache" = super.buildPythonPackage {
482 482 name = "dogpile.cache-0.9.0";
483 483 doCheck = false;
484 484 propagatedBuildInputs = [
485 485 self."decorator"
486 486 ];
487 487 src = fetchurl {
488 488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 490 };
491 491 meta = {
492 492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 493 };
494 494 };
495 495 "dogpile.core" = super.buildPythonPackage {
496 496 name = "dogpile.core-0.4.1";
497 497 doCheck = false;
498 498 src = fetchurl {
499 499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 501 };
502 502 meta = {
503 503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 504 };
505 505 };
506 506 "ecdsa" = super.buildPythonPackage {
507 507 name = "ecdsa-0.13.2";
508 508 doCheck = false;
509 509 src = fetchurl {
510 510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 512 };
513 513 meta = {
514 514 license = [ pkgs.lib.licenses.mit ];
515 515 };
516 516 };
517 517 "elasticsearch" = super.buildPythonPackage {
518 518 name = "elasticsearch-6.3.1";
519 519 doCheck = false;
520 520 propagatedBuildInputs = [
521 521 self."urllib3"
522 522 ];
523 523 src = fetchurl {
524 524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 526 };
527 527 meta = {
528 528 license = [ pkgs.lib.licenses.asl20 ];
529 529 };
530 530 };
531 531 "elasticsearch-dsl" = super.buildPythonPackage {
532 532 name = "elasticsearch-dsl-6.3.1";
533 533 doCheck = false;
534 534 propagatedBuildInputs = [
535 535 self."six"
      self."python-dateutil"
      self."elasticsearch"
      self."ipaddress"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
      sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "elasticsearch1" = super.buildPythonPackage {
    name = "elasticsearch1-1.10.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."urllib3"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
      sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "elasticsearch1-dsl" = super.buildPythonPackage {
    name = "elasticsearch1-dsl-0.0.12";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."python-dateutil"
      self."elasticsearch1"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
      sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "elasticsearch2" = super.buildPythonPackage {
    name = "elasticsearch2-2.5.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."urllib3"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
      sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "entrypoints" = super.buildPythonPackage {
    name = "entrypoints-0.2.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."configparser"
    ];
    src = fetchurl {
      url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
      sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "enum34" = super.buildPythonPackage {
    name = "enum34-1.1.10";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
      sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "formencode" = super.buildPythonPackage {
    name = "formencode-1.2.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
      sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "funcsigs" = super.buildPythonPackage {
    name = "funcsigs-1.0.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
      sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
    };
    meta = {
      license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
    };
  };
  "functools32" = super.buildPythonPackage {
    name = "functools32-3.2.3.post2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
      sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "future" = super.buildPythonPackage {
    name = "future-0.14.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
      sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
    };
    meta = {
      license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
    };
  };
  "futures" = super.buildPythonPackage {
    name = "futures-3.0.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
      sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "gevent" = super.buildPythonPackage {
    name = "gevent-1.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."greenlet"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
      sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "gnureadline" = super.buildPythonPackage {
    name = "gnureadline-6.3.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
      sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
    };
    meta = {
      license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
    };
  };
  "gprof2dot" = super.buildPythonPackage {
    name = "gprof2dot-2017.9.19";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
      sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
    };
    meta = {
      license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
    };
  };
  "greenlet" = super.buildPythonPackage {
    name = "greenlet-0.4.15";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
      sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "gunicorn" = super.buildPythonPackage {
    name = "gunicorn-19.9.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
      sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "hupper" = super.buildPythonPackage {
    name = "hupper-1.10.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
      sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "idna" = super.buildPythonPackage {
    name = "idna-2.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
      sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
    };
  };
  "importlib-metadata" = super.buildPythonPackage {
    name = "importlib-metadata-1.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."zipp"
      self."pathlib2"
      self."contextlib2"
      self."configparser"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
      sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "infrae.cache" = super.buildPythonPackage {
    name = "infrae.cache-1.0.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."beaker"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
      sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "invoke" = super.buildPythonPackage {
    name = "invoke-0.13.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
      sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipaddress" = super.buildPythonPackage {
    name = "ipaddress-1.0.23";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
      sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "ipdb" = super.buildPythonPackage {
    name = "ipdb-0.13.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."ipython"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
      sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython" = super.buildPythonPackage {
    name = "ipython-5.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."decorator"
      self."pickleshare"
      self."simplegeneric"
      self."traitlets"
      self."prompt-toolkit"
      self."pygments"
      self."pexpect"
      self."backports.shutil-get-terminal-size"
      self."pathlib2"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
      sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython-genutils" = super.buildPythonPackage {
    name = "ipython-genutils-0.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
      sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "iso8601" = super.buildPythonPackage {
    name = "iso8601-0.1.12";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
      sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "isodate" = super.buildPythonPackage {
    name = "isodate-0.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
      sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "itsdangerous" = super.buildPythonPackage {
    name = "itsdangerous-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
      sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jinja2" = super.buildPythonPackage {
    name = "jinja2-2.9.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
      sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jsonschema" = super.buildPythonPackage {
    name = "jsonschema-2.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."functools32"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
      sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "jupyter-client" = super.buildPythonPackage {
    name = "jupyter-client-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
      self."jupyter-core"
      self."pyzmq"
      self."python-dateutil"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
      sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jupyter-core" = super.buildPythonPackage {
    name = "jupyter-core-4.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
      sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "kombu" = super.buildPythonPackage {
    name = "kombu-4.6.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."amqp"
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
      sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "lxml" = super.buildPythonPackage {
    name = "lxml-4.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
      sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mako" = super.buildPythonPackage {
    name = "mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
      sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "markdown" = super.buildPythonPackage {
    name = "markdown-2.6.11";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
      sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "markupsafe" = super.buildPythonPackage {
    name = "markupsafe-1.1.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
      sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
    };
  };
  "marshmallow" = super.buildPythonPackage {
    name = "marshmallow-2.18.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
      sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "mistune" = super.buildPythonPackage {
    name = "mistune-0.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
      sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mock" = super.buildPythonPackage {
    name = "mock-3.0.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."funcsigs"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
      sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
    };
  };
  "more-itertools" = super.buildPythonPackage {
    name = "more-itertools-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
      sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "msgpack-python" = super.buildPythonPackage {
    name = "msgpack-python-0.5.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
      sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "mysql-python" = super.buildPythonPackage {
    name = "mysql-python-1.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
      sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  "nbconvert" = super.buildPythonPackage {
    name = "nbconvert-5.3.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."mistune"
      self."jinja2"
      self."pygments"
      self."traitlets"
      self."jupyter-core"
      self."nbformat"
      self."entrypoints"
      self."bleach"
      self."pandocfilters"
      self."testpath"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
      sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "nbformat" = super.buildPythonPackage {
    name = "nbformat-4.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."ipython-genutils"
      self."traitlets"
      self."jsonschema"
      self."jupyter-core"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
      sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "packaging" = super.buildPythonPackage {
    name = "packaging-20.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyparsing"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
      sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
    };
  };
  "pandocfilters" = super.buildPythonPackage {
    name = "pandocfilters-1.4.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
      sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "paste" = super.buildPythonPackage {
    name = "paste-3.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
      sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastedeploy" = super.buildPythonPackage {
    name = "pastedeploy-2.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
      sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pastescript" = super.buildPythonPackage {
    name = "pastescript-3.2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."paste"
      self."pastedeploy"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
      sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pathlib2" = super.buildPythonPackage {
    name = "pathlib2-2.3.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."scandir"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
      sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "peppercorn" = super.buildPythonPackage {
    name = "peppercorn-0.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
      sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pexpect" = super.buildPythonPackage {
    name = "pexpect-4.8.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."ptyprocess"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
      sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
    };
    meta = {
      license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
    };
  };
  "pickleshare" = super.buildPythonPackage {
    name = "pickleshare-0.7.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."pathlib2"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
      sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "plaster" = super.buildPythonPackage {
    name = "plaster-1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
      sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "plaster-pastedeploy" = super.buildPythonPackage {
    name = "plaster-pastedeploy-0.7";
    doCheck = false;
    propagatedBuildInputs = [
      self."pastedeploy"
      self."plaster"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
      sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pluggy" = super.buildPythonPackage {
    name = "pluggy-0.13.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
      sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "premailer" = super.buildPythonPackage {
    name = "premailer-3.6.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."lxml"
      self."cssselect"
      self."cssutils"
      self."requests"
      self."cachetools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
      sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
    };
  };
  "prompt-toolkit" = super.buildPythonPackage {
    name = "prompt-toolkit-1.0.18";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."wcwidth"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
      sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "psutil" = super.buildPythonPackage {
    name = "psutil-5.7.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
      sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "psycopg2" = super.buildPythonPackage {
    name = "psycopg2-2.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
      sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
    };
  };
  "ptyprocess" = super.buildPythonPackage {
    name = "ptyprocess-0.6.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
      sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
    };
    meta = {
      license = [ ];
    };
  };
  "py" = super.buildPythonPackage {
    name = "py-1.8.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
      sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "py-bcrypt" = super.buildPythonPackage {
    name = "py-bcrypt-0.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
      sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "py-gfm" = super.buildPythonPackage {
    name = "py-gfm-0.1.4";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."markdown"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
      sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyasn1" = super.buildPythonPackage {
    name = "pyasn1-0.4.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
      sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyasn1-modules" = super.buildPythonPackage {
    name = "pyasn1-modules-0.2.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyasn1"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
      sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
    };
  };
  "pycparser" = super.buildPythonPackage {
    name = "pycparser-2.20";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
      sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pycrypto" = super.buildPythonPackage {
    name = "pycrypto-2.6.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
      sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1436 1436 meta = {
1437 1437 license = [ pkgs.lib.licenses.publicDomain ];
1438 1438 };
1439 1439 };
1440 1440 "pycurl" = super.buildPythonPackage {
1441 1441 name = "pycurl-7.43.0.3";
1442 1442 doCheck = false;
1443 1443 src = fetchurl {
1444 1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1445 1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1446 1446 };
1447 1447 meta = {
1448 1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1449 1449 };
1450 1450 };
1451 1451 "pygments" = super.buildPythonPackage {
1452 1452 name = "pygments-2.4.2";
1453 1453 doCheck = false;
1454 1454 src = fetchurl {
1455 1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1456 1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1457 1457 };
1458 1458 meta = {
1459 1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1460 1460 };
1461 1461 };
1462 1462 "pymysql" = super.buildPythonPackage {
1463 1463 name = "pymysql-0.8.1";
1464 1464 doCheck = false;
1465 1465 src = fetchurl {
1466 1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1467 1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1468 1468 };
1469 1469 meta = {
1470 1470 license = [ pkgs.lib.licenses.mit ];
1471 1471 };
1472 1472 };
1473 1473 "pyotp" = super.buildPythonPackage {
1474 1474 name = "pyotp-2.3.0";
1475 1475 doCheck = false;
1476 1476 src = fetchurl {
1477 1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1478 1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1479 1479 };
1480 1480 meta = {
1481 1481 license = [ pkgs.lib.licenses.mit ];
1482 1482 };
1483 1483 };
1484 1484 "pyparsing" = super.buildPythonPackage {
1485 1485 name = "pyparsing-2.4.7";
1486 1486 doCheck = false;
1487 1487 src = fetchurl {
1488 1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1489 1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1490 1490 };
1491 1491 meta = {
1492 1492 license = [ pkgs.lib.licenses.mit ];
1493 1493 };
1494 1494 };
1495 1495 "pyramid" = super.buildPythonPackage {
1496 1496 name = "pyramid-1.10.4";
1497 1497 doCheck = false;
1498 1498 propagatedBuildInputs = [
1499 1499 self."hupper"
1500 1500 self."plaster"
1501 1501 self."plaster-pastedeploy"
1502 1502 self."setuptools"
1503 1503 self."translationstring"
1504 1504 self."venusian"
1505 1505 self."webob"
1506 1506 self."zope.deprecation"
1507 1507 self."zope.interface"
1508 1508 self."repoze.lru"
1509 1509 ];
1510 1510 src = fetchurl {
1511 1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1512 1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1513 1513 };
1514 1514 meta = {
1515 1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1516 1516 };
1517 1517 };
1518 1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1519 1519 name = "pyramid-debugtoolbar-4.6.1";
1520 1520 doCheck = false;
1521 1521 propagatedBuildInputs = [
1522 1522 self."pyramid"
1523 1523 self."pyramid-mako"
1524 1524 self."repoze.lru"
1525 1525 self."pygments"
1526 1526 self."ipaddress"
1527 1527 ];
1528 1528 src = fetchurl {
1529 1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1530 1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1531 1531 };
1532 1532 meta = {
1533 1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1534 1534 };
1535 1535 };
1536 1536 "pyramid-jinja2" = super.buildPythonPackage {
1537 1537 name = "pyramid-jinja2-2.7";
1538 1538 doCheck = false;
1539 1539 propagatedBuildInputs = [
1540 1540 self."pyramid"
1541 1541 self."zope.deprecation"
1542 1542 self."jinja2"
1543 1543 self."markupsafe"
1544 1544 ];
1545 1545 src = fetchurl {
1546 1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1547 1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1548 1548 };
1549 1549 meta = {
1550 1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1551 1551 };
1552 1552 };
1553 1553 "pyramid-apispec" = super.buildPythonPackage {
1554 1554 name = "pyramid-apispec-0.3.2";
1555 1555 doCheck = false;
1556 1556 propagatedBuildInputs = [
1557 1557 self."apispec"
1558 1558 ];
1559 1559 src = fetchurl {
1560 1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1561 1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1562 1562 };
1563 1563 meta = {
1564 1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1565 1565 };
1566 1566 };
1567 1567 "pyramid-mailer" = super.buildPythonPackage {
1568 1568 name = "pyramid-mailer-0.15.1";
1569 1569 doCheck = false;
1570 1570 propagatedBuildInputs = [
1571 1571 self."pyramid"
1572 1572 self."repoze.sendmail"
1573 1573 self."transaction"
1574 1574 ];
1575 1575 src = fetchurl {
1576 1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1577 1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1578 1578 };
1579 1579 meta = {
1580 1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1581 1581 };
1582 1582 };
1583 1583 "pyramid-mako" = super.buildPythonPackage {
1584 1584 name = "pyramid-mako-1.1.0";
1585 1585 doCheck = false;
1586 1586 propagatedBuildInputs = [
1587 1587 self."pyramid"
1588 1588 self."mako"
1589 1589 ];
1590 1590 src = fetchurl {
1591 1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1592 1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1593 1593 };
1594 1594 meta = {
1595 1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1596 1596 };
1597 1597 };
1598 1598 "pysqlite" = super.buildPythonPackage {
1599 1599 name = "pysqlite-2.8.3";
1600 1600 doCheck = false;
1601 1601 src = fetchurl {
1602 1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1603 1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1604 1604 };
1605 1605 meta = {
1606 1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1607 1607 };
1608 1608 };
1609 1609 "pytest" = super.buildPythonPackage {
1610 1610 name = "pytest-4.6.5";
1611 1611 doCheck = false;
1612 1612 propagatedBuildInputs = [
1613 1613 self."py"
1614 1614 self."six"
1615 1615 self."packaging"
1616 1616 self."attrs"
1617 1617 self."atomicwrites"
1618 1618 self."pluggy"
1619 1619 self."importlib-metadata"
1620 1620 self."wcwidth"
1621 1621 self."funcsigs"
1622 1622 self."pathlib2"
1623 1623 self."more-itertools"
1624 1624 ];
1625 1625 src = fetchurl {
1626 1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1627 1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1628 1628 };
1629 1629 meta = {
1630 1630 license = [ pkgs.lib.licenses.mit ];
1631 1631 };
1632 1632 };
1633 1633 "pytest-cov" = super.buildPythonPackage {
1634 1634 name = "pytest-cov-2.7.1";
1635 1635 doCheck = false;
1636 1636 propagatedBuildInputs = [
1637 1637 self."pytest"
1638 1638 self."coverage"
1639 1639 ];
1640 1640 src = fetchurl {
1641 1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1642 1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1643 1643 };
1644 1644 meta = {
1645 1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1646 1646 };
1647 1647 };
1648 1648 "pytest-profiling" = super.buildPythonPackage {
1649 1649 name = "pytest-profiling-1.7.0";
1650 1650 doCheck = false;
1651 1651 propagatedBuildInputs = [
1652 1652 self."six"
1653 1653 self."pytest"
1654 1654 self."gprof2dot"
1655 1655 ];
1656 1656 src = fetchurl {
1657 1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1658 1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1659 1659 };
1660 1660 meta = {
1661 1661 license = [ pkgs.lib.licenses.mit ];
1662 1662 };
1663 1663 };
1664 1664 "pytest-runner" = super.buildPythonPackage {
1665 1665 name = "pytest-runner-5.1";
1666 1666 doCheck = false;
1667 1667 src = fetchurl {
1668 1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1669 1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1670 1670 };
1671 1671 meta = {
1672 1672 license = [ pkgs.lib.licenses.mit ];
1673 1673 };
1674 1674 };
1675 1675 "pytest-sugar" = super.buildPythonPackage {
1676 1676 name = "pytest-sugar-0.9.2";
1677 1677 doCheck = false;
1678 1678 propagatedBuildInputs = [
1679 1679 self."pytest"
1680 1680 self."termcolor"
1681 1681 self."packaging"
1682 1682 ];
1683 1683 src = fetchurl {
1684 1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1685 1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1686 1686 };
1687 1687 meta = {
1688 1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1689 1689 };
1690 1690 };
1691 1691 "pytest-timeout" = super.buildPythonPackage {
1692 1692 name = "pytest-timeout-1.3.3";
1693 1693 doCheck = false;
1694 1694 propagatedBuildInputs = [
1695 1695 self."pytest"
1696 1696 ];
1697 1697 src = fetchurl {
1698 1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1699 1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1700 1700 };
1701 1701 meta = {
1702 1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1703 1703 };
1704 1704 };
1705 1705 "python-dateutil" = super.buildPythonPackage {
1706 1706 name = "python-dateutil-2.8.1";
1707 1707 doCheck = false;
1708 1708 propagatedBuildInputs = [
1709 1709 self."six"
1710 1710 ];
1711 1711 src = fetchurl {
1712 1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1713 1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1714 1714 };
1715 1715 meta = {
1716 1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1717 1717 };
1718 1718 };
1719 1719 "python-editor" = super.buildPythonPackage {
1720 1720 name = "python-editor-1.0.4";
1721 1721 doCheck = false;
1722 1722 src = fetchurl {
1723 1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1724 1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1725 1725 };
1726 1726 meta = {
1727 1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1728 1728 };
1729 1729 };
1730 1730 "python-ldap" = super.buildPythonPackage {
1731 1731 name = "python-ldap-3.2.0";
1732 1732 doCheck = false;
1733 1733 propagatedBuildInputs = [
1734 1734 self."pyasn1"
1735 1735 self."pyasn1-modules"
1736 1736 ];
1737 1737 src = fetchurl {
1738 1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1739 1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1740 1740 };
1741 1741 meta = {
1742 1742 license = [ pkgs.lib.licenses.psfl ];
1743 1743 };
1744 1744 };
1745 1745 "python-memcached" = super.buildPythonPackage {
1746 1746 name = "python-memcached-1.59";
1747 1747 doCheck = false;
1748 1748 propagatedBuildInputs = [
1749 1749 self."six"
1750 1750 ];
1751 1751 src = fetchurl {
1752 1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1753 1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1754 1754 };
1755 1755 meta = {
1756 1756 license = [ pkgs.lib.licenses.psfl ];
1757 1757 };
1758 1758 };
1759 1759 "python-pam" = super.buildPythonPackage {
1760 1760 name = "python-pam-1.8.4";
1761 1761 doCheck = false;
1762 1762 src = fetchurl {
1763 1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1764 1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1765 1765 };
1766 1766 meta = {
1767 1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1768 1768 };
1769 1769 };
1770 1770 "python-saml" = super.buildPythonPackage {
1771 1771 name = "python-saml-2.4.2";
1772 1772 doCheck = false;
1773 1773 propagatedBuildInputs = [
1774 1774 self."dm.xmlsec.binding"
1775 1775 self."isodate"
1776 1776 self."defusedxml"
1777 1777 ];
1778 1778 src = fetchurl {
1779 1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1780 1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1781 1781 };
1782 1782 meta = {
1783 1783 license = [ pkgs.lib.licenses.mit ];
1784 1784 };
1785 1785 };
1786 1786 "pytz" = super.buildPythonPackage {
1787 1787 name = "pytz-2019.3";
1788 1788 doCheck = false;
1789 1789 src = fetchurl {
1790 1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1791 1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1792 1792 };
1793 1793 meta = {
1794 1794 license = [ pkgs.lib.licenses.mit ];
1795 1795 };
1796 1796 };
1797 1797 "pyzmq" = super.buildPythonPackage {
1798 1798 name = "pyzmq-14.6.0";
1799 1799 doCheck = false;
1800 1800 src = fetchurl {
1801 1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1802 1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1803 1803 };
1804 1804 meta = {
1805 1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1806 1806 };
1807 1807 };
1808 1808 "PyYAML" = super.buildPythonPackage {
1809 1809 name = "PyYAML-5.3.1";
1810 1810 doCheck = false;
1811 1811 src = fetchurl {
1812 1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1813 1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1814 1814 };
1815 1815 meta = {
1816 1816 license = [ pkgs.lib.licenses.mit ];
1817 1817 };
1818 1818 };
1819 1819 "regex" = super.buildPythonPackage {
1820 1820 name = "regex-2020.9.27";
1821 1821 doCheck = false;
1822 1822 src = fetchurl {
1823 1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1824 1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1825 1825 };
1826 1826 meta = {
1827 1827 license = [ pkgs.lib.licenses.psfl ];
1828 1828 };
1829 1829 };
1830 1830 "redis" = super.buildPythonPackage {
1831 1831 name = "redis-3.4.1";
1832 1832 doCheck = false;
1833 1833 src = fetchurl {
1834 1834 url = "https://files.pythonhosted.org/packages/ef/2e/2c0f59891db7db087a7eeaa79bc7c7f2c039e71a2b5b0a41391e9d462926/redis-3.4.1.tar.gz";
1835 1835 sha256 = "07yaj0j9fs7xdkg5bg926fa990khyigjbp31si8ai20vj8sv7kqd";
1836 1836 };
1837 1837 meta = {
1838 1838 license = [ pkgs.lib.licenses.mit ];
1839 1839 };
1840 1840 };
1841 1841 "repoze.lru" = super.buildPythonPackage {
1842 1842 name = "repoze.lru-0.7";
1843 1843 doCheck = false;
1844 1844 src = fetchurl {
1845 1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1846 1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1847 1847 };
1848 1848 meta = {
1849 1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1850 1850 };
1851 1851 };
1852 1852 "repoze.sendmail" = super.buildPythonPackage {
1853 1853 name = "repoze.sendmail-4.4.1";
1854 1854 doCheck = false;
1855 1855 propagatedBuildInputs = [
1856 1856 self."setuptools"
1857 1857 self."zope.interface"
1858 1858 self."transaction"
1859 1859 ];
1860 1860 src = fetchurl {
1861 1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1862 1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1863 1863 };
1864 1864 meta = {
1865 1865 license = [ pkgs.lib.licenses.zpl21 ];
1866 1866 };
1867 1867 };
1868 1868 "requests" = super.buildPythonPackage {
1869 1869 name = "requests-2.22.0";
1870 1870 doCheck = false;
1871 1871 propagatedBuildInputs = [
1872 1872 self."chardet"
1873 1873 self."idna"
1874 1874 self."urllib3"
1875 1875 self."certifi"
1876 1876 ];
1877 1877 src = fetchurl {
1878 1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1879 1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1880 1880 };
1881 1881 meta = {
1882 1882 license = [ pkgs.lib.licenses.asl20 ];
1883 1883 };
1884 1884 };
1885 1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1886 name = "rhodecode-enterprise-ce-4.22.0";
1886 name = "rhodecode-enterprise-ce-4.23.0";
1887 1887 buildInputs = [
1888 1888 self."pytest"
1889 1889 self."py"
1890 1890 self."pytest-cov"
1891 1891 self."pytest-sugar"
1892 1892 self."pytest-runner"
1893 1893 self."pytest-profiling"
1894 1894 self."pytest-timeout"
1895 1895 self."gprof2dot"
1896 1896 self."mock"
1897 1897 self."cov-core"
1898 1898 self."coverage"
1899 1899 self."webtest"
1900 1900 self."beautifulsoup4"
1901 1901 self."configobj"
1902 1902 ];
1903 1903 doCheck = true;
1904 1904 propagatedBuildInputs = [
1905 1905 self."amqp"
1906 1906 self."babel"
1907 1907 self."beaker"
1908 1908 self."bleach"
1909 1909 self."celery"
1910 1910 self."channelstream"
1911 1911 self."click"
1912 1912 self."colander"
1913 1913 self."configobj"
1914 1914 self."cssselect"
1915 1915 self."cryptography"
1916 1916 self."decorator"
1917 1917 self."deform"
1918 1918 self."docutils"
1919 1919 self."dogpile.cache"
1920 1920 self."dogpile.core"
1921 1921 self."formencode"
1922 1922 self."future"
1923 1923 self."futures"
1924 1924 self."infrae.cache"
1925 1925 self."iso8601"
1926 1926 self."itsdangerous"
1927 1927 self."kombu"
1928 1928 self."lxml"
1929 1929 self."mako"
1930 1930 self."markdown"
1931 1931 self."markupsafe"
1932 1932 self."msgpack-python"
1933 1933 self."pyotp"
1934 1934 self."packaging"
1935 1935 self."pathlib2"
1936 1936 self."paste"
1937 1937 self."pastedeploy"
1938 1938 self."pastescript"
1939 1939 self."peppercorn"
1940 1940 self."premailer"
1941 1941 self."psutil"
1942 1942 self."py-bcrypt"
1943 1943 self."pycurl"
1944 1944 self."pycrypto"
1945 1945 self."pygments"
1946 1946 self."pyparsing"
1947 1947 self."pyramid-debugtoolbar"
1948 1948 self."pyramid-mako"
1949 1949 self."pyramid"
1950 1950 self."pyramid-mailer"
1951 1951 self."python-dateutil"
1952 1952 self."python-ldap"
1953 1953 self."python-memcached"
1954 1954 self."python-pam"
1955 1955 self."python-saml"
1956 1956 self."pytz"
1957 1957 self."tzlocal"
1958 1958 self."pyzmq"
1959 1959 self."py-gfm"
1960 1960 self."regex"
1961 1961 self."redis"
1962 1962 self."repoze.lru"
1963 1963 self."requests"
1964 1964 self."routes"
1965 1965 self."simplejson"
1966 1966 self."six"
1967 1967 self."sqlalchemy"
1968 1968 self."sshpubkeys"
1969 1969 self."subprocess32"
1970 1970 self."supervisor"
1971 1971 self."translationstring"
1972 1972 self."urllib3"
1973 1973 self."urlobject"
1974 1974 self."venusian"
1975 1975 self."weberror"
1976 1976 self."webhelpers2"
1977 1977 self."webob"
1978 1978 self."whoosh"
1979 1979 self."wsgiref"
1980 1980 self."zope.cachedescriptors"
1981 1981 self."zope.deprecation"
1982 1982 self."zope.event"
1983 1983 self."zope.interface"
1984 1984 self."mysql-python"
1985 1985 self."pymysql"
1986 1986 self."pysqlite"
1987 1987 self."psycopg2"
1988 1988 self."nbconvert"
1989 1989 self."nbformat"
1990 1990 self."jupyter-client"
1991 1991 self."jupyter-core"
1992 1992 self."alembic"
1993 1993 self."invoke"
1994 1994 self."bumpversion"
1995 1995 self."gevent"
1996 1996 self."greenlet"
1997 1997 self."gunicorn"
1998 1998 self."waitress"
1999 1999 self."ipdb"
2000 2000 self."ipython"
2001 2001 self."rhodecode-tools"
2002 2002 self."appenlight-client"
2003 2003 self."pytest"
2004 2004 self."py"
2005 2005 self."pytest-cov"
2006 2006 self."pytest-sugar"
2007 2007 self."pytest-runner"
2008 2008 self."pytest-profiling"
2009 2009 self."pytest-timeout"
2010 2010 self."gprof2dot"
2011 2011 self."mock"
2012 2012 self."cov-core"
2013 2013 self."coverage"
2014 2014 self."webtest"
2015 2015 self."beautifulsoup4"
2016 2016 ];
2017 2017 src = ./.;
2018 2018 meta = {
2019 2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2020 2020 };
2021 2021 };
2022 2022 "rhodecode-tools" = super.buildPythonPackage {
2023 2023 name = "rhodecode-tools-1.4.0";
2024 2024 doCheck = false;
2025 2025 propagatedBuildInputs = [
2026 2026 self."click"
2027 2027 self."future"
2028 2028 self."six"
2029 2029 self."mako"
2030 2030 self."markupsafe"
2031 2031 self."requests"
2032 2032 self."urllib3"
2033 2033 self."whoosh"
2034 2034 self."elasticsearch"
2035 2035 self."elasticsearch-dsl"
2036 2036 self."elasticsearch2"
2037 2037 self."elasticsearch1-dsl"
2038 2038 ];
2039 2039 src = fetchurl {
2040 2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2041 2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2042 2042 };
2043 2043 meta = {
2044 2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2045 2045 };
2046 2046 };
2047 2047 "routes" = super.buildPythonPackage {
2048 2048 name = "routes-2.4.1";
2049 2049 doCheck = false;
2050 2050 propagatedBuildInputs = [
2051 2051 self."six"
2052 2052 self."repoze.lru"
2053 2053 ];
2054 2054 src = fetchurl {
2055 2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2056 2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2057 2057 };
2058 2058 meta = {
2059 2059 license = [ pkgs.lib.licenses.mit ];
2060 2060 };
2061 2061 };
2062 2062 "scandir" = super.buildPythonPackage {
2063 2063 name = "scandir-1.10.0";
2064 2064 doCheck = false;
2065 2065 src = fetchurl {
2066 2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2067 2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2068 2068 };
2069 2069 meta = {
2070 2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2071 2071 };
2072 2072 };
2073 2073 "setproctitle" = super.buildPythonPackage {
2074 2074 name = "setproctitle-1.1.10";
2075 2075 doCheck = false;
2076 2076 src = fetchurl {
2077 2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2078 2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2079 2079 };
2080 2080 meta = {
2081 2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2082 2082 };
2083 2083 };
2084 2084 "setuptools" = super.buildPythonPackage {
2085 2085 name = "setuptools-44.1.0";
2086 2086 doCheck = false;
2087 2087 src = fetchurl {
2088 2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2089 2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2090 2090 };
2091 2091 meta = {
2092 2092 license = [ pkgs.lib.licenses.mit ];
2093 2093 };
2094 2094 };
2095 2095 "simplegeneric" = super.buildPythonPackage {
2096 2096 name = "simplegeneric-0.8.1";
2097 2097 doCheck = false;
2098 2098 src = fetchurl {
2099 2099 url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
2100 2100 sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
2101 2101 };
2102 2102 meta = {
2103 2103 license = [ pkgs.lib.licenses.zpl21 ];
2104 2104 };
2105 2105 };
2106 2106 "simplejson" = super.buildPythonPackage {
2107 2107 name = "simplejson-3.16.0";
2108 2108 doCheck = false;
2109 2109 src = fetchurl {
2110 2110 url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
2111 2111 sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
2112 2112 };
2113 2113 meta = {
2114 2114 license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
2115 2115 };
2116 2116 };
2117 2117 "six" = super.buildPythonPackage {
2118 2118 name = "six-1.11.0";
2119 2119 doCheck = false;
2120 2120 src = fetchurl {
2121 2121 url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
2122 2122 sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
2123 2123 };
2124 2124 meta = {
2125 2125 license = [ pkgs.lib.licenses.mit ];
2126 2126 };
2127 2127 };
2128 2128 "sqlalchemy" = super.buildPythonPackage {
2129 2129 name = "sqlalchemy-1.3.15";
2130 2130 doCheck = false;
2131 2131 src = fetchurl {
2132 2132 url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
2133 2133 sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
2134 2134 };
2135 2135 meta = {
2136 2136 license = [ pkgs.lib.licenses.mit ];
2137 2137 };
2138 2138 };
2139 2139 "sshpubkeys" = super.buildPythonPackage {
2140 2140 name = "sshpubkeys-3.1.0";
2141 2141 doCheck = false;
2142 2142 propagatedBuildInputs = [
2143 2143 self."cryptography"
2144 2144 self."ecdsa"
2145 2145 ];
2146 2146 src = fetchurl {
2147 2147 url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
2148 2148 sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
2149 2149 };
2150 2150 meta = {
2151 2151 license = [ pkgs.lib.licenses.bsdOriginal ];
2152 2152 };
2153 2153 };
2154 2154 "subprocess32" = super.buildPythonPackage {
2155 2155 name = "subprocess32-3.5.4";
2156 2156 doCheck = false;
2157 2157 src = fetchurl {
2158 2158 url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
2159 2159 sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
2160 2160 };
2161 2161 meta = {
2162 2162 license = [ pkgs.lib.licenses.psfl ];
2163 2163 };
2164 2164 };
2165 2165 "supervisor" = super.buildPythonPackage {
2166 2166 name = "supervisor-4.1.0";
2167 2167 doCheck = false;
2168 2168 src = fetchurl {
2169 2169 url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
2170 2170 sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
2171 2171 };
2172 2172 meta = {
2173 2173 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2174 2174 };
2175 2175 };
2176 2176 "tempita" = super.buildPythonPackage {
2177 2177 name = "tempita-0.5.2";
2178 2178 doCheck = false;
2179 2179 src = fetchurl {
2180 2180 url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
2181 2181 sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
2182 2182 };
2183 2183 meta = {
2184 2184 license = [ pkgs.lib.licenses.mit ];
2185 2185 };
2186 2186 };
2187 2187 "termcolor" = super.buildPythonPackage {
2188 2188 name = "termcolor-1.1.0";
2189 2189 doCheck = false;
2190 2190 src = fetchurl {
2191 2191 url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
2192 2192 sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
2193 2193 };
2194 2194 meta = {
2195 2195 license = [ pkgs.lib.licenses.mit ];
2196 2196 };
2197 2197 };
2198 2198 "testpath" = super.buildPythonPackage {
2199 2199 name = "testpath-0.4.4";
2200 2200 doCheck = false;
2201 2201 src = fetchurl {
2202 2202 url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
2203 2203 sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
2204 2204 };
2205 2205 meta = {
2206 2206 license = [ ];
2207 2207 };
2208 2208 };
2209 2209 "traitlets" = super.buildPythonPackage {
2210 2210 name = "traitlets-4.3.3";
2211 2211 doCheck = false;
2212 2212 propagatedBuildInputs = [
2213 2213 self."ipython-genutils"
2214 2214 self."six"
2215 2215 self."decorator"
2216 2216 self."enum34"
2217 2217 ];
2218 2218 src = fetchurl {
2219 2219 url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
2220 2220 sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
2221 2221 };
2222 2222 meta = {
2223 2223 license = [ pkgs.lib.licenses.bsdOriginal ];
2224 2224 };
2225 2225 };
2226 2226 "transaction" = super.buildPythonPackage {
2227 2227 name = "transaction-2.4.0";
2228 2228 doCheck = false;
2229 2229 propagatedBuildInputs = [
2230 2230 self."zope.interface"
2231 2231 ];
2232 2232 src = fetchurl {
2233 2233 url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
2234 2234 sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
2235 2235 };
2236 2236 meta = {
2237 2237 license = [ pkgs.lib.licenses.zpl21 ];
2238 2238 };
2239 2239 };
2240 2240 "translationstring" = super.buildPythonPackage {
2241 2241 name = "translationstring-1.3";
2242 2242 doCheck = false;
2243 2243 src = fetchurl {
2244 2244 url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
2245 2245 sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
2246 2246 };
2247 2247 meta = {
2248 2248 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
2249 2249 };
2250 2250 };
2251 2251 "tzlocal" = super.buildPythonPackage {
2252 2252 name = "tzlocal-1.5.1";
2253 2253 doCheck = false;
2254 2254 propagatedBuildInputs = [
2255 2255 self."pytz"
2256 2256 ];
2257 2257 src = fetchurl {
2258 2258 url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
2259 2259 sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
2260 2260 };
2261 2261 meta = {
2262 2262 license = [ pkgs.lib.licenses.mit ];
2263 2263 };
2264 2264 };
2265 2265 "urllib3" = super.buildPythonPackage {
2266 2266 name = "urllib3-1.25.2";
2267 2267 doCheck = false;
2268 2268 src = fetchurl {
2269 2269 url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
2270 2270 sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
2271 2271 };
2272 2272 meta = {
2273 2273 license = [ pkgs.lib.licenses.mit ];
2274 2274 };
2275 2275 };
2276 2276 "urlobject" = super.buildPythonPackage {
2277 2277 name = "urlobject-2.4.3";
2278 2278 doCheck = false;
2279 2279 src = fetchurl {
2280 2280 url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
2281 2281 sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
2282 2282 };
2283 2283 meta = {
2284 2284 license = [ pkgs.lib.licenses.publicDomain ];
2285 2285 };
2286 2286 };
2287 2287 "venusian" = super.buildPythonPackage {
2288 2288 name = "venusian-1.2.0";
2289 2289 doCheck = false;
2290 2290 src = fetchurl {
2291 2291 url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
2292 2292 sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
2293 2293 };
2294 2294 meta = {
2295 2295 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2296 2296 };
2297 2297 };
2298 2298 "vine" = super.buildPythonPackage {
2299 2299 name = "vine-1.3.0";
2300 2300 doCheck = false;
2301 2301 src = fetchurl {
2302 2302 url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
2303 2303 sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
2304 2304 };
2305 2305 meta = {
2306 2306 license = [ pkgs.lib.licenses.bsdOriginal ];
2307 2307 };
2308 2308 };
2309 2309 "waitress" = super.buildPythonPackage {
2310 2310 name = "waitress-1.3.1";
2311 2311 doCheck = false;
2312 2312 src = fetchurl {
2313 2313 url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
2314 2314 sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
2315 2315 };
2316 2316 meta = {
2317 2317 license = [ pkgs.lib.licenses.zpl21 ];
2318 2318 };
2319 2319 };
2320 2320 "wcwidth" = super.buildPythonPackage {
2321 2321 name = "wcwidth-0.1.9";
2322 2322 doCheck = false;
2323 2323 src = fetchurl {
2324 2324 url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
2325 2325 sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
2326 2326 };
2327 2327 meta = {
2328 2328 license = [ pkgs.lib.licenses.mit ];
2329 2329 };
2330 2330 };
2331 2331 "webencodings" = super.buildPythonPackage {
2332 2332 name = "webencodings-0.5.1";
2333 2333 doCheck = false;
2334 2334 src = fetchurl {
2335 2335 url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
2336 2336 sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
2337 2337 };
2338 2338 meta = {
2339 2339 license = [ pkgs.lib.licenses.bsdOriginal ];
2340 2340 };
2341 2341 };
2342 2342 "weberror" = super.buildPythonPackage {
2343 2343 name = "weberror-0.13.1";
2344 2344 doCheck = false;
2345 2345 propagatedBuildInputs = [
2346 2346 self."webob"
2347 2347 self."tempita"
2348 2348 self."pygments"
2349 2349 self."paste"
2350 2350 ];
2351 2351 src = fetchurl {
2352 2352 url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
2353 2353 sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
2354 2354 };
2355 2355 meta = {
2356 2356 license = [ pkgs.lib.licenses.mit ];
2357 2357 };
2358 2358 };
2359 2359 "webhelpers2" = super.buildPythonPackage {
2360 2360 name = "webhelpers2-2.0";
2361 2361 doCheck = false;
2362 2362 propagatedBuildInputs = [
2363 2363 self."markupsafe"
2364 2364 self."six"
2365 2365 ];
2366 2366 src = fetchurl {
2367 2367 url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
2368 2368 sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
2369 2369 };
2370 2370 meta = {
2371 2371 license = [ pkgs.lib.licenses.mit ];
2372 2372 };
2373 2373 };
2374 2374 "webob" = super.buildPythonPackage {
2375 2375 name = "webob-1.8.5";
2376 2376 doCheck = false;
2377 2377 src = fetchurl {
2378 2378 url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
2379 2379 sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
2380 2380 };
2381 2381 meta = {
2382 2382 license = [ pkgs.lib.licenses.mit ];
2383 2383 };
2384 2384 };
2385 2385 "webtest" = super.buildPythonPackage {
2386 2386 name = "webtest-2.0.34";
2387 2387 doCheck = false;
2388 2388 propagatedBuildInputs = [
2389 2389 self."six"
2390 2390 self."webob"
2391 2391 self."waitress"
2392 2392 self."beautifulsoup4"
2393 2393 ];
2394 2394 src = fetchurl {
2395 2395 url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
2396 2396 sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
2397 2397 };
2398 2398 meta = {
2399 2399 license = [ pkgs.lib.licenses.mit ];
2400 2400 };
2401 2401 };
2402 2402 "whoosh" = super.buildPythonPackage {
2403 2403 name = "whoosh-2.7.4";
2404 2404 doCheck = false;
2405 2405 src = fetchurl {
2406 2406 url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
2407 2407 sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
2408 2408 };
2409 2409 meta = {
2410 2410 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
2411 2411 };
2412 2412 };
2413 2413 "ws4py" = super.buildPythonPackage {
2414 2414 name = "ws4py-0.5.1";
2415 2415 doCheck = false;
2416 2416 src = fetchurl {
2417 2417 url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
2418 2418 sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
2419 2419 };
2420 2420 meta = {
2421 2421 license = [ pkgs.lib.licenses.bsdOriginal ];
2422 2422 };
2423 2423 };
2424 2424 "wsgiref" = super.buildPythonPackage {
2425 2425 name = "wsgiref-0.1.2";
2426 2426 doCheck = false;
2427 2427 src = fetchurl {
2428 2428 url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
2429 2429 sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
2430 2430 };
2431 2431 meta = {
2432 2432 license = [ { fullName = "PSF or ZPL"; } ];
2433 2433 };
2434 2434 };
2435 2435 "zipp" = super.buildPythonPackage {
2436 2436 name = "zipp-1.2.0";
2437 2437 doCheck = false;
2438 2438 propagatedBuildInputs = [
2439 2439 self."contextlib2"
2440 2440 ];
2441 2441 src = fetchurl {
2442 2442 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2443 2443 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2444 2444 };
2445 2445 meta = {
2446 2446 license = [ pkgs.lib.licenses.mit ];
2447 2447 };
2448 2448 };
2449 2449 "zope.cachedescriptors" = super.buildPythonPackage {
2450 2450 name = "zope.cachedescriptors-4.3.1";
2451 2451 doCheck = false;
2452 2452 propagatedBuildInputs = [
2453 2453 self."setuptools"
2454 2454 ];
2455 2455 src = fetchurl {
2456 2456 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2457 2457 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2458 2458 };
2459 2459 meta = {
2460 2460 license = [ pkgs.lib.licenses.zpl21 ];
2461 2461 };
2462 2462 };
2463 2463 "zope.deprecation" = super.buildPythonPackage {
2464 2464 name = "zope.deprecation-4.4.0";
2465 2465 doCheck = false;
2466 2466 propagatedBuildInputs = [
2467 2467 self."setuptools"
2468 2468 ];
2469 2469 src = fetchurl {
2470 2470 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2471 2471 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2472 2472 };
2473 2473 meta = {
2474 2474 license = [ pkgs.lib.licenses.zpl21 ];
2475 2475 };
2476 2476 };
2477 2477 "zope.event" = super.buildPythonPackage {
2478 2478 name = "zope.event-4.4";
2479 2479 doCheck = false;
2480 2480 propagatedBuildInputs = [
2481 2481 self."setuptools"
2482 2482 ];
2483 2483 src = fetchurl {
2484 2484 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2485 2485 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2486 2486 };
2487 2487 meta = {
2488 2488 license = [ pkgs.lib.licenses.zpl21 ];
2489 2489 };
2490 2490 };
2491 2491 "zope.interface" = super.buildPythonPackage {
2492 2492 name = "zope.interface-4.6.0";
2493 2493 doCheck = false;
2494 2494 propagatedBuildInputs = [
2495 2495 self."setuptools"
2496 2496 ];
2497 2497 src = fetchurl {
2498 2498 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2499 2499 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2500 2500 };
2501 2501 meta = {
2502 2502 license = [ pkgs.lib.licenses.zpl21 ];
2503 2503 };
2504 2504 };
2505 2505
2506 2506 ### Test requirements
2507 2507
2508 2508
2509 2509 }
@@ -1,1 +1,1 b''
1 4.22.0 No newline at end of file
1 4.23.0 No newline at end of file
@@ -1,60 +1,60 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 from collections import OrderedDict
23 23
24 24 import sys
25 25 import platform
26 26
27 27 VERSION = tuple(open(os.path.join(
28 28 os.path.dirname(__file__), 'VERSION')).read().split('.'))
29 29
30 30 BACKENDS = OrderedDict()
31 31
32 32 BACKENDS['hg'] = 'Mercurial repository'
33 33 BACKENDS['git'] = 'Git repository'
34 34 BACKENDS['svn'] = 'Subversion repository'
35 35
36 36
37 37 CELERY_ENABLED = False
38 38 CELERY_EAGER = False
39 39
40 40 # link to config for pyramid
41 41 CONFIG = {}
42 42
43 43 # Populated with the settings dictionary from application init in
44 44 # rhodecode.conf.environment.load_pyramid_environment
45 45 PYRAMID_SETTINGS = {}
46 46
47 47 # Linked module for extensions
48 48 EXTENSIONS = {}
49 49
50 50 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
51 __dbversion__ = 110 # defines current db version for migrations
51 __dbversion__ = 112 # defines current db version for migrations
52 52 __platform__ = platform.system()
53 53 __license__ = 'AGPLv3, and Commercial License'
54 54 __author__ = 'RhodeCode GmbH'
55 55 __url__ = 'https://code.rhodecode.com'
56 56
57 57 is_windows = __platform__ in ['Windows']
58 58 is_unix = not is_windows
59 59 is_test = False
60 60 disable_error_handler = False
@@ -1,452 +1,458 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2014-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 """
22 22 JSON RPC utils
23 23 """
24 24
25 25 import collections
26 26 import logging
27 27
28 28 from rhodecode.api.exc import JSONRPCError
29 29 from rhodecode.lib.auth import (
30 30 HasPermissionAnyApi, HasRepoPermissionAnyApi, HasRepoGroupPermissionAnyApi)
31 31 from rhodecode.lib.utils import safe_unicode
32 32 from rhodecode.lib.vcs.exceptions import RepositoryError
33 33 from rhodecode.lib.view_utils import get_commit_from_ref_name
34 34 from rhodecode.lib.utils2 import str2bool
35 35
36 36 log = logging.getLogger(__name__)
37 37
38 38
39 39 class OAttr(object):
40 40 """
41 41 Special Option that references another attribute, and can default to it
42 42
43 43 Example::
44 44
45 45 def test(apiuser, userid=Optional(OAttr('apiuser'))):
46 46 user = Optional.extract(userid, evaluate_locals=local())
47 47 #if we pass in userid, we get it, else it will default to apiuser
48 48 #attribute
49 49 """
50 50
51 51 def __init__(self, attr_name):
52 52 self.attr_name = attr_name
53 53
54 54 def __repr__(self):
55 55 return '<OptionalAttr:%s>' % self.attr_name
56 56
57 57 def __call__(self):
58 58 return self
59 59
60 60
61 61 class Optional(object):
62 62 """
63 63 Defines an optional parameter::
64 64
65 65 param = param.getval() if isinstance(param, Optional) else param
66 66 param = param() if isinstance(param, Optional) else param
67 67
68 68 is equivalent to::
69 69
70 70 param = Optional.extract(param)
71 71
72 72 """
73 73
74 74 def __init__(self, type_):
75 75 self.type_ = type_
76 76
77 77 def __repr__(self):
78 78 return '<Optional:%s>' % self.type_.__repr__()
79 79
80 80 def __call__(self):
81 81 return self.getval()
82 82
83 83 def getval(self, evaluate_locals=None):
84 84 """
85 85 returns value from this Optional instance
86 86 """
87 87 if isinstance(self.type_, OAttr):
88 88 param_name = self.type_.attr_name
89 89 if evaluate_locals:
90 90 return evaluate_locals[param_name]
91 91 # use params name
92 92 return param_name
93 93 return self.type_
94 94
95 95 @classmethod
96 96 def extract(cls, val, evaluate_locals=None, binary=None):
97 97 """
98 98 Extracts value from Optional() instance
99 99
100 100 :param val:
101 101 :return: original value if it's not Optional instance else
102 102 value of instance
103 103 """
104 104 if isinstance(val, cls):
105 105 val = val.getval(evaluate_locals)
106 106
107 107 if binary:
108 108 val = str2bool(val)
109 109
110 110 return val
111 111
112 112
113 113 def parse_args(cli_args, key_prefix=''):
114 114 from rhodecode.lib.utils2 import (escape_split)
115 115 kwargs = collections.defaultdict(dict)
116 116 for el in escape_split(cli_args, ','):
117 117 kv = escape_split(el, '=', 1)
118 118 if len(kv) == 2:
119 119 k, v = kv
120 120 kwargs[key_prefix + k] = v
121 121 return kwargs
122 122
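`parse_args` above turns a comma-separated `key=value` CLI string into a prefixed dict. A hedged sketch of the same behavior, with a plain `split()` standing in for RhodeCode's `escape_split` helper (an assumption; the real helper also handles escaped delimiters):

```python
import collections


def parse_args(cli_args, key_prefix=''):
    # split "a=1,b=2" into {'<prefix>a': '1', '<prefix>b': '2'}
    kwargs = collections.defaultdict(dict)
    for el in (cli_args or '').split(','):
        kv = el.split('=', 1)  # maxsplit=1: values may contain '='
        if len(kv) == 2:
            k, v = kv
            kwargs[key_prefix + k] = v
    return kwargs


result = parse_args('branch=stable,limit=10', key_prefix='opt_')
print(dict(result))  # {'opt_branch': 'stable', 'opt_limit': '10'}
```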
123 123
124 124 def get_origin(obj):
125 125 """
126 126 Get origin of permission from object.
127 127
128 128 :param obj:
129 129 """
130 130 origin = 'permission'
131 131
132 132 if getattr(obj, 'owner_row', '') and getattr(obj, 'admin_row', ''):
133 133 # admin and owner case, maybe we should use dual string ?
134 134 origin = 'owner'
135 135 elif getattr(obj, 'owner_row', ''):
136 136 origin = 'owner'
137 137 elif getattr(obj, 'admin_row', ''):
138 138 origin = 'super-admin'
139 139 return origin
140 140
141 141
142 142 def store_update(updates, attr, name):
143 143 """
144 144 Stores a param in the updates dict if it's not an instance of Optional;
145 145 allows easy updates of passed-in params
146 146 """
147 147 if not isinstance(attr, Optional):
148 148 updates[name] = attr
149 149
150 150
151 151 def has_superadmin_permission(apiuser):
152 152 """
153 153 Return True if apiuser is a super-admin, otherwise return False
154 154
155 155 :param apiuser:
156 156 """
157 157 if HasPermissionAnyApi('hg.admin')(user=apiuser):
158 158 return True
159 159 return False
160 160
161 161
162 162 def validate_repo_permissions(apiuser, repoid, repo, perms):
163 163 """
164 164 Raise JSONRPCError if apiuser is not authorized, otherwise return True
165 165
166 166 :param apiuser:
167 167 :param repoid:
168 168 :param repo:
169 169 :param perms:
170 170 """
171 171 if not HasRepoPermissionAnyApi(*perms)(
172 172 user=apiuser, repo_name=repo.repo_name):
173 173 raise JSONRPCError('repository `%s` does not exist' % repoid)
174 174
175 175 return True
176 176
177 177
178 178 def validate_repo_group_permissions(apiuser, repogroupid, repo_group, perms):
179 179 """
180 180 Raise JSONRPCError if apiuser is not authorized, otherwise return True
181 181
182 182 :param apiuser:
183 183 :param repogroupid: just the id of repository group
184 184 :param repo_group: instance of repo_group
185 185 :param perms:
186 186 """
187 187 if not HasRepoGroupPermissionAnyApi(*perms)(
188 188 user=apiuser, group_name=repo_group.group_name):
189 189 raise JSONRPCError(
190 190 'repository group `%s` does not exist' % repogroupid)
191 191
192 192 return True
193 193
194 194
195 195 def validate_set_owner_permissions(apiuser, owner):
196 196 if isinstance(owner, Optional):
197 197 owner = get_user_or_error(apiuser.user_id)
198 198 else:
199 199 if has_superadmin_permission(apiuser):
200 200 owner = get_user_or_error(owner)
201 201 else:
202 202 # forbid setting owner for non-admins
203 203 raise JSONRPCError(
204 204 'Only RhodeCode super-admin can specify `owner` param')
205 205 return owner
206 206
207 207
208 208 def get_user_or_error(userid):
209 209 """
210 210 Get user by id or name, or raise JSONRPCError if not found
211 211
212 212 :param userid:
213 213 """
214 214 from rhodecode.model.user import UserModel
215 215 user_model = UserModel()
216 216
217 217 if isinstance(userid, (int, long)):
218 218 try:
219 219 user = user_model.get_user(userid)
220 220 except ValueError:
221 221 user = None
222 222 else:
223 223 user = user_model.get_by_username(userid)
224 224
225 225 if user is None:
226 226 raise JSONRPCError(
227 227 'user `%s` does not exist' % (userid,))
228 228 return user
229 229
230 230
231 231 def get_repo_or_error(repoid):
232 232 """
233 233 Get repo by id or name, or raise JSONRPCError if not found
234 234
235 235 :param repoid:
236 236 """
237 237 from rhodecode.model.repo import RepoModel
238 238 repo_model = RepoModel()
239 239
240 240 if isinstance(repoid, (int, long)):
241 241 try:
242 242 repo = repo_model.get_repo(repoid)
243 243 except ValueError:
244 244 repo = None
245 245 else:
246 246 repo = repo_model.get_by_repo_name(repoid)
247 247
248 248 if repo is None:
249 249 raise JSONRPCError(
250 250 'repository `%s` does not exist' % (repoid,))
251 251 return repo
252 252
253 253
254 254 def get_repo_group_or_error(repogroupid):
255 255 """
256 256 Get repo group by id or name, or raise JSONRPCError if not found
257 257
258 258 :param repogroupid:
259 259 """
260 260 from rhodecode.model.repo_group import RepoGroupModel
261 261 repo_group_model = RepoGroupModel()
262 262
263 263 if isinstance(repogroupid, (int, long)):
264 264 try:
265 265 repo_group = repo_group_model._get_repo_group(repogroupid)
266 266 except ValueError:
267 267 repo_group = None
268 268 else:
269 269 repo_group = repo_group_model.get_by_group_name(repogroupid)
270 270
271 271 if repo_group is None:
272 272 raise JSONRPCError(
273 273 'repository group `%s` does not exist' % (repogroupid,))
274 274 return repo_group
275 275
276 276
277 277 def get_user_group_or_error(usergroupid):
278 278 """
279 279 Get user group by id or name, or raise JSONRPCError if not found
280 280
281 281 :param usergroupid:
282 282 """
283 283 from rhodecode.model.user_group import UserGroupModel
284 284 user_group_model = UserGroupModel()
285 285
286 286 if isinstance(usergroupid, (int, long)):
287 287 try:
288 288 user_group = user_group_model.get_group(usergroupid)
289 289 except ValueError:
290 290 user_group = None
291 291 else:
292 292 user_group = user_group_model.get_by_name(usergroupid)
293 293
294 294 if user_group is None:
295 295 raise JSONRPCError(
296 296 'user group `%s` does not exist' % (usergroupid,))
297 297 return user_group
298 298
299 299
300 300 def get_perm_or_error(permid, prefix=None):
301 301 """
302 302 Get permission by id or name, or raise JSONRPCError if not found
303 303
304 304 :param permid:
305 305 """
306 306 from rhodecode.model.permission import PermissionModel
307 307
308 308 perm = PermissionModel.cls.get_by_key(permid)
309 309 if perm is None:
310 310 msg = 'permission `{}` does not exist.'.format(permid)
311 311 if prefix:
312 312 msg += ' Permission should start with prefix: `{}`'.format(prefix)
313 313 raise JSONRPCError(msg)
314 314
315 315 if prefix:
316 316 if not perm.permission_name.startswith(prefix):
317 317 raise JSONRPCError('permission `%s` is invalid, '
318 318 'should start with %s' % (permid, prefix))
319 319 return perm
320 320
321 321
322 322 def get_gist_or_error(gistid):
323 323 """
324 324 Get gist by id or gist_access_id, or raise JSONRPCError if not found
325 325
326 326 :param gistid:
327 327 """
328 328 from rhodecode.model.gist import GistModel
329 329
330 330 gist = GistModel.cls.get_by_access_id(gistid)
331 331 if gist is None:
332 332 raise JSONRPCError('gist `%s` does not exist' % (gistid,))
333 333 return gist
334 334
335 335
336 336 def get_pull_request_or_error(pullrequestid):
337 337 """
338 338 Get pull request by id, or raise JSONRPCError if not found
339 339
340 340 :param pullrequestid:
341 341 """
342 342 from rhodecode.model.pull_request import PullRequestModel
343 343
344 344 try:
345 345 pull_request = PullRequestModel().get(int(pullrequestid))
346 346 except ValueError:
347 347 raise JSONRPCError('pullrequestid must be an integer')
348 348 if not pull_request:
349 349 raise JSONRPCError('pull request `%s` does not exist' % (
350 350 pullrequestid,))
351 351 return pull_request
352 352
353 353
354 def build_commit_data(commit, detail_level):
354 def build_commit_data(rhodecode_vcs_repo, commit, detail_level):
355 commit2 = commit
356 commit1 = commit.first_parent
357
355 358 parsed_diff = []
356 359 if detail_level == 'extended':
357 360 for f_path in commit.added_paths:
358 361 parsed_diff.append(_get_commit_dict(filename=f_path, op='A'))
359 362 for f_path in commit.changed_paths:
360 363 parsed_diff.append(_get_commit_dict(filename=f_path, op='M'))
361 364 for f_path in commit.removed_paths:
362 365 parsed_diff.append(_get_commit_dict(filename=f_path, op='D'))
363 366
364 367 elif detail_level == 'full':
365 from rhodecode.lib.diffs import DiffProcessor
366 diff_processor = DiffProcessor(commit.diff())
368 from rhodecode.lib import diffs
369
370 _diff = rhodecode_vcs_repo.get_diff(commit1, commit2,)
371 diff_processor = diffs.DiffProcessor(_diff, format='newdiff', show_full_diff=True)
372
367 373 for dp in diff_processor.prepare():
368 374 del dp['stats']['ops']
369 375 _stats = dp['stats']
370 376 parsed_diff.append(_get_commit_dict(
371 377 filename=dp['filename'], op=dp['operation'],
372 378 new_revision=dp['new_revision'],
373 379 old_revision=dp['old_revision'],
374 380 raw_diff=dp['raw_diff'], stats=_stats))
375 381
376 382 return parsed_diff
377 383
378 384
379 385 def get_commit_or_error(ref, repo):
380 386 try:
381 387 ref_type, _, ref_hash = ref.split(':')
382 388 except ValueError:
383 389 raise JSONRPCError(
384 390 'Ref `{ref}` given in a wrong format. Please check the API'
385 391 ' documentation for more details'.format(ref=ref))
386 392 try:
387 393 # TODO: dan: refactor this to use repo.scm_instance().get_commit()
388 394 # once get_commit supports ref_types
389 395 return get_commit_from_ref_name(repo, ref_hash)
390 396 except RepositoryError:
391 397 raise JSONRPCError('Ref `{ref}` does not exist'.format(ref=ref))
392 398
393 399
394 400 def _get_ref_hash(repo, type_, name):
395 401 vcs_repo = repo.scm_instance()
396 402 if type_ in ['branch'] and vcs_repo.alias in ('hg', 'git'):
397 403 return vcs_repo.branches[name]
398 404 elif type_ in ['bookmark', 'book'] and vcs_repo.alias == 'hg':
399 405 return vcs_repo.bookmarks[name]
400 406 else:
401 407 raise ValueError()
402 408
403 409
404 410 def resolve_ref_or_error(ref, repo, allowed_ref_types=None):
405 411 allowed_ref_types = allowed_ref_types or ['bookmark', 'book', 'tag', 'branch']
406 412
407 413 def _parse_ref(type_, name, hash_=None):
408 414 return type_, name, hash_
409 415
410 416 try:
411 417 ref_type, ref_name, ref_hash = _parse_ref(*ref.split(':'))
412 418 except TypeError:
413 419 raise JSONRPCError(
414 420 'Ref `{ref}` given in a wrong format. Please check the API'
415 421 ' documentation for more details'.format(ref=ref))
416 422
417 423 if ref_type not in allowed_ref_types:
418 424 raise JSONRPCError(
419 425 'Ref `{ref}` type is not allowed. '
420 426 'Only:{allowed_refs} are possible.'.format(
421 427 ref=ref, allowed_refs=allowed_ref_types))
422 428
423 429 try:
424 430 ref_hash = ref_hash or _get_ref_hash(repo, ref_type, ref_name)
425 431 except (KeyError, ValueError):
426 432 raise JSONRPCError(
427 433 'The specified value:{type}:`{name}` does not exist, or is not allowed.'.format(
428 434 type=ref_type, name=ref_name))
429 435
430 436 return ':'.join([ref_type, ref_name, ref_hash])
431 437
432 438
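`resolve_ref_or_error` above consumes refs in the `type:name:hash` format, where the hash may be omitted. An illustrative sketch of just the parsing/validation step (the ref names are hypothetical examples; hash lookup from the repo is left out):

```python
def parse_ref(ref, allowed_ref_types=('bookmark', 'book', 'tag', 'branch')):
    # accept "type:name" or "type:name:hash"; empty hash means "resolve later"
    parts = ref.split(':')
    if len(parts) not in (2, 3):
        raise ValueError('Ref `%s` given in a wrong format' % ref)
    ref_type, ref_name = parts[0], parts[1]
    ref_hash = parts[2] if len(parts) == 3 and parts[2] else None
    if ref_type not in allowed_ref_types:
        raise ValueError('Ref type `%s` is not allowed' % ref_type)
    return ref_type, ref_name, ref_hash


print(parse_ref('branch:stable:'))        # -> ('branch', 'stable', None)
print(parse_ref('tag:v4.23.0:deadbeef'))  # -> ('tag', 'v4.23.0', 'deadbeef')
```

When the hash is missing, the real function falls back to `_get_ref_hash` to resolve the branch or bookmark head from the repository.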
433 439 def _get_commit_dict(
434 440 filename, op, new_revision=None, old_revision=None,
435 441 raw_diff=None, stats=None):
436 442 if stats is None:
437 443 stats = {
438 444 "added": None,
439 445 "binary": None,
440 446 "deleted": None
441 447 }
442 448 return {
443 449 "filename": safe_unicode(filename),
444 450 "op": op,
445 451
446 452 # extra details
447 453 "new_revision": new_revision,
448 454 "old_revision": old_revision,
449 455
450 456 "raw_diff": raw_diff,
451 457 "stats": stats
452 458 }
@@ -1,2523 +1,2524 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import time
23 23
24 24 import rhodecode
25 25 from rhodecode.api import (
26 26 jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
27 27 from rhodecode.api.utils import (
28 28 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
29 29 get_user_group_or_error, get_user_or_error, validate_repo_permissions,
30 30 get_perm_or_error, parse_args, get_origin, build_commit_data,
31 31 validate_set_owner_permissions)
32 32 from rhodecode.lib import audit_logger, rc_cache, channelstream
33 33 from rhodecode.lib import repo_maintenance
34 34 from rhodecode.lib.auth import (
35 35 HasPermissionAnyApi, HasUserGroupPermissionAnyApi,
36 36 HasRepoPermissionAnyApi)
37 37 from rhodecode.lib.celerylib.utils import get_task_id
38 38 from rhodecode.lib.utils2 import (
39 39 str2bool, time_to_datetime, safe_str, safe_int, safe_unicode)
40 40 from rhodecode.lib.ext_json import json
41 41 from rhodecode.lib.exceptions import (
42 42 StatusChangeOnClosedPullRequestError, CommentVersionMismatch)
43 43 from rhodecode.lib.vcs import RepositoryError
44 44 from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
45 45 from rhodecode.model.changeset_status import ChangesetStatusModel
46 46 from rhodecode.model.comment import CommentsModel
47 47 from rhodecode.model.db import (
48 48 Session, ChangesetStatus, RepositoryField, Repository, RepoGroup,
49 49 ChangesetComment)
50 50 from rhodecode.model.permission import PermissionModel
51 51 from rhodecode.model.pull_request import PullRequestModel
52 52 from rhodecode.model.repo import RepoModel
53 53 from rhodecode.model.scm import ScmModel, RepoList
54 54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
55 55 from rhodecode.model import validation_schema
56 56 from rhodecode.model.validation_schema.schemas import repo_schema
57 57
58 58 log = logging.getLogger(__name__)
59 59
60 60
61 61 @jsonrpc_method()
62 62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
63 63 """
64 64 Gets an existing repository by its name or repository_id.
65 65
66 66 The members section of the output returns the user groups and users
67 67 associated with that repository.
68 68
69 69 This command can only be run using an |authtoken| with admin rights,
70 70 or users with at least read rights to the |repo|.
71 71
72 72 :param apiuser: This is filled automatically from the |authtoken|.
73 73 :type apiuser: AuthUser
74 74 :param repoid: The repository name or repository id.
75 75 :type repoid: str or int
76 76 :param cache: use the cached value for last changeset
77 77 :type cache: Optional(bool)
78 78
79 79 Example output:
80 80
81 81 .. code-block:: bash
82 82
83 83 {
84 84 "error": null,
85 85 "id": <repo_id>,
86 86 "result": {
87 87 "clone_uri": null,
88 88 "created_on": "timestamp",
89 89 "description": "repo description",
90 90 "enable_downloads": false,
91 91 "enable_locking": false,
92 92 "enable_statistics": false,
93 93 "followers": [
94 94 {
95 95 "active": true,
96 96 "admin": false,
97 97 "api_key": "****************************************",
98 98 "api_keys": [
99 99 "****************************************"
100 100 ],
101 101 "email": "user@example.com",
102 102 "emails": [
103 103 "user@example.com"
104 104 ],
105 105 "extern_name": "rhodecode",
106 106 "extern_type": "rhodecode",
107 107 "firstname": "username",
108 108 "ip_addresses": [],
109 109 "language": null,
110 110 "last_login": "2015-09-16T17:16:35.854",
111 111 "lastname": "surname",
112 112 "user_id": <user_id>,
113 113 "username": "name"
114 114 }
115 115 ],
116 116 "fork_of": "parent-repo",
117 117 "landing_rev": [
118 118 "rev",
119 119 "tip"
120 120 ],
121 121 "last_changeset": {
122 122 "author": "User <user@example.com>",
123 123 "branch": "default",
124 124 "date": "timestamp",
125 125 "message": "last commit message",
126 126 "parents": [
127 127 {
128 128 "raw_id": "commit-id"
129 129 }
130 130 ],
131 131 "raw_id": "commit-id",
132 132 "revision": <revision number>,
133 133 "short_id": "short id"
134 134 },
135 135 "lock_reason": null,
136 136 "locked_by": null,
137 137 "locked_date": null,
138 138 "owner": "owner-name",
139 139 "permissions": [
140 140 {
141 141 "name": "super-admin-name",
142 142 "origin": "super-admin",
143 143 "permission": "repository.admin",
144 144 "type": "user"
145 145 },
146 146 {
147 147 "name": "owner-name",
148 148 "origin": "owner",
149 149 "permission": "repository.admin",
150 150 "type": "user"
151 151 },
152 152 {
153 153 "name": "user-group-name",
154 154 "origin": "permission",
155 155 "permission": "repository.write",
156 156 "type": "user_group"
157 157 }
158 158 ],
159 159 "private": true,
160 160 "repo_id": 676,
161 161 "repo_name": "user-group/repo-name",
162 162 "repo_type": "hg"
163 163 }
164 164 }
165 165 """
166 166
167 167 repo = get_repo_or_error(repoid)
168 168 cache = Optional.extract(cache)
169 169
170 170 include_secrets = False
171 171 if has_superadmin_permission(apiuser):
172 172 include_secrets = True
173 173 else:
174 174 # check if we have at least read permission for this repo !
175 175 _perms = (
176 176 'repository.admin', 'repository.write', 'repository.read',)
177 177 validate_repo_permissions(apiuser, repoid, repo, _perms)
178 178
179 179 permissions = []
180 180 for _user in repo.permissions():
181 181 user_data = {
182 182 'name': _user.username,
183 183 'permission': _user.permission,
184 184 'origin': get_origin(_user),
185 185 'type': "user",
186 186 }
187 187 permissions.append(user_data)
188 188
189 189 for _user_group in repo.permission_user_groups():
190 190 user_group_data = {
191 191 'name': _user_group.users_group_name,
192 192 'permission': _user_group.permission,
193 193 'origin': get_origin(_user_group),
194 194 'type': "user_group",
195 195 }
196 196 permissions.append(user_group_data)
197 197
198 198 following_users = [
199 199 user.user.get_api_data(include_secrets=include_secrets)
200 200 for user in repo.followers]
201 201
202 202 if not cache:
203 203 repo.update_commit_cache()
204 204 data = repo.get_api_data(include_secrets=include_secrets)
205 205 data['permissions'] = permissions
206 206 data['followers'] = following_users
207 207 return data
208 208
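For reference, a call to `get_repo` carries the method name and its arguments in a single JSON body. A minimal sketch of building such a payload (the auth token here is a placeholder, not a real value):

```python
import json

def build_rpc_payload(method, **args):
    # Sketch of a RhodeCode-style JSON-RPC body: an id to correlate the
    # response, the auth token, the method name, and keyword args as a dict.
    return json.dumps({
        "id": 1,
        "auth_token": "<AUTH_TOKEN>",  # placeholder, never a real token
        "method": method,
        "args": args,
    })

payload = build_rpc_payload("get_repo", repoid="user-group/repo-name", cache=False)
```

Passing `cache=False` maps to the `cache` parameter above, forcing a refresh of the last-changeset cache before the data is returned.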
209 209
210 210 @jsonrpc_method()
211 211 def get_repos(request, apiuser, root=Optional(None), traverse=Optional(True)):
212 212 """
213 213 Lists all existing repositories.
214 214
215 215 This command can only be run using an |authtoken| with admin rights,
216 216 or users with at least read rights to |repos|.
217 217
218 218 :param apiuser: This is filled automatically from the |authtoken|.
219 219 :type apiuser: AuthUser
220 220 :param root: specify the root repository group to fetch repositories from;
221 221 filters the returned repositories to members of the given root group.
222 222 :type root: Optional(None)
223 223 :param traverse: traverse the given root into subrepositories. With this flag
224 224 set to False, only top-level repositories from `root` are returned.
225 225 If root is empty, just top-level repositories are returned.
226 226 :type traverse: Optional(True)
227 227
228 228
229 229 Example output:
230 230
231 231 .. code-block:: bash
232 232
233 233 id : <id_given_in_input>
234 234 result: [
235 235 {
236 236 "repo_id" : "<repo_id>",
237 237 "repo_name" : "<reponame>",
238 238 "repo_type" : "<repo_type>",
239 239 "clone_uri" : "<clone_uri>",
240 240 "private" : "<bool>",
241 241 "created_on" : "<datetimecreated>",
242 242 "description" : "<description>",
243 243 "landing_rev": "<landing_rev>",
244 244 "owner": "<repo_owner>",
245 245 "fork_of": "<name_of_fork_parent>",
246 246 "enable_downloads": "<bool>",
247 247 "enable_locking": "<bool>",
248 248 "enable_statistics": "<bool>",
249 249 },
250 250 ...
251 251 ]
252 252 error: null
253 253 """
254 254
255 255 include_secrets = has_superadmin_permission(apiuser)
256 256 _perms = ('repository.read', 'repository.write', 'repository.admin',)
257 257 extras = {'user': apiuser}
258 258
259 259 root = Optional.extract(root)
260 260 traverse = Optional.extract(traverse, binary=True)
261 261
262 262 if root:
263 263 # verify parent existence, if it doesn't exist return an error
264 264 parent = RepoGroup.get_by_group_name(root)
265 265 if not parent:
266 266 raise JSONRPCError(
267 267 'Root repository group `{}` does not exist'.format(root))
268 268
269 269 if traverse:
270 270 repos = RepoModel().get_repos_for_root(root=root, traverse=traverse)
271 271 else:
272 272 repos = RepoModel().get_repos_for_root(root=parent)
273 273 else:
274 274 if traverse:
275 275 repos = RepoModel().get_all()
276 276 else:
277 277 # return just top-level
278 278 repos = RepoModel().get_repos_for_root(root=None)
279 279
280 280 repo_list = RepoList(repos, perm_set=_perms, extra_kwargs=extras)
281 281 return [repo.get_api_data(include_secrets=include_secrets)
282 282 for repo in repo_list]
283 283
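The `root`/`traverse` branching above selects one of four fetch strategies. A standalone sketch of that decision table (the return strings are illustrative labels, not API values):

```python
def repos_fetch_strategy(root, traverse):
    # Mirrors the branching in get_repos: which set of repositories comes
    # back for each combination of `root` and `traverse`.
    if root:
        return "all repos under root" if traverse else "top-level repos of root"
    return "all repos" if traverse else "top-level repos only"
```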
284 284
285 285 @jsonrpc_method()
286 286 def get_repo_changeset(request, apiuser, repoid, revision,
287 287 details=Optional('basic')):
288 288 """
289 289 Returns information about a changeset.
290 290
291 291 Additionally parameters define the amount of details returned by
292 292 this function.
293 293
294 294 This command can only be run using an |authtoken| with admin rights,
295 295 or users with at least read rights to the |repo|.
296 296
297 297 :param apiuser: This is filled automatically from the |authtoken|.
298 298 :type apiuser: AuthUser
299 299 :param repoid: The repository name or repository id
300 300 :type repoid: str or int
301 301 :param revision: revision for which listing should be done
302 302 :type revision: str
302 303 :param details: Set the level of detail returned; one of ``basic``,
303 304 ``extended`` or ``full``. ``full`` also returns the diff itself and the number of changed files.
305 305 :type details: Optional(str)
306 306
307 307 """
308 308 repo = get_repo_or_error(repoid)
309 309 if not has_superadmin_permission(apiuser):
310 310 _perms = ('repository.admin', 'repository.write', 'repository.read',)
311 311 validate_repo_permissions(apiuser, repoid, repo, _perms)
312 312
313 313 changes_details = Optional.extract(details)
314 314 _changes_details_types = ['basic', 'extended', 'full']
315 315 if changes_details not in _changes_details_types:
316 316 raise JSONRPCError(
317 317 'details must be one of %s' % (
318 318 ','.join(_changes_details_types)))
319 319
320 vcs_repo = repo.scm_instance()
320 321 pre_load = ['author', 'branch', 'date', 'message', 'parents',
321 322 'status', '_commit', '_file_paths']
322 323
323 324 try:
324 cs = repo.get_commit(commit_id=revision, pre_load=pre_load)
325 commit = repo.get_commit(commit_id=revision, pre_load=pre_load)
325 326 except TypeError as e:
326 327 raise JSONRPCError(safe_str(e))
327 _cs_json = cs.__json__()
328 _cs_json['diff'] = build_commit_data(cs, changes_details)
328 _cs_json = commit.__json__()
329 _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
329 330 if changes_details == 'full':
330 _cs_json['refs'] = cs._get_refs()
331 _cs_json['refs'] = commit._get_refs()
331 332 return _cs_json
332 333
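The `details` check above can be isolated into a small helper; a sketch under illustrative names (not part of the API):

```python
def validate_details(details, allowed=("basic", "extended", "full")):
    # Reject any value outside the allowed detail levels, as the method
    # above does before building the commit data.
    if details not in allowed:
        raise ValueError("details must be one of %s" % ",".join(allowed))
    return details
```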
333 334
334 335 @jsonrpc_method()
335 336 def get_repo_changesets(request, apiuser, repoid, start_rev, limit,
336 337 details=Optional('basic')):
337 338 """
338 339 Returns a set of commits limited by the number starting
339 340 from the `start_rev` option.
340 341
341 342 Additional parameters define the amount of details returned by this
342 343 function.
343 344
344 345 This command can only be run using an |authtoken| with admin rights,
345 346 or users with at least read rights to |repos|.
346 347
347 348 :param apiuser: This is filled automatically from the |authtoken|.
348 349 :type apiuser: AuthUser
349 350 :param repoid: The repository name or repository ID.
350 351 :type repoid: str or int
351 352 :param start_rev: The starting revision from where to get changesets.
352 353 :type start_rev: str
353 354 :param limit: Limit the number of commits to this amount
354 355 :type limit: str or int
356 357 :param details: Set the level of detail returned. Valid options are:
356 357 ``basic``, ``extended`` and ``full``.
357 358 :type details: Optional(str)
358 359
359 360 .. note::
360 361
361 362 Setting the parameter `details` to the value ``full`` is extensive
362 363 and returns details like the diff itself, and the number
363 364 of changed files.
364 365
365 366 """
366 367 repo = get_repo_or_error(repoid)
367 368 if not has_superadmin_permission(apiuser):
368 369 _perms = ('repository.admin', 'repository.write', 'repository.read',)
369 370 validate_repo_permissions(apiuser, repoid, repo, _perms)
370 371
371 372 changes_details = Optional.extract(details)
372 373 _changes_details_types = ['basic', 'extended', 'full']
373 374 if changes_details not in _changes_details_types:
374 375 raise JSONRPCError(
376 377 'details must be one of %s' % (
376 377 ','.join(_changes_details_types)))
377 378
378 379 limit = int(limit)
379 380 pre_load = ['author', 'branch', 'date', 'message', 'parents',
380 381 'status', '_commit', '_file_paths']
381 382
382 383 vcs_repo = repo.scm_instance()
383 384 # SVN needs a special case to distinguish its index and commit id
384 385 if vcs_repo and vcs_repo.alias == 'svn' and (start_rev == '0'):
385 386 start_rev = vcs_repo.commit_ids[0]
386 387
387 388 try:
388 389 commits = vcs_repo.get_commits(
389 390 start_id=start_rev, pre_load=pre_load, translate_tags=False)
390 391 except TypeError as e:
391 392 raise JSONRPCError(safe_str(e))
392 393 except Exception:
393 394 log.exception('Fetching of commits failed')
394 395 raise JSONRPCError('Error occurred during commit fetching')
395 396
396 397 ret = []
397 398 for cnt, commit in enumerate(commits):
398 399 if cnt >= limit != -1:
399 400 break
400 401 _cs_json = commit.__json__()
401 _cs_json['diff'] = build_commit_data(commit, changes_details)
402 _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
402 403 if changes_details == 'full':
403 404 _cs_json['refs'] = {
404 405 'branches': [commit.branch],
405 406 'bookmarks': getattr(commit, 'bookmarks', []),
406 407 'tags': commit.tags
407 408 }
408 409 ret.append(_cs_json)
409 410 return ret
410 411
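The loop above relies on Python's chained comparison: `cnt >= limit != -1` reads as `cnt >= limit and limit != -1`, so the slice stops after `limit` commits unless `limit` is -1 (unlimited). A standalone sketch of the same idiom:

```python
def take_limited(items, limit):
    # Same chained comparison as the loop above: `cnt >= limit != -1`
    # means `cnt >= limit and limit != -1`, so -1 disables the limit.
    out = []
    for cnt, item in enumerate(items):
        if cnt >= limit != -1:
            break
        out.append(item)
    return out
```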
411 412
412 413 @jsonrpc_method()
413 414 def get_repo_nodes(request, apiuser, repoid, revision, root_path,
414 415 ret_type=Optional('all'), details=Optional('basic'),
415 416 max_file_bytes=Optional(None)):
416 417 """
417 418 Returns a list of nodes and children in a flat list for a given
418 419 path at given revision.
419 420
420 421 It's possible to specify ret_type to show only `files` or `dirs`.
421 422
422 423 This command can only be run using an |authtoken| with admin rights,
423 424 or users with at least read rights to |repos|.
424 425
425 426 :param apiuser: This is filled automatically from the |authtoken|.
426 427 :type apiuser: AuthUser
427 428 :param repoid: The repository name or repository ID.
428 429 :type repoid: str or int
429 430 :param revision: The revision for which listing should be done.
430 431 :type revision: str
431 432 :param root_path: The path from which to start displaying.
432 433 :type root_path: str
433 434 :param ret_type: Set the return type. Valid options are
434 435 ``all`` (default), ``files`` and ``dirs``.
435 436 :type ret_type: Optional(str)
436 437 :param details: Returns extended information about nodes, such as
437 438 md5, binary, and/or content.
438 439 The valid options are ``basic`` and ``full``.
439 440 :type details: Optional(str)
440 441 :param max_file_bytes: Only return file content for files under this size in bytes
441 442 :type max_file_bytes: Optional(int)
442 443
443 444 Example output:
444 445
445 446 .. code-block:: bash
446 447
447 448 id : <id_given_in_input>
448 449 result: [
449 450 {
450 451 "binary": false,
451 452 "content": "File line",
452 453 "extension": "md",
453 454 "lines": 2,
454 455 "md5": "059fa5d29b19c0657e384749480f6422",
455 456 "mimetype": "text/x-minidsrc",
456 457 "name": "file.md",
457 458 "size": 580,
458 459 "type": "file"
459 460 },
460 461 ...
461 462 ]
462 463 error: null
463 464 """
464 465
465 466 repo = get_repo_or_error(repoid)
466 467 if not has_superadmin_permission(apiuser):
467 468 _perms = ('repository.admin', 'repository.write', 'repository.read',)
468 469 validate_repo_permissions(apiuser, repoid, repo, _perms)
469 470
470 471 ret_type = Optional.extract(ret_type)
471 472 details = Optional.extract(details)
472 473 _extended_types = ['basic', 'full']
473 474 if details not in _extended_types:
474 475 raise JSONRPCError('details must be one of %s' % (','.join(_extended_types)))
475 476 extended_info = False
476 477 content = False
477 478 if details == 'basic':
478 479 extended_info = True
479 480
480 481 if details == 'full':
481 482 extended_info = content = True
482 483
483 484 _map = {}
484 485 try:
485 486 # check if repo is not empty by any chance, skip quicker if it is.
486 487 _scm = repo.scm_instance()
487 488 if _scm.is_empty():
488 489 return []
489 490
490 491 _d, _f = ScmModel().get_nodes(
491 492 repo, revision, root_path, flat=False,
492 493 extended_info=extended_info, content=content,
493 494 max_file_bytes=max_file_bytes)
494 495 _map = {
495 496 'all': _d + _f,
496 497 'files': _f,
497 498 'dirs': _d,
498 499 }
499 500 return _map[ret_type]
500 501 except KeyError:
501 502 raise JSONRPCError(
502 503 'ret_type must be one of %s' % (','.join(sorted(_map.keys()))))
503 504 except Exception:
504 505 log.exception("Exception occurred while trying to get repo nodes")
505 506 raise JSONRPCError(
506 507 'failed to get repo: `%s` nodes' % repo.repo_name
507 508 )
508 509
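The `details` option above is translated into the two flags `extended_info` and `content` before the node lookup. A minimal sketch of that mapping:

```python
def details_to_flags(details):
    # Maps the `details` option to the (extended_info, content) pair that
    # get_repo_nodes passes on: 'basic' adds node metadata, 'full' adds
    # file content as well.
    extended_info = details in ("basic", "full")
    content = details == "full"
    return extended_info, content
```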
509 510
510 511 @jsonrpc_method()
511 512 def get_repo_file(request, apiuser, repoid, commit_id, file_path,
512 513 max_file_bytes=Optional(None), details=Optional('basic'),
513 514 cache=Optional(True)):
514 515 """
515 516 Returns a single file from repository at given revision.
516 517
517 518 This command can only be run using an |authtoken| with admin rights,
518 519 or users with at least read rights to |repos|.
519 520
520 521 :param apiuser: This is filled automatically from the |authtoken|.
521 522 :type apiuser: AuthUser
522 523 :param repoid: The repository name or repository ID.
523 524 :type repoid: str or int
524 525 :param commit_id: The revision for which listing should be done.
525 526 :type commit_id: str
526 527 :param file_path: The path from which to start displaying.
527 528 :type file_path: str
528 529 :param details: Returns different set of information about nodes.
529 530 The valid options are ``minimal``, ``basic`` and ``full``.
530 531 :type details: Optional(str)
531 532 :param max_file_bytes: Only return file content for files under this size in bytes
532 533 :type max_file_bytes: Optional(int)
533 534 :param cache: Use internal caches for fetching files. If disabled fetching
534 535 files is slower but more memory efficient
535 536 :type cache: Optional(bool)
536 537
537 538 Example output:
538 539
539 540 .. code-block:: bash
540 541
541 542 id : <id_given_in_input>
542 543 result: {
543 544 "binary": false,
544 545 "extension": "py",
545 546 "lines": 35,
546 547 "content": "....",
547 548 "md5": "76318336366b0f17ee249e11b0c99c41",
548 549 "mimetype": "text/x-python",
549 550 "name": "python.py",
550 551 "size": 817,
551 552 "type": "file",
552 553 }
553 554 error: null
554 555 """
555 556
556 557 repo = get_repo_or_error(repoid)
557 558 if not has_superadmin_permission(apiuser):
558 559 _perms = ('repository.admin', 'repository.write', 'repository.read',)
559 560 validate_repo_permissions(apiuser, repoid, repo, _perms)
560 561
561 562 cache = Optional.extract(cache, binary=True)
562 563 details = Optional.extract(details)
563 564 _extended_types = ['minimal', 'minimal+search', 'basic', 'full']
564 565 if details not in _extended_types:
565 566 raise JSONRPCError(
566 567 'details must be one of %s, got %s' % (','.join(_extended_types), details))
567 568 extended_info = False
568 569 content = False
569 570
570 571 if details == 'minimal':
571 572 extended_info = False
572 573
573 574 elif details == 'basic':
574 575 extended_info = True
575 576
576 577 elif details == 'full':
577 578 extended_info = content = True
578 579
579 580 file_path = safe_unicode(file_path)
580 581 try:
581 582 # check if repo is not empty by any chance, skip quicker if it is.
582 583 _scm = repo.scm_instance()
583 584 if _scm.is_empty():
584 585 return None
585 586
586 587 node = ScmModel().get_node(
587 588 repo, commit_id, file_path, extended_info=extended_info,
588 589 content=content, max_file_bytes=max_file_bytes, cache=cache)
589 590 except NodeDoesNotExistError:
590 591 raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
591 592 repo.repo_name, file_path, commit_id))
592 593 except Exception:
593 594 log.exception(u"Exception occurred while trying to get repo %s file",
594 595 repo.repo_name)
595 596 raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
596 597 repo.repo_name, file_path))
597 598
598 599 return node
599 600
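The `max_file_bytes` parameter implies a simple size gate (the actual check lives inside `ScmModel().get_node`; the strict `<` here follows the docstring's "under this file size" and is an assumption):

```python
def content_allowed(file_size, max_file_bytes=None):
    # True when no limit is set, or the file is strictly under the limit.
    return max_file_bytes is None or file_size < max_file_bytes
```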
600 601
601 602 @jsonrpc_method()
602 603 def get_repo_fts_tree(request, apiuser, repoid, commit_id, root_path):
603 604 """
604 605 Returns a list of tree nodes for a path at a given revision. This API is built
605 606 strictly for use in full text search index building, and shouldn't be consumed for other purposes.
606 607
607 608 This command can only be run using an |authtoken| with admin rights,
608 609 or users with at least read rights to |repos|.
609 610
610 611 """
611 612
612 613 repo = get_repo_or_error(repoid)
613 614 if not has_superadmin_permission(apiuser):
614 615 _perms = ('repository.admin', 'repository.write', 'repository.read',)
615 616 validate_repo_permissions(apiuser, repoid, repo, _perms)
616 617
617 618 repo_id = repo.repo_id
618 619 cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
619 620 cache_on = cache_seconds > 0
620 621
621 622 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
622 623 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
623 624
624 625 def compute_fts_tree(cache_ver, repo_id, commit_id, root_path):
625 626 return ScmModel().get_fts_data(repo_id, commit_id, root_path)
626 627
627 628 try:
628 629 # check if repo is not empty by any chance, skip quicker if it is.
629 630 _scm = repo.scm_instance()
630 631 if _scm.is_empty():
631 632 return []
632 633 except RepositoryError:
633 634 log.exception("Exception occurred while trying to get repo nodes")
634 635 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
635 636
636 637 try:
637 638 # we need to resolve commit_id to a FULL sha for cache to work correctly.
638 639 # sending 'master' is a pointer that needs to be translated to current commit.
639 640 commit_id = _scm.get_commit(commit_id=commit_id).raw_id
640 641 log.debug(
641 642 'Computing FTS REPO TREE for repo_id %s commit_id `%s` '
642 643 'with caching: %s[TTL: %ss]' % (
643 644 repo_id, commit_id, cache_on, cache_seconds or 0))
644 645
645 646 tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path)
646 647 return tree_files
647 648
648 649 except Exception:
649 650 log.exception("Exception occurred while trying to get repo nodes")
650 651 raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)
651 652
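The method above resolves a symbolic ref such as `master` to a full sha before computing, so the cache key stays valid as the ref moves. A standalone sketch of that pattern (`compute_fts_data` is a stand-in, not the real `ScmModel().get_fts_data`):

```python
calls = []

def compute_fts_data(repo_id, commit_id, root_path):
    # Stand-in for the expensive computation; records each real invocation.
    calls.append((repo_id, commit_id, root_path))
    return ["docs/index.rst", "setup.py"]

cache = {}

def fts_tree(repo_id, commit_id, root_path, resolve):
    # Resolve a symbolic ref (e.g. 'master') to a full sha first, so the
    # cache key stays stable even as the ref moves -- the same reason the
    # method above resolves commit_id before computing.
    full_id = resolve(commit_id)
    key = (repo_id, full_id, root_path)
    if key not in cache:
        cache[key] = compute_fts_data(repo_id, full_id, root_path)
    return cache[key]
```

Calling it twice, once with the ref name and once with the sha it resolves to, hits the same cache entry.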
652 653
653 654 @jsonrpc_method()
654 655 def get_repo_refs(request, apiuser, repoid):
655 656 """
656 657 Returns a dictionary of current references. It returns
657 658 bookmarks, branches, closed_branches, and tags for given repository
658 659
661 662 This command can only be run using an |authtoken| with admin rights,
662 663 or users with at least read rights to |repos|.
663 664
664 665 :param apiuser: This is filled automatically from the |authtoken|.
665 666 :type apiuser: AuthUser
666 667 :param repoid: The repository name or repository ID.
667 668 :type repoid: str or int
668 669
669 670 Example output:
670 671
671 672 .. code-block:: bash
672 673
673 674 id : <id_given_in_input>
674 675 "result": {
675 676 "bookmarks": {
676 677 "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
677 678 "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
678 679 },
679 680 "branches": {
680 681 "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
681 682 "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
682 683 },
683 684 "branches_closed": {},
684 685 "tags": {
685 686 "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
686 687 "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
687 688 "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
688 689 "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17",
689 690 }
690 691 }
691 692 error: null
692 693 """
693 694
694 695 repo = get_repo_or_error(repoid)
695 696 if not has_superadmin_permission(apiuser):
696 697 _perms = ('repository.admin', 'repository.write', 'repository.read',)
697 698 validate_repo_permissions(apiuser, repoid, repo, _perms)
698 699
699 700 try:
700 701 # check if repo is not empty by any chance, skip quicker if it is.
701 702 vcs_instance = repo.scm_instance()
702 703 refs = vcs_instance.refs()
703 704 return refs
704 705 except Exception:
705 706 log.exception("Exception occurred while trying to get repo refs")
706 707 raise JSONRPCError(
707 708 'failed to get repo: `%s` references' % repo.repo_name
708 709 )
709 710
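The returned structure groups refs by type, as in the example output above. Flattening it into a single list of names is straightforward; a sketch with shortened placeholder commit ids standing in for the full shas:

```python
refs = {
    "bookmarks": {"dev": "sha-1", "master": "sha-2"},
    "branches": {"default": "sha-1", "stable": "sha-2"},
    "branches_closed": {},
    "tags": {"tip": "sha-1", "v4.4.0": "sha-3"},
}

def all_ref_names(refs):
    # Flatten the per-type sections into one sorted list of ref names.
    return sorted(name for section in refs.values() for name in section)
```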
710 711
711 712 @jsonrpc_method()
712 713 def create_repo(
713 714 request, apiuser, repo_name, repo_type,
714 715 owner=Optional(OAttr('apiuser')),
715 716 description=Optional(''),
716 717 private=Optional(False),
717 718 clone_uri=Optional(None),
718 719 push_uri=Optional(None),
719 720 landing_rev=Optional(None),
720 721 enable_statistics=Optional(False),
721 722 enable_locking=Optional(False),
722 723 enable_downloads=Optional(False),
723 724 copy_permissions=Optional(False)):
724 725 """
725 726 Creates a repository.
726 727
727 728 * If the repository name contains "/", the repository will be created inside
728 729 a repository group or nested repository groups.
729 730
730 731 For example "foo/bar/repo1" will create |repo| called "repo1" inside
731 732 group "foo/bar". You have to have permissions to access and write to
732 733 the last repository group ("bar" in this example)
733 734
734 735 This command can only be run using an |authtoken| with at least
735 736 permissions to create repositories, or write permissions to
736 737 parent repository groups.
737 738
738 739 :param apiuser: This is filled automatically from the |authtoken|.
739 740 :type apiuser: AuthUser
740 741 :param repo_name: Set the repository name.
741 742 :type repo_name: str
742 743 :param repo_type: Set the repository type; 'hg','git', or 'svn'.
743 744 :type repo_type: str
744 745 :param owner: user_id or username
745 746 :type owner: Optional(str)
746 747 :param description: Set the repository description.
747 748 :type description: Optional(str)
748 749 :param private: set repository as private
749 750 :type private: bool
750 751 :param clone_uri: set clone_uri
751 752 :type clone_uri: str
752 753 :param push_uri: set push_uri
753 754 :type push_uri: str
754 755 :param landing_rev: <rev_type>:<rev>, e.g. branch:default, book:dev, rev:abcd
755 756 :type landing_rev: str
756 757 :param enable_locking:
757 758 :type enable_locking: bool
758 759 :param enable_downloads:
759 760 :type enable_downloads: bool
760 761 :param enable_statistics:
761 762 :type enable_statistics: bool
762 763 :param copy_permissions: Copy permission from group in which the
763 764 repository is being created.
764 765 :type copy_permissions: bool
765 766
766 767
767 768 Example output:
768 769
769 770 .. code-block:: bash
770 771
771 772 id : <id_given_in_input>
772 773 result: {
773 774 "msg": "Created new repository `<reponame>`",
774 775 "success": true,
775 776 "task": "<celery task id or None if done sync>"
776 777 }
777 778 error: null
778 779
779 780
780 781 Example error output:
781 782
782 783 .. code-block:: bash
783 784
784 785 id : <id_given_in_input>
785 786 result : null
786 787 error : {
787 788 'failed to create repository `<repo_name>`'
788 789 }
789 790
790 791 """
791 792
792 793 owner = validate_set_owner_permissions(apiuser, owner)
793 794
794 795 description = Optional.extract(description)
795 796 copy_permissions = Optional.extract(copy_permissions)
796 797 clone_uri = Optional.extract(clone_uri)
797 798 push_uri = Optional.extract(push_uri)
798 799
799 800 defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
800 801 if isinstance(private, Optional):
801 802 private = defs.get('repo_private') or Optional.extract(private)
802 803 if isinstance(repo_type, Optional):
803 804 repo_type = defs.get('repo_type')
804 805 if isinstance(enable_statistics, Optional):
805 806 enable_statistics = defs.get('repo_enable_statistics')
806 807 if isinstance(enable_locking, Optional):
807 808 enable_locking = defs.get('repo_enable_locking')
808 809 if isinstance(enable_downloads, Optional):
809 810 enable_downloads = defs.get('repo_enable_downloads')
810 811
811 812 landing_ref, _label = ScmModel.backend_landing_ref(repo_type)
812 813 ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
813 814 ref_choices = list(set(ref_choices + [landing_ref]))
814 815
815 816 landing_commit_ref = Optional.extract(landing_rev) or landing_ref
816 817
817 818 schema = repo_schema.RepoSchema().bind(
818 819 repo_type_options=rhodecode.BACKENDS.keys(),
819 820 repo_ref_options=ref_choices,
820 821 repo_type=repo_type,
821 822 # user caller
822 823 user=apiuser)
823 824
824 825 try:
825 826 schema_data = schema.deserialize(dict(
826 827 repo_name=repo_name,
827 828 repo_type=repo_type,
828 829 repo_owner=owner.username,
829 830 repo_description=description,
830 831 repo_landing_commit_ref=landing_commit_ref,
831 832 repo_clone_uri=clone_uri,
832 833 repo_push_uri=push_uri,
833 834 repo_private=private,
834 835 repo_copy_permissions=copy_permissions,
835 836 repo_enable_statistics=enable_statistics,
836 837 repo_enable_downloads=enable_downloads,
837 838 repo_enable_locking=enable_locking))
838 839 except validation_schema.Invalid as err:
839 840 raise JSONRPCValidationError(colander_exc=err)
840 841
841 842 try:
842 843 data = {
843 844 'owner': owner,
844 845 'repo_name': schema_data['repo_group']['repo_name_without_group'],
845 846 'repo_name_full': schema_data['repo_name'],
846 847 'repo_group': schema_data['repo_group']['repo_group_id'],
847 848 'repo_type': schema_data['repo_type'],
848 849 'repo_description': schema_data['repo_description'],
849 850 'repo_private': schema_data['repo_private'],
850 851 'clone_uri': schema_data['repo_clone_uri'],
851 852 'push_uri': schema_data['repo_push_uri'],
852 853 'repo_landing_rev': schema_data['repo_landing_commit_ref'],
853 854 'enable_statistics': schema_data['repo_enable_statistics'],
854 855 'enable_locking': schema_data['repo_enable_locking'],
855 856 'enable_downloads': schema_data['repo_enable_downloads'],
856 857 'repo_copy_permissions': schema_data['repo_copy_permissions'],
857 858 }
858 859
859 860 task = RepoModel().create(form_data=data, cur_user=owner.user_id)
860 861 task_id = get_task_id(task)
861 862 # no commit, it's done in RepoModel, or async via celery
862 863 return {
863 864 'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
864 865 'success': True, # cannot return the repo data here since fork
865 866 # can be done async
866 867 'task': task_id
867 868 }
868 869 except Exception:
869 870 log.exception(
870 871 u"Exception while trying to create the repository %s",
871 872 schema_data['repo_name'])
872 873 raise JSONRPCError(
873 874 'failed to create repository `%s`' % (schema_data['repo_name'],))
874 875
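The "foo/bar/repo1" rule described in the docstring splits a name on its last "/". A sketch of that split (the helper name is illustrative; the real parsing happens in the repo schema):

```python
def split_repo_name(repo_name):
    # Everything before the last '/' is the repository group path; the
    # remainder is the repository name, per the rule described above.
    if "/" in repo_name:
        group, _, name = repo_name.rpartition("/")
        return group, name
    return None, repo_name
```

So "foo/bar/repo1" creates `repo1` inside the existing group `foo/bar`, which the caller must have write access to.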
875 876
876 877 @jsonrpc_method()
877 878 def add_field_to_repo(request, apiuser, repoid, key, label=Optional(''),
878 879 description=Optional('')):
879 880 """
880 881 Adds an extra field to a repository.
881 882
882 883 This command can only be run using an |authtoken| with at least
883 884 write permissions to the |repo|.
884 885
885 886 :param apiuser: This is filled automatically from the |authtoken|.
886 887 :type apiuser: AuthUser
887 888 :param repoid: Set the repository name or repository id.
888 889 :type repoid: str or int
889 890 :param key: Create a unique field key for this repository.
890 891 :type key: str
891 892 :param label:
892 893 :type label: Optional(str)
893 894 :param description:
894 895 :type description: Optional(str)
895 896 """
896 897 repo = get_repo_or_error(repoid)
897 898 if not has_superadmin_permission(apiuser):
898 899 _perms = ('repository.admin',)
899 900 validate_repo_permissions(apiuser, repoid, repo, _perms)
900 901
901 902 label = Optional.extract(label) or key
902 903 description = Optional.extract(description)
903 904
904 905 field = RepositoryField.get_by_key_name(key, repo)
905 906 if field:
906 907 raise JSONRPCError('Field with key '
907 908 '`%s` already exists for repo `%s`' % (key, repoid))
908 909
909 910 try:
910 911 RepoModel().add_repo_field(repo, key, field_label=label,
911 912 field_desc=description)
912 913 Session().commit()
913 914 return {
914 915 'msg': "Added new repository field `%s`" % (key,),
915 916 'success': True,
916 917 }
917 918 except Exception:
918 919 log.exception("Exception occurred while trying to add field to repo")
919 920 raise JSONRPCError(
920 921 'failed to create new field for repository `%s`' % (repoid,))
921 922
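As with the other JSON-RPC methods in this module, `add_field_to_repo` is invoked with a flat `args` mapping in the request body. A minimal client-side sketch of that payload shape (the token and values below are placeholders, not real credentials):

```python
import json


def build_rpc_payload(method, auth_token, **args):
    """Assemble a RhodeCode-style JSON-RPC request body."""
    return {
        'id': 1,
        'auth_token': auth_token,
        'method': method,
        'args': args,
    }


payload = build_rpc_payload(
    'add_field_to_repo', 'SECRET_TOKEN',
    repoid='my-repo', key='release_notes', label='Release notes')

request_body = json.dumps(payload)  # ready to POST to the API endpoint
```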

@jsonrpc_method()
def remove_field_from_repo(request, apiuser, repoid, key):
    """
    Removes an extra field from a repository.

    This command can only be run using an |authtoken| with at least
    write permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param key: Set the unique field key for this repository.
    :type key: str
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    field = RepositoryField.get_by_key_name(key, repo)
    if not field:
        raise JSONRPCError('Field with key `%s` does not '
                           'exist for repo `%s`' % (key, repoid))

    try:
        RepoModel().delete_repo_field(repo, field_key=key)
        Session().commit()
        return {
            'msg': "Deleted repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to delete field from repo")
        raise JSONRPCError(
            'failed to delete field for repository `%s`' % (repoid,))


@jsonrpc_method()
def update_repo(
        request, apiuser, repoid, repo_name=Optional(None),
        owner=Optional(OAttr('apiuser')), description=Optional(''),
        private=Optional(False),
        clone_uri=Optional(None), push_uri=Optional(None),
        landing_rev=Optional(None), fork_of=Optional(None),
        enable_statistics=Optional(False),
        enable_locking=Optional(False),
        enable_downloads=Optional(False), fields=Optional('')):
    """
    Updates a repository with the given information.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    * If the repository name contains "/", the repository will be updated
      accordingly, inside a repository group or nested repository groups.

      For example repoid=repo-test name="foo/bar/repo-test" will update the
      |repo| called "repo-test" and place it inside group "foo/bar".
      You have to have permissions to access and write to the last repository
      group ("bar" in this example).

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: repository name or repository ID.
    :type repoid: str or int
    :param repo_name: Update the |repo| name, including the
        repository group it's in.
    :type repo_name: str
    :param owner: Set the |repo| owner.
    :type owner: str
    :param fork_of: Set the |repo| as fork of another |repo|.
    :type fork_of: str
    :param description: Update the |repo| description.
    :type description: str
    :param private: Set the |repo| as private. (True | False)
    :type private: bool
    :param clone_uri: Update the |repo| clone URI.
    :type clone_uri: str
    :param push_uri: Update the |repo| push URI.
    :type push_uri: str
    :param landing_rev: Set the |repo| landing revision, e.g. branch:default,
        book:dev, rev:abcd
    :type landing_rev: str
    :param enable_statistics: Enable statistics on the |repo|, (True | False).
    :type enable_statistics: bool
    :param enable_locking: Enable |repo| locking.
    :type enable_locking: bool
    :param enable_downloads: Enable downloads from the |repo|, (True | False).
    :type enable_downloads: bool
    :param fields: Add extra fields to the |repo|. Use the following
        example format: ``field_key=field_val,field_key2=fieldval2``.
        Escape ', ' with \,
    :type fields: str
    """

    repo = get_repo_or_error(repoid)

    include_secrets = False
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    else:
        include_secrets = True

    updates = dict(
        repo_name=repo_name
        if not isinstance(repo_name, Optional) else repo.repo_name,

        fork_id=fork_of
        if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,

        user=owner
        if not isinstance(owner, Optional) else repo.user.username,

        repo_description=description
        if not isinstance(description, Optional) else repo.description,

        repo_private=private
        if not isinstance(private, Optional) else repo.private,

        clone_uri=clone_uri
        if not isinstance(clone_uri, Optional) else repo.clone_uri,

        push_uri=push_uri
        if not isinstance(push_uri, Optional) else repo.push_uri,

        repo_landing_rev=landing_rev
        if not isinstance(landing_rev, Optional) else repo._landing_revision,

        repo_enable_statistics=enable_statistics
        if not isinstance(enable_statistics, Optional) else repo.enable_statistics,

        repo_enable_locking=enable_locking
        if not isinstance(enable_locking, Optional) else repo.enable_locking,

        repo_enable_downloads=enable_downloads
        if not isinstance(enable_downloads, Optional) else repo.enable_downloads)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(
        request.translate, repo=repo)
    ref_choices = list(set(ref_choices + [landing_ref]))

    old_values = repo.get_api_data()
    repo_type = repo.repo_type
    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser,
        old_values=old_values)
    try:
        schema_data = schema.deserialize(dict(
            # we save old value, users cannot change type
            repo_type=repo_type,

            repo_name=updates['repo_name'],
            repo_owner=updates['user'],
            repo_description=updates['repo_description'],
            repo_clone_uri=updates['clone_uri'],
            repo_push_uri=updates['push_uri'],
            repo_fork_of=updates['fork_id'],
            repo_private=updates['repo_private'],
            repo_landing_commit_ref=updates['repo_landing_rev'],
            repo_enable_statistics=updates['repo_enable_statistics'],
            repo_enable_downloads=updates['repo_enable_downloads'],
            repo_enable_locking=updates['repo_enable_locking']))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    # save validated data back into the updates dict
    validated_updates = dict(
        repo_name=schema_data['repo_group']['repo_name_without_group'],
        repo_group=schema_data['repo_group']['repo_group_id'],

        user=schema_data['repo_owner'],
        repo_description=schema_data['repo_description'],
        repo_private=schema_data['repo_private'],
        clone_uri=schema_data['repo_clone_uri'],
        push_uri=schema_data['repo_push_uri'],
        repo_landing_rev=schema_data['repo_landing_commit_ref'],
        repo_enable_statistics=schema_data['repo_enable_statistics'],
        repo_enable_locking=schema_data['repo_enable_locking'],
        repo_enable_downloads=schema_data['repo_enable_downloads'],
    )

    if schema_data['repo_fork_of']:
        fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
        validated_updates['fork_id'] = fork_repo.repo_id

    # extra fields
    fields = parse_args(Optional.extract(fields), key_prefix='ex_')
    if fields:
        validated_updates.update(fields)

    try:
        RepoModel().update(repo, **validated_updates)
        audit_logger.store_api(
            'repo.edit', action_data={'old_data': old_values},
            user=apiuser, repo=repo)
        Session().commit()
        return {
            'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
            'repository': repo.get_api_data(include_secrets=include_secrets)
        }
    except Exception:
        log.exception(
            u"Exception while trying to update the repository %s",
            repoid)
        raise JSONRPCError('failed to update repo `%s`' % repoid)

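The `updates` mapping in `update_repo` keeps the repository's current value for every argument the caller left at its `Optional` default, and uses the caller's value otherwise. A standalone sketch of that pattern (the `Optional` class here is a simplified stand-in, not the real rhodecode wrapper):

```python
class Optional(object):
    """Simplified stand-in for rhodecode's Optional default wrapper."""
    def __init__(self, default):
        self.default = default


def resolve(value, current):
    # keep the current setting when the caller left the argument as Optional
    return current if isinstance(value, Optional) else value


new_description = resolve('new docs', current='old docs')  # caller supplied
kept_private = resolve(Optional(False), current=True)      # caller omitted
```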

@jsonrpc_method()
def fork_repo(request, apiuser, repoid, fork_name,
              owner=Optional(OAttr('apiuser')),
              description=Optional(''),
              private=Optional(False),
              clone_uri=Optional(None),
              landing_rev=Optional(None),
              copy_permissions=Optional(False)):
    """
    Creates a fork of the specified |repo|.

    * If the fork_name contains "/", the fork will be created inside
      a repository group or nested repository groups.

      For example "foo/bar/fork-repo" will create a fork called "fork-repo"
      inside group "foo/bar". You have to have permissions to access and
      write to the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least read
    permissions on the |repo| being forked, and fork-creation permission
    for the user.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set repository name or repository ID.
    :type repoid: str or int
    :param fork_name: Set the fork name, including its repository group
        membership.
    :type fork_name: str
    :param owner: Set the fork owner.
    :type owner: str
    :param description: Set the fork description.
    :type description: str
    :param copy_permissions: Copy permissions from parent |repo|. The
        default is False.
    :type copy_permissions: bool
    :param private: Make the fork private. The default is False.
    :type private: bool
    :param landing_rev: Set the landing revision, e.g. branch:default,
        book:dev, rev:abcd

    Example input:

    .. code-block:: bash

        id : <id_for_response>
        api_key : "<api_key>"
        args: {
            "repoid" : "<reponame or repo_id>",
            "fork_name": "<forkname>",
            "owner": "<username or user_id = Optional(=apiuser)>",
            "description": "<description>",
            "copy_permissions": "<bool>",
            "private": "<bool>",
            "landing_rev": "<landing_rev>"
        }

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Created fork of `<reponame>` as `<forkname>`",
            "success": true,
            "task": "<celery task id or None if done sync>"
        }
        error: null

    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name

    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for
        # this repo that we fork !
        _perms = ('repository.admin', 'repository.write', 'repository.read')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

        # check if the regular user has at least fork permissions as well
        if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser):
            raise JSONRPCForbidden()

    # check if user can set owner parameter
    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))
    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    private = Optional.extract(private)

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo.repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=fork_name,
            repo_type=repo.repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'fork_parent_id': repo.repo_id,

            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'description': schema_data['repo_description'],
            'private': schema_data['repo_private'],
            'copy_permissions': schema_data['repo_copy_permissions'],
            'landing_rev': schema_data['repo_landing_commit_ref'],
        }

        task = RepoModel().create_fork(data, cur_user=owner.user_id)
        # no commit, it's done in RepoModel, or async via celery
        task_id = get_task_id(task)

        return {
            'msg': 'Created fork of `%s` as `%s`' % (
                repo.repo_name, schema_data['repo_name']),
            'success': True, # cannot return the repo data here since fork
                             # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create fork %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to fork repository `%s` as `%s`' % (
                repo_name, schema_data['repo_name']))

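As the docstring notes, a `fork_name` containing "/" places the fork inside repository groups. A small client-side illustration of that split (purely illustrative; the names are made up):

```python
def split_fork_name(fork_name):
    """Split 'group/subgroup/name' into (group path, repo name)."""
    group_path, _, name = fork_name.rpartition('/')
    return group_path, name


group, name = split_fork_name('foo/bar/fork-repo')
# the fork "fork-repo" would be created inside group "foo/bar"
```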

@jsonrpc_method()
def delete_repo(request, apiuser, repoid, forks=Optional('')):
    """
    Deletes a repository.

    * When the `forks` parameter is set, it's possible to detach or delete
      forks of the deleted repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param forks: Set to `detach` or `delete` forks from the |repo|.
    :type forks: Optional(str)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Deleted repository `<reponame>`",
            "success": true
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        handle_forks = Optional.extract(forks)
        _forks_msg = ''
        _forks = [f for f in repo.forks]
        if handle_forks == 'detach':
            _forks_msg = ' ' + 'Detached %s forks' % len(_forks)
        elif handle_forks == 'delete':
            _forks_msg = ' ' + 'Deleted %s forks' % len(_forks)
        elif _forks:
            raise JSONRPCError(
                'Cannot delete `%s` it still contains attached forks' %
                (repo.repo_name,)
            )
        old_data = repo.get_api_data()
        # pass the extracted value, not the raw Optional wrapper
        RepoModel().delete(repo, forks=handle_forks)

        repo = audit_logger.RepoWrap(repo_id=None,
                                     repo_name=repo.repo_name)

        audit_logger.store_api(
            'repo.delete', action_data={'old_data': old_data},
            user=apiuser, repo=repo)

        ScmModel().mark_for_invalidation(repo_name, delete=True)
        Session().commit()
        return {
            'msg': 'Deleted repository `%s`%s' % (repo_name, _forks_msg),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to delete repo")
        raise JSONRPCError(
            'failed to delete repository `%s`' % (repo_name,)
        )

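The forks-handling branch in `delete_repo` can be summarised as a small decision function. A standalone sketch, not the real model code:

```python
def plan_delete(fork_count, handle_forks=''):
    """Mirror delete_repo's fork rules: detach/delete on request, else refuse."""
    if handle_forks == 'detach':
        return 'delete repo, detach %s forks' % fork_count
    elif handle_forks == 'delete':
        return 'delete repo and %s forks' % fork_count
    elif fork_count:
        # no forks handling requested, but forks exist -> refuse
        raise ValueError('repo still contains attached forks')
    return 'delete repo'
```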

# TODO: marcink, change name ?
@jsonrpc_method()
def invalidate_cache(request, apiuser, repoid, delete_keys=Optional(False)):
    """
    Invalidates the cache for the specified repository.

    This command can only be run using an |authtoken| with admin or write
    rights to the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param delete_keys: This deletes the invalidated keys instead of
        just flagging them.
    :type delete_keys: Optional(``True`` | ``False``)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'msg': Cache for repository `<repository name>` was invalidated,
            'repository': <repository name>
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred during cache invalidation action'
        }

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    delete = Optional.extract(delete_keys)
    try:
        ScmModel().mark_for_invalidation(repo.repo_name, delete=delete)
        return {
            'msg': 'Cache for repository `%s` was invalidated' % (repoid,),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to invalidate repo cache")
        raise JSONRPCError(
            'Error occurred during cache invalidation action'
        )

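A hypothetical request body for `invalidate_cache`; the token and repository name are placeholders, and `delete_keys` travels as a JSON boolean:

```python
import json

# Assemble the JSON-RPC body; 'SECRET_TOKEN' and 'my-repo' are placeholders.
request_body = json.dumps({
    'id': 1,
    'auth_token': 'SECRET_TOKEN',
    'method': 'invalidate_cache',
    'args': {'repoid': 'my-repo', 'delete_keys': True},
})

decoded = json.loads(request_body)  # what the server would see
```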

# TODO: marcink, change name ?
@jsonrpc_method()
def lock(request, apiuser, repoid, locked=Optional(None),
         userid=Optional(OAttr('apiuser'))):
    """
    Sets the lock state of the specified |repo| by the given user.
    For more information, see :ref:`repo-locking`.

    * If the ``userid`` option is not set, the repository is locked to the
      user who called the method.
    * If the ``locked`` parameter is not set, the current lock state of the
      repository is displayed.

    This command can only be run using an |authtoken| with admin or write
    rights to the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param locked: Sets the lock state.
    :type locked: Optional(``True`` | ``False``)
    :param userid: Set the repository lock to this user.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'repo': '<reponame>',
            'locked': <bool: lock state>,
            'locked_since': <int: lock timestamp>,
            'locked_by': <username of person who made the lock>,
            'lock_reason': <str: reason for locking>,
            'lock_state_changed': <bool: True if lock state has been changed in this request>,
            'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
            or
            'msg': 'Repo `<repository name>` not locked.'
            or
            'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred locking repository `<reponame>`'
        }
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least write permission for this repo !
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    # make sure normal user does not pass someone else userid,
    # he is not allowed to do that
    if not isinstance(userid, Optional) and userid != apiuser.user_id:
        raise JSONRPCError('userid is not the same as your user')

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)

    if isinstance(locked, Optional):
        lockobj = repo.locked

        if lockobj[0] is None:
            _d = {
                'repo': repo.repo_name,
                'locked': False,
                'locked_since': None,
                'locked_by': None,
                'lock_reason': None,
                'lock_state_changed': False,
                'msg': 'Repo `%s` not locked.' % repo.repo_name
            }
            return _d
        else:
            _user_id, _time, _reason = lockobj
            # resolve the user who actually holds the lock
            lock_user = get_user_or_error(_user_id)
            _d = {
                'repo': repo.repo_name,
                'locked': True,
                'locked_since': _time,
                'locked_by': lock_user.username,
                'lock_reason': _reason,
                'lock_state_changed': False,
                'msg': ('Repo `%s` locked by `%s` on `%s`.'
                        % (repo.repo_name, lock_user.username,
                           json.dumps(time_to_datetime(_time))))
            }
            return _d

    # force locked state through a flag
    else:
        locked = str2bool(locked)
        lock_reason = Repository.LOCK_API
        try:
            if locked:
                lock_time = time.time()
                Repository.lock(repo, user.user_id, lock_time, lock_reason)
            else:
                lock_time = None
                Repository.unlock(repo)
            _d = {
                'repo': repo.repo_name,
                'locked': locked,
                'locked_since': lock_time,
                'locked_by': user.username,
                'lock_reason': lock_reason,
                'lock_state_changed': True,
                'msg': ('User `%s` set lock state for repo `%s` to `%s`'
                        % (user.username, repo.repo_name, locked))
            }
            return _d
        except Exception:
            log.exception(
                "Exception occurred while trying to lock repository")
            raise JSONRPCError(
                'Error occurred locking repository `%s`' % repo.repo_name
            )

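The `locked` flag above is coerced with `str2bool`, so clients may send `"true"`/`"false"` strings as well as booleans. A simplified version of that coercion (a sketch; the real helper in rhodecode may accept more spellings):

```python
def str2bool(value):
    """Coerce common true/false spellings; a subset of the real helper."""
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')
```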

@jsonrpc_method()
def comment_commit(
        request, apiuser, repoid, commit_id, message, status=Optional(None),
        comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
        userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
    """
    Set a commit comment, and optionally change the status of the commit.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Specify the commit_id for which to set a comment.
    :type commit_id: str
    :param message: The comment text.
    :type message: str
    :param status: (**Optional**) status of commit, one of: 'not_reviewed',
        'approved', 'rejected', 'under_review'
    :type status: str
    :param comment_type: Comment type, one of: 'note', 'todo'
    :type comment_type: Optional(str), default: 'note'
    :param resolves_comment_id: id of comment which this one will resolve
    :type resolves_comment_id: Optional(int)
    :param extra_recipients: list of user ids or usernames to add
        notifications for this comment. Acts like a CC for notification
    :type extra_recipients: Optional(list)
    :param userid: Set the user name of the comment creator.
    :type userid: Optional(str or int)
    :param send_email: Define if this comment should also send email notification
    :type send_email: Optional(bool)

    Example output:

    .. code-block:: bash

        {
            "id" : <id_given_in_input>,
            "result" : {
                "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
                "status_change": null or <status>,
                "success": true
            },
            "error" : null
        }

    """
    _ = request.translate

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.read', 'repository.write', 'repository.admin')
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    db_repo_name = repo.repo_name

    try:
        commit = repo.scm_instance().get_commit(commit_id=commit_id)
        commit_id = commit.raw_id
    except Exception as e:
        log.exception('Failed to fetch commit')
        raise JSONRPCError(safe_str(e))

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)
    status = Optional.extract(status)
    comment_type = Optional.extract(comment_type)
    resolves_comment_id = Optional.extract(resolves_comment_id)
    extra_recipients = Optional.extract(extra_recipients)
    send_email = Optional.extract(send_email, binary=True)

    allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
    if status and status not in allowed_statuses:
        raise JSONRPCError('Bad status, must be one '
                           'of %s got %s' % (allowed_statuses, status,))

    if resolves_comment_id:
        comment = ChangesetComment.get(resolves_comment_id)
        if not comment:
            raise JSONRPCError(
                'Invalid resolves_comment_id `%s` for this commit.'
                % resolves_comment_id)
        if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
            raise JSONRPCError(
                'Comment `%s` is wrong type for setting status to resolved.'
                % resolves_comment_id)

    try:
        rc_config = SettingsModel().get_all_settings()
        renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
        status_change_label = ChangesetStatus.get_status_lbl(status)
        comment = CommentsModel().create(
            message, repo, user, commit_id=commit_id,
            status_change=status_change_label,
            status_change_type=status,
            renderer=renderer,
            comment_type=comment_type,
            resolves_comment_id=resolves_comment_id,
            auth_user=apiuser,
            extra_recipients=extra_recipients,
            send_email=send_email
        )
        is_inline = comment.is_inline

        if status:
            # also do a status change
            try:
                ChangesetStatusModel().set_status(
                    repo, status, user, comment, revision=commit_id,
                    dont_allow_on_closed_pull_request=True
                )
            except StatusChangeOnClosedPullRequestError:
                log.exception(
                    "Exception occurred while trying to change repo commit status")
                msg = ('Changing status on a commit associated with '
                       'a closed pull request is not allowed')
                raise JSONRPCError(msg)

        CommentsModel().trigger_commit_comment_hook(
            repo, apiuser, 'create',
            data={'comment': comment, 'commit': commit})

        Session().commit()

        comment_broadcast_channel = channelstream.comment_channel(
            db_repo_name, commit_obj=commit)

        comment_data = {'comment': comment, 'comment_id': comment.comment_id}
        comment_type = 'inline' if is_inline else 'general'
        channelstream.comment_channelstream_push(
            request, comment_broadcast_channel, apiuser,
            _('posted a new {} comment').format(comment_type),
            comment_data=comment_data)

        return {
            'msg': (
                'Commented on commit `%s` for repository `%s`' % (
                    comment.revision, repo.repo_name)),
            'status_change': status,
            'success': True,
        }
    except JSONRPCError:
        # catch any inside errors and re-raise them, so the global
        # catch below does not silence them
        raise
    except Exception:
        log.exception("Exception occurred while trying to comment on commit")
        raise JSONRPCError(
            'failed to set comment on repository `%s`' % (repo.repo_name,)
        )

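The status check in `comment_commit` can be exercised in isolation. A minimal sketch, with the allowed list copied from the docstring rather than pulled from `ChangesetStatus`:

```python
# Allowed review statuses, per the comment_commit docstring.
ALLOWED_STATUSES = ['not_reviewed', 'approved', 'rejected', 'under_review']


def validate_status(status):
    """Reject unknown review statuses, mirroring comment_commit's check."""
    if status and status not in ALLOWED_STATUSES:
        raise ValueError(
            'Bad status, must be one of %s got %s' % (ALLOWED_STATUSES, status))
    return status
```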
1705 1706
1706 1707 @jsonrpc_method()
1707 1708 def get_repo_comments(request, apiuser, repoid,
1708 1709 commit_id=Optional(None), comment_type=Optional(None),
1709 1710 userid=Optional(None)):
1710 1711 """
1711 1712 Get all comments for a repository
1712 1713
1713 1714 :param apiuser: This is filled automatically from the |authtoken|.
1714 1715 :type apiuser: AuthUser
1715 1716 :param repoid: Set the repository name or repository ID.
1716 1717 :type repoid: str or int
1717 1718 :param commit_id: Optionally filter the comments by the commit_id
1718 1719 :type commit_id: Optional(str), default: None
1719 1720 :param comment_type: Optionally filter the comments by the comment_type
1720 1721 one of: 'note', 'todo'
1721 1722 :type comment_type: Optional(str), default: None
1722 1723 :param userid: Optionally filter the comments by the author of comment
1723 1724 :type userid: Optional(str or int), Default: None
1724 1725
1725 1726 Example error output:
1726 1727
1727 1728 .. code-block:: bash
1728 1729
1729 1730 {
1730 1731 "id" : <id_given_in_input>,
1731 1732 "result" : [
1732 1733 {
1733 1734 "comment_author": <USER_DETAILS>,
1734 1735 "comment_created_on": "2017-02-01T14:38:16.309",
1735 1736 "comment_f_path": "file.txt",
1736 1737 "comment_id": 282,
1737 1738 "comment_lineno": "n1",
1738 1739 "comment_resolved_by": null,
1739 1740 "comment_status": [],
1740 1741 "comment_text": "This file needs a header",
1741 1742 "comment_type": "todo",
1742 1743 "comment_last_version": 0
1743 1744 }
1744 1745 ],
1745 1746 "error" : null
1746 1747 }
1747 1748
1748 1749 """
1749 1750 repo = get_repo_or_error(repoid)
1750 1751 if not has_superadmin_permission(apiuser):
1751 1752 _perms = ('repository.read', 'repository.write', 'repository.admin')
1752 1753 validate_repo_permissions(apiuser, repoid, repo, _perms)
1753 1754
1754 1755 commit_id = Optional.extract(commit_id)
1755 1756
1756 1757 userid = Optional.extract(userid)
1757 1758 if userid:
1758 1759 user = get_user_or_error(userid)
1759 1760 else:
1760 1761 user = None
1761 1762
1762 1763 comment_type = Optional.extract(comment_type)
1763 1764 if comment_type and comment_type not in ChangesetComment.COMMENT_TYPES:
1764 1765 raise JSONRPCError(
1765 1766 'comment_type must be one of `{}` got {}'.format(
1766 1767 ChangesetComment.COMMENT_TYPES, comment_type)
1767 1768 )
1768 1769
1769 1770 comments = CommentsModel().get_repository_comments(
1770 1771 repo=repo, comment_type=comment_type, user=user, commit_id=commit_id)
1771 1772 return comments
1772 1773
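A client calls ``get_repo_comments`` by POSTing a JSON-RPC body with ``id``, ``auth_token``, ``method`` and ``args``. A minimal sketch of building that body, with the same ``comment_type`` validation the server performs (the token value is a placeholder, and the tuple of types is assumed to match ``ChangesetComment.COMMENT_TYPES``):

```python
# Sketch: build the JSON-RPC request body for get_repo_comments.
# 'TOKEN' is a placeholder auth token; COMMENT_TYPES is assumed to
# mirror ChangesetComment.COMMENT_TYPES on the server.
COMMENT_TYPES = ('note', 'todo')

def build_get_repo_comments_request(request_id, auth_token, repoid,
                                    commit_id=None, comment_type=None,
                                    userid=None):
    if comment_type is not None and comment_type not in COMMENT_TYPES:
        raise ValueError('comment_type must be one of {} got {}'.format(
            COMMENT_TYPES, comment_type))
    args = {'repoid': repoid}
    if commit_id is not None:
        args['commit_id'] = commit_id
    if comment_type is not None:
        args['comment_type'] = comment_type
    if userid is not None:
        args['userid'] = userid
    return {
        'id': request_id,
        'auth_token': auth_token,
        'method': 'get_repo_comments',
        'args': args,
    }
```

Optional parameters are simply omitted from ``args``, matching their ``Optional(None)`` defaults above.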
1773 1774
1774 1775 @jsonrpc_method()
1775 1776 def get_comment(request, apiuser, comment_id):
1776 1777 """
1777 1778 Get a single comment from a repository or pull request
1778 1779
1779 1780 :param apiuser: This is filled automatically from the |authtoken|.
1780 1781 :type apiuser: AuthUser
1781 1782 :param comment_id: comment id found in the URL of comment
1782 1783 :type comment_id: str or int
1783 1784
1784 1785 Example output:
1785 1786
1786 1787 .. code-block:: bash
1787 1788
1788 1789 {
1789 1790 "id" : <id_given_in_input>,
1790 1791 "result" : {
1791 1792 "comment_author": <USER_DETAILS>,
1792 1793 "comment_created_on": "2017-02-01T14:38:16.309",
1793 1794 "comment_f_path": "file.txt",
1794 1795 "comment_id": 282,
1795 1796 "comment_lineno": "n1",
1796 1797 "comment_resolved_by": null,
1797 1798 "comment_status": [],
1798 1799 "comment_text": "This file needs a header",
1799 1800 "comment_type": "todo",
1800 1801 "comment_last_version": 0
1801 1802 },
1802 1803 "error" : null
1803 1804 }
1804 1805
1805 1806 """
1806 1807
1807 1808 comment = ChangesetComment.get(comment_id)
1808 1809 if not comment:
1809 1810 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1810 1811
1811 1812 perms = ('repository.read', 'repository.write', 'repository.admin')
1812 1813 has_comment_perm = HasRepoPermissionAnyApi(*perms)\
1813 1814 (user=apiuser, repo_name=comment.repo.repo_name)
1814 1815
1815 1816 if not has_comment_perm:
1816 1817 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1817 1818
1818 1819 return comment
1819 1820
1820 1821
1821 1822 @jsonrpc_method()
1822 1823 def edit_comment(request, apiuser, message, comment_id, version,
1823 1824 userid=Optional(OAttr('apiuser'))):
1824 1825 """
1825 1826 Edit a comment on a pull request or commit,
1826 1827 identified by `comment_id` and `version`. Initially the version should be 0.
1827 1828
1828 1829 :param apiuser: This is filled automatically from the |authtoken|.
1829 1830 :type apiuser: AuthUser
1830 1831 :param comment_id: Specify the comment_id for editing
1831 1832 :type comment_id: int
1832 1833 :param version: version of the comment that will be created, starts from 0
1833 1834 :type version: int
1834 1835 :param message: The text content of the comment.
1835 1836 :type message: str
1836 1837 :param userid: Edit the comment as this user
1837 1838 :type userid: Optional(str or int)
1838 1839
1839 1840 Example output:
1840 1841
1841 1842 .. code-block:: bash
1842 1843
1843 1844 id : <id_given_in_input>
1844 1845 result : {
1845 1846 "comment": "<comment data>",
1846 1847 "version": "<Integer>",
1847 1848 },
1848 1849 error : null
1849 1850 """
1850 1851
1851 1852 auth_user = apiuser
1852 1853 comment = ChangesetComment.get(comment_id)
1853 1854 if not comment:
1854 1855 raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1855 1856
1856 1857 is_super_admin = has_superadmin_permission(apiuser)
1857 1858 is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1858 1859 (user=apiuser, repo_name=comment.repo.repo_name)
1859 1860
1860 1861 if not isinstance(userid, Optional):
1861 1862 if is_super_admin or is_repo_admin:
1862 1863 apiuser = get_user_or_error(userid)
1863 1864 auth_user = apiuser.AuthUser()
1864 1865 else:
1865 1866 raise JSONRPCError('userid is not the same as your user')
1866 1867
1867 1868 comment_author = comment.author.user_id == auth_user.user_id
1868 1869 if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1869 1870 raise JSONRPCError("you don't have access to edit this comment")
1870 1871
1871 1872 try:
1872 1873 comment_history = CommentsModel().edit(
1873 1874 comment_id=comment_id,
1874 1875 text=message,
1875 1876 auth_user=auth_user,
1876 1877 version=version,
1877 1878 )
1878 1879 Session().commit()
1879 1880 except CommentVersionMismatch:
1880 1881 raise JSONRPCError(
1881 1882 'comment ({}) version ({}) mismatch'.format(comment_id, version)
1882 1883 )
1883 1884 if not comment_history and not message:
1884 1885 raise JSONRPCError(
1885 1886 "comment ({}) can't be changed with empty string".format(comment_id)
1886 1887 )
1887 1888
1888 1889 if comment.pull_request:
1889 1890 pull_request = comment.pull_request
1890 1891 PullRequestModel().trigger_pull_request_hook(
1891 1892 pull_request, apiuser, 'comment_edit',
1892 1893 data={'comment': comment})
1893 1894 else:
1894 1895 db_repo = comment.repo
1895 1896 commit_id = comment.revision
1896 1897 commit = db_repo.get_commit(commit_id)
1897 1898 CommentsModel().trigger_commit_comment_hook(
1898 1899 db_repo, apiuser, 'edit',
1899 1900 data={'comment': comment, 'commit': commit})
1900 1901
1901 1902 data = {
1902 1903 'comment': comment,
1903 1904 'version': comment_history.version if comment_history else None,
1904 1905 }
1905 1906 return data
1906 1907
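``edit_comment`` uses the ``version`` argument for optimistic concurrency: the caller must pass the comment's current version, and a stale value raises ``CommentVersionMismatch``. A toy sketch of that rule (class and names are illustrative, not the ``CommentsModel`` API):

```python
# Toy sketch of the optimistic-versioning rule behind edit_comment:
# a stale expected_version rejects the edit, mirroring
# CommentVersionMismatch; a successful edit bumps the version.
class VersionMismatch(Exception):
    pass

class Comment:
    def __init__(self, text):
        self.text = text
        self.version = 0  # initially the version is 0

    def edit(self, new_text, expected_version):
        if expected_version != self.version:
            raise VersionMismatch(
                'comment version ({}) mismatch'.format(expected_version))
        self.text = new_text
        self.version += 1
        return self.version
```

So two clients editing concurrently cannot silently overwrite each other: the second edit fails until it re-reads the new version.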
1907 1908
1908 1909 # TODO(marcink): write this with all required logic for deleting comments in PR or commits
1909 1910 # @jsonrpc_method()
1910 1911 # def delete_comment(request, apiuser, comment_id):
1911 1912 # auth_user = apiuser
1912 1913 #
1913 1914 # comment = ChangesetComment.get(comment_id)
1914 1915 # if not comment:
1915 1916 # raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1916 1917 #
1917 1918 # is_super_admin = has_superadmin_permission(apiuser)
1918 1919 # is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1919 1920 # (user=apiuser, repo_name=comment.repo.repo_name)
1920 1921 #
1921 1922 # comment_author = comment.author.user_id == auth_user.user_id
1922 1923 # if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1923 1924 # raise JSONRPCError("you don't have access to delete this comment")
1924 1925
1925 1926 @jsonrpc_method()
1926 1927 def grant_user_permission(request, apiuser, repoid, userid, perm):
1927 1928 """
1928 1929 Grant permissions for the specified user on the given repository,
1929 1930 or update existing permissions if found.
1930 1931
1931 1932 This command can only be run using an |authtoken| with admin
1932 1933 permissions on the |repo|.
1933 1934
1934 1935 :param apiuser: This is filled automatically from the |authtoken|.
1935 1936 :type apiuser: AuthUser
1936 1937 :param repoid: Set the repository name or repository ID.
1937 1938 :type repoid: str or int
1938 1939 :param userid: Set the user name.
1939 1940 :type userid: str
1940 1941 :param perm: Set the user permissions, using the following format
1941 1942 ``(repository.(none|read|write|admin))``
1942 1943 :type perm: str
1943 1944
1944 1945 Example output:
1945 1946
1946 1947 .. code-block:: bash
1947 1948
1948 1949 id : <id_given_in_input>
1949 1950 result: {
1950 1951 "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
1951 1952 "success": true
1952 1953 }
1953 1954 error: null
1954 1955 """
1955 1956
1956 1957 repo = get_repo_or_error(repoid)
1957 1958 user = get_user_or_error(userid)
1958 1959 perm = get_perm_or_error(perm)
1959 1960 if not has_superadmin_permission(apiuser):
1960 1961 _perms = ('repository.admin',)
1961 1962 validate_repo_permissions(apiuser, repoid, repo, _perms)
1962 1963
1963 1964 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1964 1965 try:
1965 1966 changes = RepoModel().update_permissions(
1966 1967 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1967 1968
1968 1969 action_data = {
1969 1970 'added': changes['added'],
1970 1971 'updated': changes['updated'],
1971 1972 'deleted': changes['deleted'],
1972 1973 }
1973 1974 audit_logger.store_api(
1974 1975 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1975 1976 Session().commit()
1976 1977 PermissionModel().flush_user_permission_caches(changes)
1977 1978
1978 1979 return {
1979 1980 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1980 1981 perm.permission_name, user.username, repo.repo_name
1981 1982 ),
1982 1983 'success': True
1983 1984 }
1984 1985 except Exception:
1985 1986 log.exception("Exception occurred while trying to edit permissions for repo")
1986 1987 raise JSONRPCError(
1987 1988 'failed to edit permission for user: `%s` in repo: `%s`' % (
1988 1989 userid, repoid
1989 1990 )
1990 1991 )
1991 1992
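The ``perm`` string must match the format ``(repository.(none|read|write|admin))`` documented above; otherwise ``get_perm_or_error`` rejects the call. A client can pre-validate the string with a small check (the level list is taken from the docstring; the helper name is illustrative):

```python
import re

# Validate a repository permission string of the form
# repository.(none|read|write|admin) before sending it to the API.
REPO_PERM_RE = re.compile(r'^repository\.(none|read|write|admin)$')

def is_valid_repo_perm(perm):
    return bool(REPO_PERM_RE.match(perm))
```

This catches typos like ``repository.push`` locally instead of waiting for a JSON-RPC error round-trip.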
1992 1993
1993 1994 @jsonrpc_method()
1994 1995 def revoke_user_permission(request, apiuser, repoid, userid):
1995 1996 """
1996 1997 Revoke permission for a user on the specified repository.
1997 1998
1998 1999 This command can only be run using an |authtoken| with admin
1999 2000 permissions on the |repo|.
2000 2001
2001 2002 :param apiuser: This is filled automatically from the |authtoken|.
2002 2003 :type apiuser: AuthUser
2003 2004 :param repoid: Set the repository name or repository ID.
2004 2005 :type repoid: str or int
2005 2006 :param userid: Set the user name of revoked user.
2006 2007 :type userid: str or int
2007 2008
2008 2009 Example output:
2009 2010
2010 2011 .. code-block:: bash
2011 2012
2012 2013 id : <id_given_in_input>
2013 2014 result: {
2014 2015 "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
2015 2016 "success": true
2016 2017 }
2017 2018 error: null
2018 2019 """
2019 2020
2020 2021 repo = get_repo_or_error(repoid)
2021 2022 user = get_user_or_error(userid)
2022 2023 if not has_superadmin_permission(apiuser):
2023 2024 _perms = ('repository.admin',)
2024 2025 validate_repo_permissions(apiuser, repoid, repo, _perms)
2025 2026
2026 2027 perm_deletions = [[user.user_id, None, "user"]]
2027 2028 try:
2028 2029 changes = RepoModel().update_permissions(
2029 2030 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2030 2031
2031 2032 action_data = {
2032 2033 'added': changes['added'],
2033 2034 'updated': changes['updated'],
2034 2035 'deleted': changes['deleted'],
2035 2036 }
2036 2037 audit_logger.store_api(
2037 2038 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2038 2039 Session().commit()
2039 2040 PermissionModel().flush_user_permission_caches(changes)
2040 2041
2041 2042 return {
2042 2043 'msg': 'Revoked perm for user: `%s` in repo: `%s`' % (
2043 2044 user.username, repo.repo_name
2044 2045 ),
2045 2046 'success': True
2046 2047 }
2047 2048 except Exception:
2048 2049 log.exception("Exception occurred while trying to revoke permissions on repo")
2049 2050 raise JSONRPCError(
2050 2051 'failed to edit permission for user: `%s` in repo: `%s`' % (
2051 2052 userid, repoid
2052 2053 )
2053 2054 )
2054 2055
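Both grant and revoke build the same triple format consumed by ``RepoModel().update_permissions``: ``[subject_id, permission_name, subject_type]``, where a ``None`` permission name marks a deletion. A small sketch of assembling such change lists (helper name is illustrative):

```python
# Sketch: assemble the permission-change triples passed to
# update_permissions. A None permission name marks a revocation.
def make_perm_changes(user_grants=(), user_revokes=()):
    additions = [[uid, perm, 'user'] for uid, perm in user_grants]
    deletions = [[uid, None, 'user'] for uid in user_revokes]
    return additions, deletions
```

The user-group variants below use the same shape with ``'user_group'`` as the subject type.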
2055 2056
2056 2057 @jsonrpc_method()
2057 2058 def grant_user_group_permission(request, apiuser, repoid, usergroupid, perm):
2058 2059 """
2059 2060 Grant permission for a user group on the specified repository,
2060 2061 or update existing permissions.
2061 2062
2062 2063 This command can only be run using an |authtoken| with admin
2063 2064 permissions on the |repo|.
2064 2065
2065 2066 :param apiuser: This is filled automatically from the |authtoken|.
2066 2067 :type apiuser: AuthUser
2067 2068 :param repoid: Set the repository name or repository ID.
2068 2069 :type repoid: str or int
2069 2070 :param usergroupid: Specify the ID of the user group.
2070 2071 :type usergroupid: str or int
2071 2072 :param perm: Set the user group permissions using the following
2072 2073 format: (repository.(none|read|write|admin))
2073 2074 :type perm: str
2074 2075
2075 2076 Example output:
2076 2077
2077 2078 .. code-block:: bash
2078 2079
2079 2080 id : <id_given_in_input>
2080 2081 result : {
2081 2082 "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
2082 2083 "success": true
2083 2084
2084 2085 }
2085 2086 error : null
2086 2087
2087 2088 Example error output:
2088 2089
2089 2090 .. code-block:: bash
2090 2091
2091 2092 id : <id_given_in_input>
2092 2093 result : null
2093 2094 error : {
2094 2095 "failed to edit permission for user group: `<usergroup>` in repo `<repo>`"
2095 2096 }
2096 2097
2097 2098 """
2098 2099
2099 2100 repo = get_repo_or_error(repoid)
2100 2101 perm = get_perm_or_error(perm)
2101 2102 if not has_superadmin_permission(apiuser):
2102 2103 _perms = ('repository.admin',)
2103 2104 validate_repo_permissions(apiuser, repoid, repo, _perms)
2104 2105
2105 2106 user_group = get_user_group_or_error(usergroupid)
2106 2107 if not has_superadmin_permission(apiuser):
2107 2108 # check if we have at least read permission for this user group !
2108 2109 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2109 2110 if not HasUserGroupPermissionAnyApi(*_perms)(
2110 2111 user=apiuser, user_group_name=user_group.users_group_name):
2111 2112 raise JSONRPCError(
2112 2113 'user group `%s` does not exist' % (usergroupid,))
2113 2114
2114 2115 perm_additions = [[user_group.users_group_id, perm.permission_name, "user_group"]]
2115 2116 try:
2116 2117 changes = RepoModel().update_permissions(
2117 2118 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
2118 2119 action_data = {
2119 2120 'added': changes['added'],
2120 2121 'updated': changes['updated'],
2121 2122 'deleted': changes['deleted'],
2122 2123 }
2123 2124 audit_logger.store_api(
2124 2125 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2125 2126 Session().commit()
2126 2127 PermissionModel().flush_user_permission_caches(changes)
2127 2128
2128 2129 return {
2129 2130 'msg': 'Granted perm: `%s` for user group: `%s` in '
2130 2131 'repo: `%s`' % (
2131 2132 perm.permission_name, user_group.users_group_name,
2132 2133 repo.repo_name
2133 2134 ),
2134 2135 'success': True
2135 2136 }
2136 2137 except Exception:
2137 2138 log.exception(
2138 2139 "Exception occurred while trying change permission on repo")
2139 2140 raise JSONRPCError(
2140 2141 'failed to edit permission for user group: `%s` in '
2141 2142 'repo: `%s`' % (
2142 2143 usergroupid, repo.repo_name
2143 2144 )
2144 2145 )
2145 2146
2146 2147
2147 2148 @jsonrpc_method()
2148 2149 def revoke_user_group_permission(request, apiuser, repoid, usergroupid):
2149 2150 """
2150 2151 Revoke the permissions of a user group on a given repository.
2151 2152
2152 2153 This command can only be run using an |authtoken| with admin
2153 2154 permissions on the |repo|.
2154 2155
2155 2156 :param apiuser: This is filled automatically from the |authtoken|.
2156 2157 :type apiuser: AuthUser
2157 2158 :param repoid: Set the repository name or repository ID.
2158 2159 :type repoid: str or int
2159 2160 :param usergroupid: Specify the user group ID.
2160 2161 :type usergroupid: str or int
2161 2162
2162 2163 Example output:
2163 2164
2164 2165 .. code-block:: bash
2165 2166
2166 2167 id : <id_given_in_input>
2167 2168 result: {
2168 2169 "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
2169 2170 "success": true
2170 2171 }
2171 2172 error: null
2172 2173 """
2173 2174
2174 2175 repo = get_repo_or_error(repoid)
2175 2176 if not has_superadmin_permission(apiuser):
2176 2177 _perms = ('repository.admin',)
2177 2178 validate_repo_permissions(apiuser, repoid, repo, _perms)
2178 2179
2179 2180 user_group = get_user_group_or_error(usergroupid)
2180 2181 if not has_superadmin_permission(apiuser):
2181 2182 # check if we have at least read permission for this user group !
2182 2183 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2183 2184 if not HasUserGroupPermissionAnyApi(*_perms)(
2184 2185 user=apiuser, user_group_name=user_group.users_group_name):
2185 2186 raise JSONRPCError(
2186 2187 'user group `%s` does not exist' % (usergroupid,))
2187 2188
2188 2189 perm_deletions = [[user_group.users_group_id, None, "user_group"]]
2189 2190 try:
2190 2191 changes = RepoModel().update_permissions(
2191 2192 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2192 2193 action_data = {
2193 2194 'added': changes['added'],
2194 2195 'updated': changes['updated'],
2195 2196 'deleted': changes['deleted'],
2196 2197 }
2197 2198 audit_logger.store_api(
2198 2199 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2199 2200 Session().commit()
2200 2201 PermissionModel().flush_user_permission_caches(changes)
2201 2202
2202 2203 return {
2203 2204 'msg': 'Revoked perm for user group: `%s` in repo: `%s`' % (
2204 2205 user_group.users_group_name, repo.repo_name
2205 2206 ),
2206 2207 'success': True
2207 2208 }
2208 2209 except Exception:
2208 2209 log.exception("Exception occurred while trying to revoke "
2210 2211 "user group permission on repo")
2211 2212 raise JSONRPCError(
2212 2213 'failed to edit permission for user group: `%s` in '
2213 2214 'repo: `%s`' % (
2214 2215 user_group.users_group_name, repo.repo_name
2215 2216 )
2216 2217 )
2217 2218
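Every grant/revoke method above projects the ``changes`` dict returned by ``update_permissions`` into the same ``action_data`` payload for the audit log. That projection is just a key selection, sketched here (helper name is illustrative):

```python
# Sketch: project the `changes` dict returned by update_permissions
# into the added/updated/deleted payload stored in the audit log.
def to_action_data(changes):
    return {key: changes[key] for key in ('added', 'updated', 'deleted')}
```

Keeping the projection in one place would avoid repeating the three-key dict literal in each method.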
2218 2219
2219 2220 @jsonrpc_method()
2220 2221 def pull(request, apiuser, repoid, remote_uri=Optional(None)):
2221 2222 """
2222 2223 Triggers a pull on the given repository from a remote location. You
2223 2224 can use this to keep remote repositories up-to-date.
2224 2225
2225 2226 This command can only be run using an |authtoken| with admin
2226 2227 rights to the specified repository. For more information,
2227 2228 see :ref:`config-token-ref`.
2228 2229
2229 2230 This command takes the following options:
2230 2231
2231 2232 :param apiuser: This is filled automatically from the |authtoken|.
2232 2233 :type apiuser: AuthUser
2233 2234 :param repoid: The repository name or repository ID.
2234 2235 :type repoid: str or int
2235 2236 :param remote_uri: Optional remote URI to pass in for pull
2236 2237 :type remote_uri: str
2237 2238
2238 2239 Example output:
2239 2240
2240 2241 .. code-block:: bash
2241 2242
2242 2243 id : <id_given_in_input>
2243 2244 result : {
2244 2245 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`",
2245 2246 "repository": "<repository name>"
2246 2247 }
2247 2248 error : null
2248 2249
2249 2250 Example error output:
2250 2251
2251 2252 .. code-block:: bash
2252 2253
2253 2254 id : <id_given_in_input>
2254 2255 result : null
2255 2256 error : {
2256 2257 "Unable to pull changes from `<remote_url>`"
2257 2258 }
2258 2259
2259 2260 """
2260 2261
2261 2262 repo = get_repo_or_error(repoid)
2262 2263 remote_uri = Optional.extract(remote_uri)
2263 2264 remote_uri_display = remote_uri or repo.clone_uri_hidden
2264 2265 if not has_superadmin_permission(apiuser):
2265 2266 _perms = ('repository.admin',)
2266 2267 validate_repo_permissions(apiuser, repoid, repo, _perms)
2267 2268
2268 2269 try:
2269 2270 ScmModel().pull_changes(
2270 2271 repo.repo_name, apiuser.username, remote_uri=remote_uri)
2271 2272 return {
2272 2273 'msg': 'Pulled from url `%s` on repo `%s`' % (
2273 2274 remote_uri_display, repo.repo_name),
2274 2275 'repository': repo.repo_name
2275 2276 }
2276 2277 except Exception:
2277 2278 log.exception("Exception occurred while trying to "
2278 2279 "pull changes from remote location")
2279 2280 raise JSONRPCError(
2280 2281 'Unable to pull changes from `%s`' % remote_uri_display
2281 2282 )
2282 2283
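In ``pull``, the URI shown in the success message is the explicit ``remote_uri`` when given, otherwise the repository's stored (hidden) clone URI. A sketch of that result construction (function name is illustrative):

```python
# Sketch: the success payload built by `pull`. An explicit remote_uri
# wins for display; otherwise the stored clone URI is shown.
def pull_result(repo_name, remote_uri=None, clone_uri_hidden=None):
    shown = remote_uri or clone_uri_hidden
    return {
        'msg': 'Pulled from url `%s` on repo `%s`' % (shown, repo_name),
        'repository': repo_name,
    }
```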
2283 2284
2284 2285 @jsonrpc_method()
2285 2286 def strip(request, apiuser, repoid, revision, branch):
2286 2287 """
2287 2288 Strips the given revision from the specified repository.
2288 2289
2289 2290 * This will remove the revision and all of its descendants.
2290 2291
2291 2292 This command can only be run using an |authtoken| with admin rights to
2292 2293 the specified repository.
2293 2294
2294 2295 This command takes the following options:
2295 2296
2296 2297 :param apiuser: This is filled automatically from the |authtoken|.
2297 2298 :type apiuser: AuthUser
2298 2299 :param repoid: The repository name or repository ID.
2299 2300 :type repoid: str or int
2300 2301 :param revision: The revision you wish to strip.
2301 2302 :type revision: str
2302 2303 :param branch: The branch from which to strip the revision.
2303 2304 :type branch: str
2304 2305
2305 2306 Example output:
2306 2307
2307 2308 .. code-block:: bash
2308 2309
2309 2310 id : <id_given_in_input>
2310 2311 result : {
2311 2312 "msg": "Stripped commit <commit_hash> from repo `<repository name>`",
2312 2313 "repository": "<repository name>"
2313 2314 }
2314 2315 error : null
2315 2316
2316 2317 Example error output:
2317 2318
2318 2319 .. code-block:: bash
2319 2320
2320 2321 id : <id_given_in_input>
2321 2322 result : null
2322 2323 error : {
2323 2324 "Unable to strip commit <commit_hash> from repo `<repository name>`"
2324 2325 }
2325 2326
2326 2327 """
2327 2328
2328 2329 repo = get_repo_or_error(repoid)
2329 2330 if not has_superadmin_permission(apiuser):
2330 2331 _perms = ('repository.admin',)
2331 2332 validate_repo_permissions(apiuser, repoid, repo, _perms)
2332 2333
2333 2334 try:
2334 2335 ScmModel().strip(repo, revision, branch)
2335 2336 audit_logger.store_api(
2336 2337 'repo.commit.strip', action_data={'commit_id': revision},
2337 2338 repo=repo,
2338 2339 user=apiuser, commit=True)
2339 2340
2340 2341 return {
2341 2342 'msg': 'Stripped commit %s from repo `%s`' % (
2342 2343 revision, repo.repo_name),
2343 2344 'repository': repo.repo_name
2344 2345 }
2345 2346 except Exception:
2346 2347 log.exception("Exception while trying to strip")
2347 2348 raise JSONRPCError(
2348 2349 'Unable to strip commit %s from repo `%s`' % (
2349 2350 revision, repo.repo_name)
2350 2351 )
2351 2352
2352 2353
2353 2354 @jsonrpc_method()
2354 2355 def get_repo_settings(request, apiuser, repoid, key=Optional(None)):
2355 2356 """
2356 2357 Returns all settings for a repository. If key is given it only returns the
2357 2358 setting identified by the key or null.
2358 2359
2359 2360 :param apiuser: This is filled automatically from the |authtoken|.
2360 2361 :type apiuser: AuthUser
2361 2362 :param repoid: The repository name or repository id.
2362 2363 :type repoid: str or int
2363 2364 :param key: Key of the setting to return.
2364 2365 :type key: Optional(str)
2365 2366
2366 2367 Example output:
2367 2368
2368 2369 .. code-block:: bash
2369 2370
2370 2371 {
2371 2372 "error": null,
2372 2373 "id": 237,
2373 2374 "result": {
2374 2375 "extensions_largefiles": true,
2375 2376 "extensions_evolve": true,
2376 2377 "hooks_changegroup_push_logger": true,
2377 2378 "hooks_changegroup_repo_size": false,
2378 2379 "hooks_outgoing_pull_logger": true,
2379 2380 "phases_publish": "True",
2380 2381 "rhodecode_hg_use_rebase_for_merging": true,
2381 2382 "rhodecode_pr_merge_enabled": true,
2382 2383 "rhodecode_use_outdated_comments": true
2383 2384 }
2384 2385 }
2385 2386 """
2386 2387
2387 2388 # Restrict access to this api method to super-admins, and repo admins only.
2388 2389 repo = get_repo_or_error(repoid)
2389 2390 if not has_superadmin_permission(apiuser):
2390 2391 _perms = ('repository.admin',)
2391 2392 validate_repo_permissions(apiuser, repoid, repo, _perms)
2392 2393
2393 2394 try:
2394 2395 settings_model = VcsSettingsModel(repo=repo)
2395 2396 settings = settings_model.get_global_settings()
2396 2397 settings.update(settings_model.get_repo_settings())
2397 2398
2398 2399 # If only a single setting is requested fetch it from all settings.
2399 2400 key = Optional.extract(key)
2400 2401 if key is not None:
2401 2402 settings = settings.get(key, None)
2402 2403 except Exception:
2403 2404 msg = 'Failed to fetch settings for repository `{}`'.format(repoid)
2404 2405 log.exception(msg)
2405 2406 raise JSONRPCError(msg)
2406 2407
2407 2408 return settings
2408 2409
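``get_repo_settings`` layers repo-level settings over the global ones and, when ``key`` is given, narrows the result to that single value (or ``None`` if absent). The layering is plain dict semantics, sketched here (function name is illustrative):

```python
# Sketch of the settings resolution in get_repo_settings:
# repo-level values override global ones; an optional key narrows
# the result, with missing keys yielding None.
def resolve_settings(global_settings, repo_settings, key=None):
    settings = dict(global_settings)
    settings.update(repo_settings)
    if key is not None:
        return settings.get(key, None)
    return settings
```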
2409 2410
2410 2411 @jsonrpc_method()
2411 2412 def set_repo_settings(request, apiuser, repoid, settings):
2412 2413 """
2413 2414 Update repository settings. Returns true on success.
2414 2415
2415 2416 :param apiuser: This is filled automatically from the |authtoken|.
2416 2417 :type apiuser: AuthUser
2417 2418 :param repoid: The repository name or repository id.
2418 2419 :type repoid: str or int
2419 2420 :param settings: The new settings for the repository.
2420 2421 :type settings: dict
2421 2422
2422 2423 Example output:
2423 2424
2424 2425 .. code-block:: bash
2425 2426
2426 2427 {
2427 2428 "error": null,
2428 2429 "id": 237,
2429 2430 "result": true
2430 2431 }
2431 2432 """
2432 2433 # Restrict access to this api method to super-admins, and repo admins only.
2433 2434 repo = get_repo_or_error(repoid)
2434 2435 if not has_superadmin_permission(apiuser):
2435 2436 _perms = ('repository.admin',)
2436 2437 validate_repo_permissions(apiuser, repoid, repo, _perms)
2437 2438
2438 2439 if type(settings) is not dict:
2439 2440 raise JSONRPCError('Settings have to be a JSON Object.')
2440 2441
2441 2442 try:
2442 2443 settings_model = VcsSettingsModel(repo=repoid)
2443 2444
2444 2445 # Merge global, repo and incoming settings.
2445 2446 new_settings = settings_model.get_global_settings()
2446 2447 new_settings.update(settings_model.get_repo_settings())
2447 2448 new_settings.update(settings)
2448 2449
2449 2450 # Update the settings.
2450 2451 inherit_global_settings = new_settings.get(
2451 2452 'inherit_global_settings', False)
2452 2453 settings_model.create_or_update_repo_settings(
2453 2454 new_settings, inherit_global_settings=inherit_global_settings)
2454 2455 Session().commit()
2455 2456 except Exception:
2456 2457 msg = 'Failed to update settings for repository `{}`'.format(repoid)
2457 2458 log.exception(msg)
2458 2459 raise JSONRPCError(msg)
2459 2460
2460 2461 # Indicate success.
2461 2462 return True
2462 2463
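``set_repo_settings`` merges three layers before persisting: global settings, then the repo's current overrides, then the incoming dict. A partial payload therefore only changes the keys it names. A sketch of that three-way merge (function name is illustrative; the type check mirrors the JSON-object guard above):

```python
# Sketch of the three-way merge in set_repo_settings: incoming keys
# win, untouched keys keep their repo-level or global values.
def merge_settings(global_settings, repo_settings, incoming):
    if not isinstance(incoming, dict):
        raise TypeError('Settings have to be a JSON Object.')
    merged = dict(global_settings)
    merged.update(repo_settings)
    merged.update(incoming)
    return merged
```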
2463 2464
2464 2465 @jsonrpc_method()
2465 2466 def maintenance(request, apiuser, repoid):
2466 2467 """
2467 2468 Triggers a maintenance on the given repository.
2468 2469
2469 2470 This command can only be run using an |authtoken| with admin
2470 2471 rights to the specified repository. For more information,
2471 2472 see :ref:`config-token-ref`.
2472 2473
2473 2474 This command takes the following options:
2474 2475
2475 2476 :param apiuser: This is filled automatically from the |authtoken|.
2476 2477 :type apiuser: AuthUser
2477 2478 :param repoid: The repository name or repository ID.
2478 2479 :type repoid: str or int
2479 2480
2480 2481 Example output:
2481 2482
2482 2483 .. code-block:: bash
2483 2484
2484 2485 id : <id_given_in_input>
2485 2486 result : {
2486 2487 "msg": "executed maintenance command",
2487 2488 "executed_actions": [
2488 2489 <action_message>, <action_message2>...
2489 2490 ],
2490 2491 "repository": "<repository name>"
2491 2492 }
2492 2493 error : null
2493 2494
2494 2495 Example error output:
2495 2496
2496 2497 .. code-block:: bash
2497 2498
2498 2499 id : <id_given_in_input>
2499 2500 result : null
2500 2501 error : {
2501 2502 "Unable to execute maintenance on `<reponame>`"
2502 2503 }
2503 2504
2504 2505 """
2505 2506
2506 2507 repo = get_repo_or_error(repoid)
2507 2508 if not has_superadmin_permission(apiuser):
2508 2509 _perms = ('repository.admin',)
2509 2510 validate_repo_permissions(apiuser, repoid, repo, _perms)
2510 2511
2511 2512 try:
2512 2513 maintenance = repo_maintenance.RepoMaintenance()
2513 2514 executed_actions = maintenance.execute(repo)
2514 2515
2515 2516 return {
2516 2517 'msg': 'executed maintenance command',
2517 2518 'executed_actions': executed_actions,
2518 2519 'repository': repo.repo_name
2519 2520 }
2520 2521 except Exception:
2521 2522 log.exception("Exception occurred while trying to run maintenance")
2522 2523 raise JSONRPCError(
2523 2524 'Unable to execute maintenance on `%s`' % repo.repo_name)
@@ -1,1414 +1,1418 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import datetime
23 23 import formencode
24 24 import formencode.htmlfill
25 25
26 26 from pyramid.httpexceptions import HTTPFound
27 27 from pyramid.view import view_config
28 28 from pyramid.renderers import render
29 29 from pyramid.response import Response
30 30
31 31 from rhodecode import events
32 32 from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView
33 33 from rhodecode.apps.ssh_support import SshKeyFileChangeEvent
34 34 from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin
35 35 from rhodecode.authentication.plugins import auth_rhodecode
36 36 from rhodecode.events import trigger
37 37 from rhodecode.model.db import true, UserNotice
38 38
39 from rhodecode.lib import audit_logger, rc_cache
39 from rhodecode.lib import audit_logger, rc_cache, auth
40 40 from rhodecode.lib.exceptions import (
41 41 UserCreationError, UserOwnsReposException, UserOwnsRepoGroupsException,
42 42 UserOwnsUserGroupsException, UserOwnsPullRequestsException,
43 43 UserOwnsArtifactsException, DefaultUserException)
44 44 from rhodecode.lib.ext_json import json
45 45 from rhodecode.lib.auth import (
46 46 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
47 47 from rhodecode.lib import helpers as h
48 48 from rhodecode.lib.helpers import SqlPage
49 49 from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict
50 50 from rhodecode.model.auth_token import AuthTokenModel
51 51 from rhodecode.model.forms import (
52 52 UserForm, UserIndividualPermissionsForm, UserPermissionsForm,
53 53 UserExtraEmailForm, UserExtraIpForm)
54 54 from rhodecode.model.permission import PermissionModel
55 55 from rhodecode.model.repo_group import RepoGroupModel
56 56 from rhodecode.model.ssh_key import SshKeyModel
57 57 from rhodecode.model.user import UserModel
58 58 from rhodecode.model.user_group import UserGroupModel
59 59 from rhodecode.model.db import (
60 60 or_, coalesce, IntegrityError, User, UserGroup, UserIpMap, UserEmailMap,
61 61 UserApiKeys, UserSshKeys, RepoGroup)
62 62 from rhodecode.model.meta import Session
63 63
64 64 log = logging.getLogger(__name__)
65 65
66 66
class AdminUsersView(BaseAppView, DataGridAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='users', request_method='GET',
        renderer='rhodecode:templates/admin/users/users.mako')
    def users_list(self):
        c = self.load_default_context()
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        # renderer defined below
        route_name='users_data', request_method='GET',
        renderer='json_ext', xhr=True)
    def users_list_data(self):
        self.load_default_context()
        column_map = {
            'first_name': 'name',
            'last_name': 'lastname',
        }
        draw, start, limit = self._extract_chunk(self.request)
        search_q, order_by, order_dir = self._extract_ordering(
            self.request, column_map=column_map)
        _render = self.request.get_partial_renderer(
            'rhodecode:templates/data_table/_dt_elements.mako')

        def user_actions(user_id, username):
            return _render("user_actions", user_id, username)

        users_data_total_count = User.query()\
            .filter(User.username != User.DEFAULT_USER) \
            .count()

        users_data_total_inactive_count = User.query()\
            .filter(User.username != User.DEFAULT_USER) \
            .filter(User.active != true())\
            .count()

        # json generate
        base_q = User.query().filter(User.username != User.DEFAULT_USER)
        base_inactive_q = base_q.filter(User.active != true())

        if search_q:
            like_expression = u'%{}%'.format(safe_unicode(search_q))
            base_q = base_q.filter(or_(
                User.username.ilike(like_expression),
                User._email.ilike(like_expression),
                User.name.ilike(like_expression),
                User.lastname.ilike(like_expression),
            ))
            base_inactive_q = base_q.filter(User.active != true())

        users_data_total_filtered_count = base_q.count()
        users_data_total_filtered_inactive_count = base_inactive_q.count()

        sort_col = getattr(User, order_by, None)
        if sort_col:
            if order_dir == 'asc':
                # handle null values properly to order by NULL last
                if order_by in ['last_activity']:
                    sort_col = coalesce(sort_col, datetime.date.max)
                sort_col = sort_col.asc()
            else:
                # handle null values properly to order by NULL last
                if order_by in ['last_activity']:
                    sort_col = coalesce(sort_col, datetime.date.min)
                sort_col = sort_col.desc()

        base_q = base_q.order_by(sort_col)
        base_q = base_q.offset(start).limit(limit)

        users_list = base_q.all()

        users_data = []
        for user in users_list:
            users_data.append({
                "username": h.gravatar_with_user(self.request, user.username),
                "email": user.email,
                "first_name": user.first_name,
                "last_name": user.last_name,
                "last_login": h.format_date(user.last_login),
                "last_activity": h.format_date(user.last_activity),
                "active": h.bool2icon(user.active),
                "active_raw": user.active,
                "admin": h.bool2icon(user.admin),
                "extern_type": user.extern_type,
                "extern_name": user.extern_name,
                "action": user_actions(user.user_id, user.username),
            })
        data = ({
            'draw': draw,
            'data': users_data,
            'recordsTotal': users_data_total_count,
            'recordsFiltered': users_data_total_filtered_count,
            'recordsTotalInactive': users_data_total_inactive_count,
            'recordsFilteredInactive': users_data_total_filtered_inactive_count
        })

        return data

    def _set_personal_repo_group_template_vars(self, c_obj):
        DummyUser = AttributeDict({
            'username': '${username}',
            'user_id': '${user_id}',
        })
        c_obj.default_create_repo_group = RepoGroupModel() \
            .get_default_create_personal_repo_group()
        c_obj.personal_repo_group_name = RepoGroupModel() \
            .get_personal_group_name(DummyUser)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='users_new', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_add.mako')
    def users_new(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
        self._set_personal_repo_group_template_vars(c)
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='users_create', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_add.mako')
    def users_create(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
        user_model = UserModel()
        user_form = UserForm(self.request.translate)()
        try:
            form_result = user_form.to_python(dict(self.request.POST))
            user = user_model.create(form_result)
            Session().flush()
            creation_data = user.get_api_data()
            username = form_result['username']

            audit_logger.store_web(
                'user.create', action_data={'data': creation_data},
                user=c.rhodecode_user)

            user_link = h.link_to(
                h.escape(username),
                h.route_path('user_edit', user_id=user.user_id))
            h.flash(h.literal(_('Created user %(user_link)s')
                              % {'user_link': user_link}), category='success')
            Session().commit()
        except formencode.Invalid as errors:
            self._set_personal_repo_group_template_vars(c)
            data = render(
                'rhodecode:templates/admin/users/user_add.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception during creation of user")
            h.flash(_('Error occurred during creation of user %s')
                    % self.request.POST.get('username'), category='error')
        raise HTTPFound(h.route_path('users'))


class UsersView(UserAppView):
    """
    This view has an alternative version inside EE; if modified, please take a
    look in there as well.
    """
    ALLOW_SCOPED_TOKENS = False

    def get_auth_plugins(self):
        valid_plugins = []
        authn_registry = get_authn_registry(self.request.registry)
        for plugin in authn_registry.get_plugins_for_authentication():
            if isinstance(plugin, RhodeCodeExternalAuthPlugin):
                valid_plugins.append(plugin)
            elif plugin.name == 'rhodecode':
                valid_plugins.append(plugin)

        # extend our choices if user has set a bound plugin which isn't
        # enabled at the moment
        extern_type = self.db_user.extern_type
        if extern_type not in [x.uid for x in valid_plugins]:
            try:
                plugin = authn_registry.get_plugin_by_uid(extern_type)
                if plugin:
                    valid_plugins.append(plugin)

            except Exception:
                log.exception(
                    'Could not extend user plugins with `{}`'.format(extern_type))
        return valid_plugins

    def load_default_context(self):
        req = self.request

        c = self._get_local_tmpl_context()
        c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS
        c.allowed_languages = [
            ('en', 'English (en)'),
            ('de', 'German (de)'),
            ('fr', 'French (fr)'),
            ('it', 'Italian (it)'),
            ('ja', 'Japanese (ja)'),
            ('pl', 'Polish (pl)'),
            ('pt', 'Portuguese (pt)'),
            ('ru', 'Russian (ru)'),
            ('zh', 'Chinese (zh)'),
        ]

        c.allowed_extern_types = [
            (x.uid, x.get_display_name()) for x in self.get_auth_plugins()
        ]
        perms = req.registry.settings.get('available_permissions')
        if not perms:
            # inject info about available permissions
            auth.set_available_permissions(req.registry.settings)

        c.available_permissions = req.registry.settings['available_permissions']
        PermissionModel().set_global_permission_choices(
            c, gettext_translator=req.translate)

        return c

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_update', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
        available_languages = [x[0] for x in c.allowed_languages]
        _form = UserForm(self.request.translate, edit=True,
                         available_languages=available_languages,
                         old_data={'user_id': user_id,
                                   'email': c.user.email})()
        form_result = {}
        old_values = c.user.get_api_data()
        try:
            form_result = _form.to_python(dict(self.request.POST))
            skip_attrs = ['extern_name']
            # TODO: plugin should define if username can be updated
            if c.extern_type != "rhodecode":
                # forbid updating username for external accounts
                skip_attrs.append('username')

            UserModel().update_user(
                user_id, skip_attrs=skip_attrs, **form_result)

            audit_logger.store_web(
                'user.edit', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            h.flash(_('User updated successfully'), category='success')
        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception updating user")
            h.flash(_('Error occurred during update of user %s')
                    % form_result.get('username'), category='error')
        raise HTTPFound(h.route_path('user_edit', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_delete', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_delete(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        _repos = c.user.repositories
        _repo_groups = c.user.repository_groups
        _user_groups = c.user.user_groups
        _pull_requests = c.user.user_pull_requests
        _artifacts = c.user.artifacts

        handle_repos = None
        handle_repo_groups = None
        handle_user_groups = None
        handle_pull_requests = None
        handle_artifacts = None

        # flash message helpers; which message fires depends on the handled
        # case: detach or delete
        def set_handle_flash_repos():
            handle = handle_repos
            if handle == 'detach':
                h.flash(_('Detached %s repositories') % len(_repos),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repositories') % len(_repos),
                        category='success')

        def set_handle_flash_repo_groups():
            handle = handle_repo_groups
            if handle == 'detach':
                h.flash(_('Detached %s repository groups') % len(_repo_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repository groups') % len(_repo_groups),
                        category='success')

        def set_handle_flash_user_groups():
            handle = handle_user_groups
            if handle == 'detach':
                h.flash(_('Detached %s user groups') % len(_user_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s user groups') % len(_user_groups),
                        category='success')

        def set_handle_flash_pull_requests():
            handle = handle_pull_requests
            if handle == 'detach':
                h.flash(_('Detached %s pull requests') % len(_pull_requests),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s pull requests') % len(_pull_requests),
                        category='success')

        def set_handle_flash_artifacts():
            handle = handle_artifacts
            if handle == 'detach':
                h.flash(_('Detached %s artifacts') % len(_artifacts),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s artifacts') % len(_artifacts),
                        category='success')

        handle_user = User.get_first_super_admin()
        handle_user_id = safe_int(self.request.POST.get('detach_user_id'))
        if handle_user_id:
            # NOTE(marcink): we get new owner for objects...
            handle_user = User.get_or_404(handle_user_id)

        if _repos and self.request.POST.get('user_repos'):
            handle_repos = self.request.POST['user_repos']

        if _repo_groups and self.request.POST.get('user_repo_groups'):
            handle_repo_groups = self.request.POST['user_repo_groups']

        if _user_groups and self.request.POST.get('user_user_groups'):
            handle_user_groups = self.request.POST['user_user_groups']

        if _pull_requests and self.request.POST.get('user_pull_requests'):
            handle_pull_requests = self.request.POST['user_pull_requests']

        if _artifacts and self.request.POST.get('user_artifacts'):
            handle_artifacts = self.request.POST['user_artifacts']

        old_values = c.user.get_api_data()

        try:
            UserModel().delete(
                c.user,
                handle_repos=handle_repos,
                handle_repo_groups=handle_repo_groups,
                handle_user_groups=handle_user_groups,
                handle_pull_requests=handle_pull_requests,
                handle_artifacts=handle_artifacts,
                handle_new_owner=handle_user
            )

            audit_logger.store_web(
                'user.delete', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            set_handle_flash_repos()
            set_handle_flash_repo_groups()
            set_handle_flash_user_groups()
            set_handle_flash_pull_requests()
            set_handle_flash_artifacts()
            username = h.escape(old_values['username'])
            h.flash(_('Successfully deleted user `{}`').format(username), category='success')
        except (UserOwnsReposException, UserOwnsRepoGroupsException,
                UserOwnsUserGroupsException, UserOwnsPullRequestsException,
                UserOwnsArtifactsException, DefaultUserException) as e:
            h.flash(e, category='warning')
        except Exception:
            log.exception("Exception during deletion of user")
            h.flash(_('An error occurred during deletion of user'),
                    category='error')
        raise HTTPFound(h.route_path('users'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)

        defaults = c.user.get_dict()
        defaults.update({'language': c.user.user_data.get('language')})

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit_advanced', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_advanced(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.detach_user = User.get_first_super_admin()
        detach_user_id = safe_int(self.request.GET.get('detach_user_id'))
        if detach_user_id:
            c.detach_user = User.get_or_404(detach_user_id)

        c.active = 'advanced'
        c.personal_repo_group = RepoGroup.get_user_personal_repo_group(user_id)
        c.personal_repo_group_name = RepoGroupModel()\
            .get_personal_group_name(c.user)

        c.user_to_review_rules = sorted(
            (x.user for x in c.user.user_review_rules),
            key=lambda u: u.username.lower())

        defaults = c.user.get_dict()

        # Interim workaround if the user participated in any pull requests as
        # a reviewer.
        has_review = len(c.user.reviewer_pull_requests)
        c.can_delete_user = not has_review
        c.can_delete_user_message = ''
        inactive_link = h.link_to(
            'inactive', h.route_path('user_edit', user_id=user_id, _anchor='active'))
        if has_review == 1:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull request and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))
        elif has_review:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull requests and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit_global_perms', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_global_perms(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'global_perms'

        c.default_user = User.get_default_user()
        defaults = c.user.get_dict()
        defaults.update(c.default_user.get_default_perms(suffix='_inherited'))
        defaults.update(c.default_user.get_default_perms())
        defaults.update(c.user.get_default_perms())

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_edit_global_perms_update', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_global_perms_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'global_perms'
        try:
            # first stage that verifies the checkbox
            _form = UserIndividualPermissionsForm(self.request.translate)
            form_result = _form.to_python(dict(self.request.POST))
            inherit_perms = form_result['inherit_default_permissions']
            c.user.inherit_default_permissions = inherit_perms
            Session().add(c.user)

            if not inherit_perms:
                # only update the individual ones if we uncheck the flag
                _form = UserPermissionsForm(
                    self.request.translate,
                    [x[0] for x in c.repo_create_choices],
                    [x[0] for x in c.repo_create_on_write_choices],
                    [x[0] for x in c.repo_group_create_choices],
                    [x[0] for x in c.user_group_create_choices],
                    [x[0] for x in c.fork_choices],
                    [x[0] for x in c.inherit_default_permission_choices])()

                form_result = _form.to_python(dict(self.request.POST))
                form_result.update({'perm_user_id': c.user.user_id})

                PermissionModel().update_user_permissions(form_result)

            # TODO(marcink): implement global permissions
            # audit_log.store_web('user.edit.permissions')

            Session().commit()

            h.flash(_('User global permissions updated successfully'),
                    category='success')

        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except Exception:
            log.exception("Exception during permissions saving")
            h.flash(_('An error occurred during permissions saving'),
                    category='error')

        affected_user_ids = [user_id]
        PermissionModel().trigger_permission_flush(affected_user_ids)
        raise HTTPFound(h.route_path('user_edit_global_perms', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_enable_force_password_reset', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_enable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=True)

            msg = _('Force password change enabled for user')
            audit_logger.store_web('user.edit.password_reset.enabled',
                                   user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_disable_force_password_reset', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_disable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=False)

            msg = _('Force password change disabled for user')
            audit_logger.store_web(
                'user.edit.password_reset.disabled',
                user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_notice_dismiss', request_method='POST',
        renderer='json_ext', xhr=True)
    def user_notice_dismiss(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user
        user_notice_id = safe_int(self.request.POST.get('notice_id'))
        notice = UserNotice().query()\
            .filter(UserNotice.user_id == user_id)\
            .filter(UserNotice.user_notice_id == user_notice_id)\
            .scalar()
        read = False
        if notice:
            notice.notice_read = True
            Session().add(notice)
            Session().commit()
            read = True

        return {'notice': user_notice_id, 'read': read}

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_create_personal_repo_group', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_create_personal_repo_group(self):
        """
        Create personal repository group for this user
        """
        from rhodecode.model.repo_group import RepoGroupModel

        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        personal_repo_group = RepoGroup.get_user_personal_repo_group(
            c.user.user_id)
        if personal_repo_group:
            raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

        personal_repo_group_name = RepoGroupModel().get_personal_group_name(c.user)
        named_personal_group = RepoGroup.get_by_group_name(
            personal_repo_group_name)
        try:
            if named_personal_group and named_personal_group.user_id == c.user.user_id:
                # migrate the same named group, and mark it as personal
                named_personal_group.personal = True
                Session().add(named_personal_group)
                Session().commit()
                msg = _('Linked repository group `%s` as personal') % (
                    personal_repo_group_name,)
                h.flash(msg, category='success')
            elif not named_personal_group:
                RepoGroupModel().create_personal_repo_group(c.user)

                msg = _('Created repository group `%s`') % (
                    personal_repo_group_name,)
                h.flash(msg, category='success')
            else:
                msg = _('Repository group `%s` is already taken') % (
                    personal_repo_group_name,)
                h.flash(msg, category='warning')
        except Exception:
            log.exception("Exception during repository group creation")
            msg = _(
                'An error occurred during repository group creation for user')
            h.flash(msg, category='error')
            Session().rollback()

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_auth_tokens', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def auth_tokens(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'auth_tokens'

        c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
        c.role_values = [
            (x, AuthTokenModel.cls._get_role_name(x))
            for x in AuthTokenModel.cls.ROLES]
        c.role_options = [(c.role_values, _("Role"))]
        c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
            c.user.user_id, show_expired=True)
        c.role_vcs = AuthTokenModel.cls.ROLE_VCS
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_auth_tokens_view', request_method='POST',
        renderer='json_ext', xhr=True)
    def auth_tokens_view(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        auth_token_id = self.request.POST.get('auth_token_id')

        if auth_token_id:
            token = UserApiKeys.get_or_404(auth_token_id)

            return {
                'auth_token': token.api_key
            }

    def maybe_attach_token_scope(self, token):
        # implemented in EE edition
        pass

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_auth_tokens_add', request_method='POST')
    def auth_tokens_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()
        lifetime = safe_int(self.request.POST.get('lifetime'), -1)
        description = self.request.POST.get('description')
        role = self.request.POST.get('role')

        token = UserModel().add_auth_token(
            user=c.user.user_id,
            lifetime_minutes=lifetime, role=role, description=description,
            scope_callback=self.maybe_attach_token_scope)
        token_data = token.get_api_data()

        audit_logger.store_web(
            'user.edit.token.add', action_data={
                'data': {'token': token_data, 'user': user_data}},
            user=self._rhodecode_user)
        Session().commit()

        h.flash(_("Auth token successfully created"), category='success')
        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_auth_tokens_delete', request_method='POST')
    def auth_tokens_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()

        del_auth_token = self.request.POST.get('del_auth_token')

        if del_auth_token:
            token = UserApiKeys.get_or_404(del_auth_token)
            token_data = token.get_api_data()

            AuthTokenModel().delete(del_auth_token, c.user.user_id)
            audit_logger.store_web(
                'user.edit.token.delete', action_data={
                    'data': {'token': token_data, 'user': user_data}},
                user=self._rhodecode_user)
            Session().commit()
            h.flash(_("Auth token successfully deleted"), category='success')

        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

925 929 @LoginRequired()
926 930 @HasPermissionAllDecorator('hg.admin')
927 931 @view_config(
928 932 route_name='edit_user_ssh_keys', request_method='GET',
929 933 renderer='rhodecode:templates/admin/users/user_edit.mako')
930 934 def ssh_keys(self):
931 935 _ = self.request.translate
932 936 c = self.load_default_context()
933 937 c.user = self.db_user
934 938
935 939 c.active = 'ssh_keys'
936 940 c.default_key = self.request.GET.get('default_key')
937 941 c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id)
938 942 return self._get_template_context(c)
939 943
940 944 @LoginRequired()
941 945 @HasPermissionAllDecorator('hg.admin')
942 946 @view_config(
943 947 route_name='edit_user_ssh_keys_generate_keypair', request_method='GET',
944 948 renderer='rhodecode:templates/admin/users/user_edit.mako')
945 949 def ssh_keys_generate_keypair(self):
946 950 _ = self.request.translate
947 951 c = self.load_default_context()
948 952
949 953 c.user = self.db_user
950 954
951 955 c.active = 'ssh_keys_generate'
952 956 comment = 'RhodeCode-SSH {}'.format(c.user.email or '')
953 957 private_format = self.request.GET.get('private_format') \
954 958 or SshKeyModel.DEFAULT_PRIVATE_KEY_FORMAT
955 959 c.private, c.public = SshKeyModel().generate_keypair(
956 960 comment=comment, private_format=private_format)
957 961
958 962 return self._get_template_context(c)
959 963
960 964 @LoginRequired()
961 965 @HasPermissionAllDecorator('hg.admin')
962 966 @CSRFRequired()
963 967 @view_config(
964 968 route_name='edit_user_ssh_keys_add', request_method='POST')
965 969 def ssh_keys_add(self):
966 970 _ = self.request.translate
967 971 c = self.load_default_context()
968 972
969 973 user_id = self.db_user_id
970 974 c.user = self.db_user
971 975
972 976 user_data = c.user.get_api_data()
973 977 key_data = self.request.POST.get('key_data')
974 978 description = self.request.POST.get('description')
975 979
976 980 fingerprint = 'unknown'
977 981 try:
978 982 if not key_data:
979 983 raise ValueError('Please add a valid public key')
980 984
981 985 key = SshKeyModel().parse_key(key_data.strip())
982 986 fingerprint = key.hash_md5()
983 987
984 988 ssh_key = SshKeyModel().create(
985 989 c.user.user_id, fingerprint, key.keydata, description)
986 990 ssh_key_data = ssh_key.get_api_data()
987 991
988 992 audit_logger.store_web(
989 993 'user.edit.ssh_key.add', action_data={
990 994 'data': {'ssh_key': ssh_key_data, 'user': user_data}},
991 995 user=self._rhodecode_user, )
992 996 Session().commit()
993 997
994 998 # Trigger an event on change of keys.
995 999 trigger(SshKeyFileChangeEvent(), self.request.registry)
996 1000
997 1001 h.flash(_("SSH key successfully created"), category='success')
998 1002
999 1003 except IntegrityError:
1000 1004 log.exception("Exception during ssh key saving")
1001 1005 err = 'A key with fingerprint `{}` already exists, ' \
1002 1006 'please use a different one'.format(fingerprint)
1003 1007 h.flash(_('An error occurred during ssh key saving: {}').format(err),
1004 1008 category='error')
1005 1009 except Exception as e:
1006 1010 log.exception("Exception during ssh key saving")
1007 1011 h.flash(_('An error occurred during ssh key saving: {}').format(e),
1008 1012 category='error')
1009 1013
1010 1014 return HTTPFound(
1011 1015 h.route_path('edit_user_ssh_keys', user_id=user_id))
1012 1016
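`key.hash_md5()` in `ssh_keys_add` yields the fingerprint that doubles as the uniqueness key (hence the `IntegrityError` branch). A minimal sketch of an OpenSSH-style MD5 fingerprint over base64-encoded key material — the real `SshKeyModel.parse_key` logic may differ:

```python
import base64
import hashlib


def md5_fingerprint(b64_keydata):
    # Hash the raw (base64-decoded) key material and render it as
    # colon-separated hex pairs, the classic OpenSSH MD5 format.
    digest = hashlib.md5(base64.b64decode(b64_keydata)).hexdigest()
    return ':'.join(digest[i:i + 2] for i in range(0, len(digest), 2))
```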
1013 1017 @LoginRequired()
1014 1018 @HasPermissionAllDecorator('hg.admin')
1015 1019 @CSRFRequired()
1016 1020 @view_config(
1017 1021 route_name='edit_user_ssh_keys_delete', request_method='POST')
1018 1022 def ssh_keys_delete(self):
1019 1023 _ = self.request.translate
1020 1024 c = self.load_default_context()
1021 1025
1022 1026 user_id = self.db_user_id
1023 1027 c.user = self.db_user
1024 1028
1025 1029 user_data = c.user.get_api_data()
1026 1030
1027 1031 del_ssh_key = self.request.POST.get('del_ssh_key')
1028 1032
1029 1033 if del_ssh_key:
1030 1034 ssh_key = UserSshKeys.get_or_404(del_ssh_key)
1031 1035 ssh_key_data = ssh_key.get_api_data()
1032 1036
1033 1037 SshKeyModel().delete(del_ssh_key, c.user.user_id)
1034 1038 audit_logger.store_web(
1035 1039 'user.edit.ssh_key.delete', action_data={
1036 1040 'data': {'ssh_key': ssh_key_data, 'user': user_data}},
1037 1041 user=self._rhodecode_user,)
1038 1042 Session().commit()
1039 1043 # Trigger an event on change of keys.
1040 1044 trigger(SshKeyFileChangeEvent(), self.request.registry)
1041 1045 h.flash(_("SSH key successfully deleted"), category='success')
1042 1046
1043 1047 return HTTPFound(h.route_path('edit_user_ssh_keys', user_id=user_id))
1044 1048
1045 1049 @LoginRequired()
1046 1050 @HasPermissionAllDecorator('hg.admin')
1047 1051 @view_config(
1048 1052 route_name='edit_user_emails', request_method='GET',
1049 1053 renderer='rhodecode:templates/admin/users/user_edit.mako')
1050 1054 def emails(self):
1051 1055 _ = self.request.translate
1052 1056 c = self.load_default_context()
1053 1057 c.user = self.db_user
1054 1058
1055 1059 c.active = 'emails'
1056 1060 c.user_email_map = UserEmailMap.query() \
1057 1061 .filter(UserEmailMap.user == c.user).all()
1058 1062
1059 1063 return self._get_template_context(c)
1060 1064
1061 1065 @LoginRequired()
1062 1066 @HasPermissionAllDecorator('hg.admin')
1063 1067 @CSRFRequired()
1064 1068 @view_config(
1065 1069 route_name='edit_user_emails_add', request_method='POST')
1066 1070 def emails_add(self):
1067 1071 _ = self.request.translate
1068 1072 c = self.load_default_context()
1069 1073
1070 1074 user_id = self.db_user_id
1071 1075 c.user = self.db_user
1072 1076
1073 1077 email = self.request.POST.get('new_email')
1074 1078 user_data = c.user.get_api_data()
1075 1079 try:
1076 1080
1077 1081 form = UserExtraEmailForm(self.request.translate)()
1078 1082 data = form.to_python({'email': email})
1079 1083 email = data['email']
1080 1084
1081 1085 UserModel().add_extra_email(c.user.user_id, email)
1082 1086 audit_logger.store_web(
1083 1087 'user.edit.email.add',
1084 1088 action_data={'email': email, 'user': user_data},
1085 1089 user=self._rhodecode_user)
1086 1090 Session().commit()
1087 1091 h.flash(_("Added new email address `%s` for user account") % email,
1088 1092 category='success')
1089 1093 except formencode.Invalid as error:
1090 1094 h.flash(h.escape(error.error_dict['email']), category='error')
1091 1095 except IntegrityError:
1092 1096 log.warning("Email %s already exists", email)
1093 1097 h.flash(_('Email `{}` is already registered for another user.').format(email),
1094 1098 category='error')
1095 1099 except Exception:
1096 1100 log.exception("Exception during email saving")
1097 1101 h.flash(_('An error occurred during email saving'),
1098 1102 category='error')
1099 1103 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1100 1104
1101 1105 @LoginRequired()
1102 1106 @HasPermissionAllDecorator('hg.admin')
1103 1107 @CSRFRequired()
1104 1108 @view_config(
1105 1109 route_name='edit_user_emails_delete', request_method='POST')
1106 1110 def emails_delete(self):
1107 1111 _ = self.request.translate
1108 1112 c = self.load_default_context()
1109 1113
1110 1114 user_id = self.db_user_id
1111 1115 c.user = self.db_user
1112 1116
1113 1117 email_id = self.request.POST.get('del_email_id')
1114 1118 user_model = UserModel()
1115 1119
1116 1120 email = UserEmailMap.query().get(email_id).email
1117 1121 user_data = c.user.get_api_data()
1118 1122 user_model.delete_extra_email(c.user.user_id, email_id)
1119 1123 audit_logger.store_web(
1120 1124 'user.edit.email.delete',
1121 1125 action_data={'email': email, 'user': user_data},
1122 1126 user=self._rhodecode_user)
1123 1127 Session().commit()
1124 1128 h.flash(_("Removed email address from user account"),
1125 1129 category='success')
1126 1130 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1127 1131
1128 1132 @LoginRequired()
1129 1133 @HasPermissionAllDecorator('hg.admin')
1130 1134 @view_config(
1131 1135 route_name='edit_user_ips', request_method='GET',
1132 1136 renderer='rhodecode:templates/admin/users/user_edit.mako')
1133 1137 def ips(self):
1134 1138 _ = self.request.translate
1135 1139 c = self.load_default_context()
1136 1140 c.user = self.db_user
1137 1141
1138 1142 c.active = 'ips'
1139 1143 c.user_ip_map = UserIpMap.query() \
1140 1144 .filter(UserIpMap.user == c.user).all()
1141 1145
1142 1146 c.inherit_default_ips = c.user.inherit_default_permissions
1143 1147 c.default_user_ip_map = UserIpMap.query() \
1144 1148 .filter(UserIpMap.user == User.get_default_user()).all()
1145 1149
1146 1150 return self._get_template_context(c)
1147 1151
1148 1152 @LoginRequired()
1149 1153 @HasPermissionAllDecorator('hg.admin')
1150 1154 @CSRFRequired()
1151 1155 @view_config(
1152 1156 route_name='edit_user_ips_add', request_method='POST')
1153 1157 # NOTE(marcink): this view is allowed for default users, as we can
1154 1158 # edit their IP white list
1155 1159 def ips_add(self):
1156 1160 _ = self.request.translate
1157 1161 c = self.load_default_context()
1158 1162
1159 1163 user_id = self.db_user_id
1160 1164 c.user = self.db_user
1161 1165
1162 1166 user_model = UserModel()
1163 1167 desc = self.request.POST.get('description')
1164 1168 try:
1165 1169 ip_list = user_model.parse_ip_range(
1166 1170 self.request.POST.get('new_ip'))
1167 1171 except Exception as e:
1168 1172 ip_list = []
1169 1173 log.exception("Exception during ip saving")
1170 1174 h.flash(_('An error occurred during ip saving: %s') % (e,),
1171 1175 category='error')
1172 1176 added = []
1173 1177 user_data = c.user.get_api_data()
1174 1178 for ip in ip_list:
1175 1179 try:
1176 1180 form = UserExtraIpForm(self.request.translate)()
1177 1181 data = form.to_python({'ip': ip})
1178 1182 ip = data['ip']
1179 1183
1180 1184 user_model.add_extra_ip(c.user.user_id, ip, desc)
1181 1185 audit_logger.store_web(
1182 1186 'user.edit.ip.add',
1183 1187 action_data={'ip': ip, 'user': user_data},
1184 1188 user=self._rhodecode_user)
1185 1189 Session().commit()
1186 1190 added.append(ip)
1187 1191 except formencode.Invalid as error:
1188 1192 msg = error.error_dict['ip']
1189 1193 h.flash(msg, category='error')
1190 1194 except Exception:
1191 1195 log.exception("Exception during ip saving")
1192 1196 h.flash(_('An error occurred during ip saving'),
1193 1197 category='error')
1194 1198 if added:
1195 1199 h.flash(
1196 1200 _("Added ips %s to user whitelist") % (', '.join(added), ),
1197 1201 category='success')
1198 1202 if 'default_user' in self.request.POST:
1199 1203 # case for editing global IP list we do it for 'DEFAULT' user
1200 1204 raise HTTPFound(h.route_path('admin_permissions_ips'))
1201 1205 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1202 1206
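`ips_add` first expands the posted value with `user_model.parse_ip_range(...)` and only then validates each entry individually. A sketch of what such an expansion could look like using the stdlib `ipaddress` module — the actual `UserModel.parse_ip_range` may accept different syntaxes:

```python
import ipaddress


def parse_ip_range(ip_range):
    # Accept a single IP, a CIDR network, or a "start-end" range and
    # return a list of string entries to validate one by one.
    ip_range = ip_range.strip()
    if '-' in ip_range:
        start, end = [ipaddress.ip_address(p.strip())
                      for p in ip_range.split('-', 1)]
        return [str(ipaddress.ip_address(i))
                for i in range(int(start), int(end) + 1)]
    # single IPs and CIDR networks pass through unchanged
    return [ip_range]
```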
1203 1207 @LoginRequired()
1204 1208 @HasPermissionAllDecorator('hg.admin')
1205 1209 @CSRFRequired()
1206 1210 @view_config(
1207 1211 route_name='edit_user_ips_delete', request_method='POST')
1208 1212 # NOTE(marcink): this view is allowed for default users, as we can
1209 1213 # edit their IP white list
1210 1214 def ips_delete(self):
1211 1215 _ = self.request.translate
1212 1216 c = self.load_default_context()
1213 1217
1214 1218 user_id = self.db_user_id
1215 1219 c.user = self.db_user
1216 1220
1217 1221 ip_id = self.request.POST.get('del_ip_id')
1218 1222 user_model = UserModel()
1219 1223 user_data = c.user.get_api_data()
1220 1224 ip = UserIpMap.query().get(ip_id).ip_addr
1221 1225 user_model.delete_extra_ip(c.user.user_id, ip_id)
1222 1226 audit_logger.store_web(
1223 1227 'user.edit.ip.delete', action_data={'ip': ip, 'user': user_data},
1224 1228 user=self._rhodecode_user)
1225 1229 Session().commit()
1226 1230 h.flash(_("Removed ip address from user whitelist"), category='success')
1227 1231
1228 1232 if 'default_user' in self.request.POST:
1229 1233 # case for editing global IP list we do it for 'DEFAULT' user
1230 1234 raise HTTPFound(h.route_path('admin_permissions_ips'))
1231 1235 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1232 1236
1233 1237 @LoginRequired()
1234 1238 @HasPermissionAllDecorator('hg.admin')
1235 1239 @view_config(
1236 1240 route_name='edit_user_groups_management', request_method='GET',
1237 1241 renderer='rhodecode:templates/admin/users/user_edit.mako')
1238 1242 def groups_management(self):
1239 1243 c = self.load_default_context()
1240 1244 c.user = self.db_user
1241 1245 c.data = c.user.group_member
1242 1246
1243 1247 groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
1244 1248 for group in c.user.group_member]
1245 1249 c.groups = json.dumps(groups)
1246 1250 c.active = 'groups'
1247 1251
1248 1252 return self._get_template_context(c)
1249 1253
1250 1254 @LoginRequired()
1251 1255 @HasPermissionAllDecorator('hg.admin')
1252 1256 @CSRFRequired()
1253 1257 @view_config(
1254 1258 route_name='edit_user_groups_management_updates', request_method='POST')
1255 1259 def groups_management_updates(self):
1256 1260 _ = self.request.translate
1257 1261 c = self.load_default_context()
1258 1262
1259 1263 user_id = self.db_user_id
1260 1264 c.user = self.db_user
1261 1265
1262 1266 user_groups = set(self.request.POST.getall('users_group_id'))
1263 1267 user_groups_objects = []
1264 1268
1265 1269 for ugid in user_groups:
1266 1270 user_groups_objects.append(
1267 1271 UserGroupModel().get_group(safe_int(ugid)))
1268 1272 user_group_model = UserGroupModel()
1269 1273 added_to_groups, removed_from_groups = \
1270 1274 user_group_model.change_groups(c.user, user_groups_objects)
1271 1275
1272 1276 user_data = c.user.get_api_data()
1273 1277 for user_group_id in added_to_groups:
1274 1278 user_group = UserGroup.get(user_group_id)
1275 1279 old_values = user_group.get_api_data()
1276 1280 audit_logger.store_web(
1277 1281 'user_group.edit.member.add',
1278 1282 action_data={'user': user_data, 'old_data': old_values},
1279 1283 user=self._rhodecode_user)
1280 1284
1281 1285 for user_group_id in removed_from_groups:
1282 1286 user_group = UserGroup.get(user_group_id)
1283 1287 old_values = user_group.get_api_data()
1284 1288 audit_logger.store_web(
1285 1289 'user_group.edit.member.delete',
1286 1290 action_data={'user': user_data, 'old_data': old_values},
1287 1291 user=self._rhodecode_user)
1288 1292
1289 1293 Session().commit()
1290 1294 c.active = 'user_groups_management'
1291 1295 h.flash(_("Groups successfully changed"), category='success')
1292 1296
1293 1297 return HTTPFound(h.route_path(
1294 1298 'edit_user_groups_management', user_id=user_id))
1295 1299
1296 1300 @LoginRequired()
1297 1301 @HasPermissionAllDecorator('hg.admin')
1298 1302 @view_config(
1299 1303 route_name='edit_user_audit_logs', request_method='GET',
1300 1304 renderer='rhodecode:templates/admin/users/user_edit.mako')
1301 1305 def user_audit_logs(self):
1302 1306 _ = self.request.translate
1303 1307 c = self.load_default_context()
1304 1308 c.user = self.db_user
1305 1309
1306 1310 c.active = 'audit'
1307 1311
1308 1312 p = safe_int(self.request.GET.get('page', 1), 1)
1309 1313
1310 1314 filter_term = self.request.GET.get('filter')
1311 1315 user_log = UserModel().get_user_log(c.user, filter_term)
1312 1316
1313 1317 def url_generator(page_num):
1314 1318 query_params = {
1315 1319 'page': page_num
1316 1320 }
1317 1321 if filter_term:
1318 1322 query_params['filter'] = filter_term
1319 1323 return self.request.current_route_path(_query=query_params)
1320 1324
1321 1325 c.audit_logs = SqlPage(
1322 1326 user_log, page=p, items_per_page=10, url_maker=url_generator)
1323 1327 c.filter_term = filter_term
1324 1328 return self._get_template_context(c)
1325 1329
1326 1330 @LoginRequired()
1327 1331 @HasPermissionAllDecorator('hg.admin')
1328 1332 @view_config(
1329 1333 route_name='edit_user_audit_logs_download', request_method='GET',
1330 1334 renderer='string')
1331 1335 def user_audit_logs_download(self):
1332 1336 _ = self.request.translate
1333 1337 c = self.load_default_context()
1334 1338 c.user = self.db_user
1335 1339
1336 1340 user_log = UserModel().get_user_log(c.user, filter_term=None)
1337 1341
1338 1342 audit_log_data = {}
1339 1343 for entry in user_log:
1340 1344 audit_log_data[entry.user_log_id] = entry.get_dict()
1341 1345
1342 1346 response = Response(json.dumps(audit_log_data, indent=4))
1343 1347 response.content_disposition = str(
1344 1348 'attachment; filename=user_{}_audit_logs.json'.format(c.user.user_id))
1345 1349 response.content_type = 'application/json'
1346 1350
1347 1351 return response
1348 1352
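The download view above serializes the log keyed by `user_log_id` and attaches a per-user filename. The same shape, extracted into a framework-free helper (the `user_log_id`/`get_dict()` interface is assumed from the view code above):

```python
import json


def audit_logs_payload(entries, user_id):
    # Serialize audit-log entries keyed by their id and build the
    # attachment headers, mirroring user_audit_logs_download.
    data = {e.user_log_id: e.get_dict() for e in entries}
    body = json.dumps(data, indent=4)
    headers = {
        'Content-Disposition':
            'attachment; filename=user_{}_audit_logs.json'.format(user_id),
        'Content-Type': 'application/json',
    }
    return body, headers
```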
1349 1353 @LoginRequired()
1350 1354 @HasPermissionAllDecorator('hg.admin')
1351 1355 @view_config(
1352 1356 route_name='edit_user_perms_summary', request_method='GET',
1353 1357 renderer='rhodecode:templates/admin/users/user_edit.mako')
1354 1358 def user_perms_summary(self):
1355 1359 _ = self.request.translate
1356 1360 c = self.load_default_context()
1357 1361 c.user = self.db_user
1358 1362
1359 1363 c.active = 'perms_summary'
1360 1364 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1361 1365
1362 1366 return self._get_template_context(c)
1363 1367
1364 1368 @LoginRequired()
1365 1369 @HasPermissionAllDecorator('hg.admin')
1366 1370 @view_config(
1367 1371 route_name='edit_user_perms_summary_json', request_method='GET',
1368 1372 renderer='json_ext')
1369 1373 def user_perms_summary_json(self):
1370 1374 self.load_default_context()
1371 1375 perm_user = self.db_user.AuthUser(ip_addr=self.request.remote_addr)
1372 1376
1373 1377 return perm_user.permissions
1374 1378
1375 1379 @LoginRequired()
1376 1380 @HasPermissionAllDecorator('hg.admin')
1377 1381 @view_config(
1378 1382 route_name='edit_user_caches', request_method='GET',
1379 1383 renderer='rhodecode:templates/admin/users/user_edit.mako')
1380 1384 def user_caches(self):
1381 1385 _ = self.request.translate
1382 1386 c = self.load_default_context()
1383 1387 c.user = self.db_user
1384 1388
1385 1389 c.active = 'caches'
1386 1390 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1387 1391
1388 1392 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1389 1393 c.region = rc_cache.get_or_create_region('cache_perms', cache_namespace_uid)
1390 1394 c.backend = c.region.backend
1391 1395 c.user_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
1392 1396
1393 1397 return self._get_template_context(c)
1394 1398
1395 1399 @LoginRequired()
1396 1400 @HasPermissionAllDecorator('hg.admin')
1397 1401 @CSRFRequired()
1398 1402 @view_config(
1399 1403 route_name='edit_user_caches_update', request_method='POST')
1400 1404 def user_caches_update(self):
1401 1405 _ = self.request.translate
1402 1406 c = self.load_default_context()
1403 1407 c.user = self.db_user
1404 1408
1405 1409 c.active = 'caches'
1406 1410 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1407 1411
1408 1412 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1409 1413 del_keys = rc_cache.clear_cache_namespace('cache_perms', cache_namespace_uid)
1410 1414
1411 1415 h.flash(_("Deleted {} cache keys").format(del_keys), category='success')
1412 1416
1413 1417 return HTTPFound(h.route_path(
1414 1418 'edit_user_caches', user_id=c.user.user_id))
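`user_caches_update` clears every key under the per-user namespace (`cache_user_auth.<user_id>`) and flashes how many entries went away. Against a plain dict-like backend the idea reduces to the following — names here are illustrative, not the real dogpile backend API:

```python
def clear_namespace(store, namespace):
    # Drop every cached entry whose key starts with the namespace
    # prefix and report how many were removed.
    doomed = [key for key in store if key.startswith(namespace)]
    for key in doomed:
        del store[key]
    return len(doomed)
```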
@@ -1,486 +1,486 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import os
22 22 import logging
23 23 import datetime
24 24
25 25 from pyramid.view import view_config
26 26 from pyramid.renderers import render_to_response
27 27 from rhodecode.apps._base import BaseAppView
28 28 from rhodecode.lib.celerylib import run_task, tasks
29 29 from rhodecode.lib.utils2 import AttributeDict
30 30 from rhodecode.model.db import User
31 31 from rhodecode.model.notification import EmailNotificationModel
32 32
33 33 log = logging.getLogger(__name__)
34 34
35 35
36 36 class DebugStyleView(BaseAppView):
37 37
38 38 def load_default_context(self):
39 39 c = self._get_local_tmpl_context()
40 40
41 41 return c
42 42
43 43 @view_config(
44 44 route_name='debug_style_home', request_method='GET',
45 45 renderer=None)
46 46 def index(self):
47 47 c = self.load_default_context()
48 48 c.active = 'index'
49 49
50 50 return render_to_response(
51 51 'debug_style/index.html', self._get_template_context(c),
52 52 request=self.request)
53 53
54 54 @view_config(
55 55 route_name='debug_style_email', request_method='GET',
56 56 renderer=None)
57 57 @view_config(
58 58 route_name='debug_style_email_plain_rendered', request_method='GET',
59 59 renderer=None)
60 60 def render_email(self):
61 61 c = self.load_default_context()
62 62 email_id = self.request.matchdict['email_id']
63 63 c.active = 'emails'
64 64
65 65 pr = AttributeDict(
66 66 pull_request_id=123,
67 67 title='digital_ocean: fix redis, elastic search start on boot, '
68 68 'fix fd limits on supervisor, set postgres 11 version',
69 69 description='''
70 70 Check if we should use full-topic or mini-topic.
71 71
72 72 - full topic produces some problems with merge states etc
73 73 - server-mini-topic probably needs tweaks.
74 74 ''',
75 75 repo_name='foobar',
76 76 source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'),
77 77 target_ref_parts=AttributeDict(type='branch', name='master'),
78 78 )
79 79
80 80 target_repo = AttributeDict(repo_name='repo_group/target_repo')
81 81 source_repo = AttributeDict(repo_name='repo_group/source_repo')
82 82 user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user
83 83 # file/commit changes for PR update
84 84 commit_changes = AttributeDict({
85 85 'added': ['aaaaaaabbbbb', 'cccccccddddddd'],
86 86 'removed': ['eeeeeeeeeee'],
87 87 })
88 88
89 89 file_changes = AttributeDict({
90 90 'added': ['a/file1.md', 'file2.py'],
91 91 'modified': ['b/modified_file.rst'],
92 92 'removed': ['.idea'],
93 93 })
94 94
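The email fixtures above lean heavily on `AttributeDict` from `rhodecode.lib.utils2`: a dict whose keys double as attributes, so `commit_changes.added` and `commit_changes['added']` are interchangeable. A minimal sketch of that behavior (the real helper may differ in how it handles missing keys):

```python
class AttributeDict(dict):
    # Dict subclass whose keys are also readable/writable as attributes.
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self[name] = value
```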
95 95 exc_traceback = {
96 96 'exc_utc_date': '2020-03-26T12:54:50.683281',
97 97 'exc_id': 139638856342656,
98 98 'exc_timestamp': '1585227290.683288',
99 99 'version': 'v1',
100 100 'exc_message': 'Traceback (most recent call last):\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/tweens.py", line 41, in excview_tween\n response = handler(request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/router.py", line 148, in handle_request\n registry, request, context, context_iface, view_name\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/view.py", line 667, in _call_view\n response = view_callable(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 188, in attr_view\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 214, in predicate_wrapper\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 401, in viewresult_to_response\n result = view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 132, in _class_view\n response = getattr(inst, attr)()\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/apps/debug_style/views.py", line 355, in render_email\n template_type, **email_kwargs.get(email_id, {}))\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/model/notification.py", line 402, in render_email\n body = email_template.render(None, **_kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 95, in render\n return self._render_with_exc(tmpl, args, kwargs)\n File 
"/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 79, in _render_with_exc\n return render_func.render(*args, **kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/template.py", line 476, in render\n return runtime._render(self, self.callable_, args, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 883, in _render\n **_kwargs_for_callable(callable_, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 920, in _render_context\n _exec_template(inherit, lclcontext, args=args, kwargs=kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 947, in _exec_template\n callable_(context, *args, **kwargs)\n File "rhodecode_templates_email_templates_base_mako", line 63, in render_body\n File "rhodecode_templates_email_templates_exception_tracker_mako", line 43, in render_body\nAttributeError: \'str\' object has no attribute \'get\'\n',
101 101 'exc_type': 'AttributeError'
102 102 }
103 103
104 104 email_kwargs = {
105 105 'test': {},
106 106
107 107 'message': {
108 108 'body': 'message body !'
109 109 },
110 110
111 111 'email_test': {
112 112 'user': user,
113 113 'date': datetime.datetime.now(),
114 114 },
115 115
116 116 'exception': {
117 117 'email_prefix': '[RHODECODE ERROR]',
118 118 'exc_id': exc_traceback['exc_id'],
119 119 'exc_url': 'http://server-url/{}'.format(exc_traceback['exc_id']),
120 120 'exc_type_name': 'NameError',
121 121 'exc_traceback': exc_traceback,
122 122 },
123 123
124 124 'password_reset': {
125 125 'password_reset_url': 'http://example.com/reset-rhodecode-password/token',
126 126
127 127 'user': user,
128 128 'date': datetime.datetime.now(),
129 129 'email': 'test@rhodecode.com',
130 130 'first_admin_email': User.get_first_super_admin().email
131 131 },
132 132
133 133 'password_reset_confirmation': {
134 134 'new_password': 'new-password-example',
135 135 'user': user,
136 136 'date': datetime.datetime.now(),
137 137 'email': 'test@rhodecode.com',
138 138 'first_admin_email': User.get_first_super_admin().email
139 139 },
140 140
141 141 'registration': {
142 142 'user': user,
143 143 'date': datetime.datetime.now(),
144 144 },
145 145
146 146 'pull_request_comment': {
147 147 'user': user,
148 148
149 149 'status_change': None,
150 150 'status_change_type': None,
151 151
152 152 'pull_request': pr,
153 153 'pull_request_commits': [],
154 154
155 155 'pull_request_target_repo': target_repo,
156 156 'pull_request_target_repo_url': 'http://target-repo/url',
157 157
158 158 'pull_request_source_repo': source_repo,
159 159 'pull_request_source_repo_url': 'http://source-repo/url',
160 160
161 161 'pull_request_url': 'http://localhost/pr1',
162 162 'pr_comment_url': 'http://comment-url',
163 163 'pr_comment_reply_url': 'http://comment-url#reply',
164 164
165 165 'comment_file': None,
166 166 'comment_line': None,
167 167 'comment_type': 'note',
168 168 'comment_body': 'This is my comment body. *I like !*',
169 169 'comment_id': 2048,
170 170 'renderer_type': 'markdown',
171 171 'mention': True,
172 172
173 173 },
174 174
175 175 'pull_request_comment+status': {
176 176 'user': user,
177 177
178 178 'status_change': 'approved',
179 179 'status_change_type': 'approved',
180 180
181 181 'pull_request': pr,
182 182 'pull_request_commits': [],
183 183
184 184 'pull_request_target_repo': target_repo,
185 185 'pull_request_target_repo_url': 'http://target-repo/url',
186 186
187 187 'pull_request_source_repo': source_repo,
188 188 'pull_request_source_repo_url': 'http://source-repo/url',
189 189
190 190 'pull_request_url': 'http://localhost/pr1',
191 191 'pr_comment_url': 'http://comment-url',
192 192 'pr_comment_reply_url': 'http://comment-url#reply',
193 193
194 194 'comment_type': 'todo',
195 195 'comment_file': None,
196 196 'comment_line': None,
197 197 'comment_body': '''
198 198 I think something like this would be better
199 199
200 200 ```py
201 201 // markdown renderer
202 202
203 203 def db():
204 204 global connection
205 205 return connection
206 206
207 207 ```
208 208
209 209 ''',
210 210 'comment_id': 2048,
211 211 'renderer_type': 'markdown',
212 212 'mention': True,
213 213
214 214 },
215 215
216 216 'pull_request_comment+file': {
217 217 'user': user,
218 218
219 219 'status_change': None,
220 220 'status_change_type': None,
221 221
222 222 'pull_request': pr,
223 223 'pull_request_commits': [],
224 224
225 225 'pull_request_target_repo': target_repo,
226 226 'pull_request_target_repo_url': 'http://target-repo/url',
227 227
228 228 'pull_request_source_repo': source_repo,
229 229 'pull_request_source_repo_url': 'http://source-repo/url',
230 230
231 231 'pull_request_url': 'http://localhost/pr1',
232 232
233 233 'pr_comment_url': 'http://comment-url',
234 234 'pr_comment_reply_url': 'http://comment-url#reply',
235 235
236 236 'comment_file': 'rhodecode/model/get_flow_commits',
237 237 'comment_line': 'o1210',
238 238 'comment_type': 'todo',
239 239 'comment_body': '''
240 240 I like this !
241 241
242 242 But please check this code
243 243
244 244 .. code-block:: javascript
245 245
246 246 // THIS IS RST CODE
247 247
248 248 this.createResolutionComment = function(commentId) {
249 249 // hide the trigger text
250 250 $('#resolve-comment-{0}'.format(commentId)).hide();
251 251
252 252 var comment = $('#comment-'+commentId);
253 253 var commentData = comment.data();
254 254 if (commentData.commentInline) {
255 this.createComment(comment, commentId)
255 this.createComment(comment, f_path, line_no, commentId)
256 256 } else {
257 257 Rhodecode.comments.createGeneralComment('general', "$placeholder", commentId)
258 258 }
259 259
260 260 return false;
261 261 };
262 262
263 263 This should work better !
264 264 ''',
265 265 'comment_id': 2048,
266 266 'renderer_type': 'rst',
267 267 'mention': True,
268 268
269 269 },
270 270
271 271 'pull_request_update': {
272 272 'updating_user': user,
273 273
274 274 'status_change': None,
275 275 'status_change_type': None,
276 276
277 277 'pull_request': pr,
278 278 'pull_request_commits': [],
279 279
280 280 'pull_request_target_repo': target_repo,
281 281 'pull_request_target_repo_url': 'http://target-repo/url',
282 282
283 283 'pull_request_source_repo': source_repo,
284 284 'pull_request_source_repo_url': 'http://source-repo/url',
285 285
286 286 'pull_request_url': 'http://localhost/pr1',
287 287
288 288 # update comment links
289 289 'pr_comment_url': 'http://comment-url',
290 290 'pr_comment_reply_url': 'http://comment-url#reply',
291 291 'ancestor_commit_id': 'f39bd443',
292 292 'added_commits': commit_changes.added,
293 293 'removed_commits': commit_changes.removed,
294 294 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed),
295 295 'added_files': file_changes.added,
296 296 'modified_files': file_changes.modified,
297 297 'removed_files': file_changes.removed,
298 298 },
299 299
300 300 'cs_comment': {
301 301 'user': user,
302 302 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'),
303 303 'status_change': None,
304 304 'status_change_type': None,
305 305
306 306 'commit_target_repo_url': 'http://foo.example.com/#comment1',
307 307 'repo_name': 'test-repo',
308 308 'comment_type': 'note',
309 309 'comment_file': None,
310 310 'comment_line': None,
311 311 'commit_comment_url': 'http://comment-url',
312 312 'commit_comment_reply_url': 'http://comment-url#reply',
313 313 'comment_body': 'This is my comment body. *I like !*',
314 314 'comment_id': 2048,
315 315 'renderer_type': 'markdown',
316 316 'mention': True,
317 317 },
318 318
319 319 'cs_comment+status': {
320 320 'user': user,
321 321 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
322 322 'status_change': 'approved',
323 323 'status_change_type': 'approved',
324 324
325 325 'commit_target_repo_url': 'http://foo.example.com/#comment1',
326 326 'repo_name': 'test-repo',
327 327 'comment_type': 'note',
328 328 'comment_file': None,
329 329 'comment_line': None,
330 330 'commit_comment_url': 'http://comment-url',
331 331 'commit_comment_reply_url': 'http://comment-url#reply',
332 332 'comment_body': '''
333 333 Hello **world**
334 334
335 335 This is a multiline comment :)
336 336
337 337 - list
338 338 - list2
339 339 ''',
340 340 'comment_id': 2048,
341 341 'renderer_type': 'markdown',
342 342 'mention': True,
343 343 },
344 344
345 345 'cs_comment+file': {
346 346 'user': user,
347 347 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
348 348 'status_change': None,
349 349 'status_change_type': None,
350 350
351 351 'commit_target_repo_url': 'http://foo.example.com/#comment1',
352 352 'repo_name': 'test-repo',
353 353
354 354 'comment_type': 'note',
355 355 'comment_file': 'test-file.py',
356 356 'comment_line': 'n100',
357 357
358 358 'commit_comment_url': 'http://comment-url',
359 359 'commit_comment_reply_url': 'http://comment-url#reply',
360 360 'comment_body': 'This is my comment body. *I like !*',
361 361 'comment_id': 2048,
362 362 'renderer_type': 'markdown',
363 363 'mention': True,
364 364 },
365 365
366 366 'pull_request': {
367 367 'user': user,
368 368 'pull_request': pr,
369 369 'pull_request_commits': [
370 370 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
371 371 my-account: moved email closer to profile as it's similar data just moved outside.
372 372 '''),
373 373 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
374 374 users: description edit fixes
375 375
376 376 - tests
377 377 - added metatags info
378 378 '''),
379 379 ],
380 380
381 381 'pull_request_target_repo': target_repo,
382 382 'pull_request_target_repo_url': 'http://target-repo/url',
383 383
384 384 'pull_request_source_repo': source_repo,
385 385 'pull_request_source_repo_url': 'http://source-repo/url',
386 386
387 387 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
388 388 'user_role': 'reviewer',
389 389 },
390 390
391 391 'pull_request+reviewer_role': {
392 392 'user': user,
393 393 'pull_request': pr,
394 394 'pull_request_commits': [
395 395 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
396 396 my-account: moved email closer to profile as it's similar data just moved outside.
397 397 '''),
398 398 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
399 399 users: description edit fixes
400 400
401 401 - tests
402 402 - added metatags info
403 403 '''),
404 404 ],
405 405
406 406 'pull_request_target_repo': target_repo,
407 407 'pull_request_target_repo_url': 'http://target-repo/url',
408 408
409 409 'pull_request_source_repo': source_repo,
410 410 'pull_request_source_repo_url': 'http://source-repo/url',
411 411
412 412 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
413 413 'user_role': 'reviewer',
414 414 },
415 415
416 416 'pull_request+observer_role': {
417 417 'user': user,
418 418 'pull_request': pr,
419 419 'pull_request_commits': [
420 420 ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
421 421 my-account: moved email closer to profile as it's similar data just moved outside.
422 422 '''),
423 423 ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
424 424 users: description edit fixes
425 425
426 426 - tests
427 427 - added metatags info
428 428 '''),
429 429 ],
430 430
431 431 'pull_request_target_repo': target_repo,
432 432 'pull_request_target_repo_url': 'http://target-repo/url',
433 433
434 434 'pull_request_source_repo': source_repo,
435 435 'pull_request_source_repo_url': 'http://source-repo/url',
436 436
437 437 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
438 438 'user_role': 'observer'
439 439 }
440 440 }
441 441
442 442 template_type = email_id.split('+')[0]
443 443 (c.subject, c.email_body, c.email_body_plaintext) = EmailNotificationModel().render_email(
444 444 template_type, **email_kwargs.get(email_id, {}))
445 445
446 446 test_email = self.request.GET.get('email')
447 447 if test_email:
448 448 recipients = [test_email]
449 449 run_task(tasks.send_email, recipients, c.subject,
450 450 c.email_body_plaintext, c.email_body)
451 451
452 452 if self.request.matched_route.name == 'debug_style_email_plain_rendered':
453 453 template = 'debug_style/email_plain_rendered.mako'
454 454 else:
455 455 template = 'debug_style/email.mako'
456 456 return render_to_response(
457 457 template, self._get_template_context(c),
458 458 request=self.request)
459 459
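The `email_id.split('+')[0]` line above means variation ids such as `cs_comment+status` only select a sample data set from `email_kwargs`, while the base name before the `+` picks the template to render. A minimal sketch of that convention (the helper name is illustrative, not part of the codebase):

```python
def base_template(email_id):
    """Map a debug-style email id to its base template type.

    'cs_comment+status' -> 'cs_comment'; ids without a '+' pass through.
    """
    return email_id.split('+')[0]

# variations share a template with their base id
assert base_template('cs_comment+status') == 'cs_comment'
assert base_template('pull_request_update') == 'pull_request_update'
```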
460 460 @view_config(
461 461 route_name='debug_style_template', request_method='GET',
462 462 renderer=None)
463 463 def template(self):
464 464 t_path = self.request.matchdict['t_path']
465 465 c = self.load_default_context()
466 466 c.active = os.path.splitext(t_path)[0]
467 467 c.came_from = ''
468 468 # NOTE(marcink): extend the email types with variations based on data sets
469 469 c.email_types = {
470 470 'cs_comment+file': {},
471 471 'cs_comment+status': {},
472 472
473 473 'pull_request_comment+file': {},
474 474 'pull_request_comment+status': {},
475 475
476 476 'pull_request_update': {},
477 477
478 478 'pull_request+reviewer_role': {},
479 479 'pull_request+observer_role': {},
480 480 }
481 481 c.email_types.update(EmailNotificationModel.email_types)
482 482
483 483 return render_to_response(
484 484 'debug_style/' + t_path, self._get_template_context(c),
485 485 request=self.request)
486 486
@@ -1,822 +1,826 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import datetime
23 23 import string
24 24
25 25 import formencode
26 26 import formencode.htmlfill
27 27 import peppercorn
28 28 from pyramid.httpexceptions import HTTPFound, HTTPNotFound
29 29 from pyramid.view import view_config
30 30
31 31 from rhodecode.apps._base import BaseAppView, DataGridAppView
32 32 from rhodecode import forms
33 33 from rhodecode.lib import helpers as h
34 34 from rhodecode.lib import audit_logger
35 35 from rhodecode.lib.ext_json import json
36 36 from rhodecode.lib.auth import (
37 37 LoginRequired, NotAnonymous, CSRFRequired,
38 38 HasRepoPermissionAny, HasRepoGroupPermissionAny, AuthUser)
39 39 from rhodecode.lib.channelstream import (
40 40 channelstream_request, ChannelstreamException)
41 41 from rhodecode.lib.utils2 import safe_int, md5, str2bool
42 42 from rhodecode.model.auth_token import AuthTokenModel
43 43 from rhodecode.model.comment import CommentsModel
44 44 from rhodecode.model.db import (
45 45 IntegrityError, or_, in_filter_generator,
46 46 Repository, UserEmailMap, UserApiKeys, UserFollowing,
47 47 PullRequest, UserBookmark, RepoGroup)
48 48 from rhodecode.model.meta import Session
49 49 from rhodecode.model.pull_request import PullRequestModel
50 50 from rhodecode.model.user import UserModel
51 51 from rhodecode.model.user_group import UserGroupModel
52 52 from rhodecode.model.validation_schema.schemas import user_schema
53 53
54 54 log = logging.getLogger(__name__)
55 55
56 56
57 57 class MyAccountView(BaseAppView, DataGridAppView):
58 58 """
59 59 This view has an alternative version inside EE; if modified, please take
60 60 a look in there as well.
61 61 """
62 62 ALLOW_SCOPED_TOKENS = False
63 63
64 64 def load_default_context(self):
65 65 c = self._get_local_tmpl_context()
66 66 c.user = c.auth_user.get_instance()
67 67 c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS
68 68
69 69 return c
70 70
71 71 @LoginRequired()
72 72 @NotAnonymous()
73 73 @view_config(
74 74 route_name='my_account_profile', request_method='GET',
75 75 renderer='rhodecode:templates/admin/my_account/my_account.mako')
76 76 def my_account_profile(self):
77 77 c = self.load_default_context()
78 78 c.active = 'profile'
79 79 c.extern_type = c.user.extern_type
80 80 return self._get_template_context(c)
81 81
82 82 @LoginRequired()
83 83 @NotAnonymous()
84 84 @view_config(
85 85 route_name='my_account_password', request_method='GET',
86 86 renderer='rhodecode:templates/admin/my_account/my_account.mako')
87 87 def my_account_password(self):
88 88 c = self.load_default_context()
89 89 c.active = 'password'
90 90 c.extern_type = c.user.extern_type
91 91
92 92 schema = user_schema.ChangePasswordSchema().bind(
93 93 username=c.user.username)
94 94
95 95 form = forms.Form(
96 96 schema,
97 97 action=h.route_path('my_account_password_update'),
98 98 buttons=(forms.buttons.save, forms.buttons.reset))
99 99
100 100 c.form = form
101 101 return self._get_template_context(c)
102 102
103 103 @LoginRequired()
104 104 @NotAnonymous()
105 105 @CSRFRequired()
106 106 @view_config(
107 107 route_name='my_account_password_update', request_method='POST',
108 108 renderer='rhodecode:templates/admin/my_account/my_account.mako')
109 109 def my_account_password_update(self):
110 110 _ = self.request.translate
111 111 c = self.load_default_context()
112 112 c.active = 'password'
113 113 c.extern_type = c.user.extern_type
114 114
115 115 schema = user_schema.ChangePasswordSchema().bind(
116 116 username=c.user.username)
117 117
118 118 form = forms.Form(
119 119 schema, buttons=(forms.buttons.save, forms.buttons.reset))
120 120
121 121 if c.extern_type != 'rhodecode':
122 122 raise HTTPFound(self.request.route_path('my_account_password'))
123 123
124 124 controls = self.request.POST.items()
125 125 try:
126 126 valid_data = form.validate(controls)
127 127 UserModel().update_user(c.user.user_id, **valid_data)
128 128 c.user.update_userdata(force_password_change=False)
129 129 Session().commit()
130 130 except forms.ValidationFailure as e:
131 131 c.form = e
132 132 return self._get_template_context(c)
133 133
134 134 except Exception:
135 135 log.exception("Exception updating password")
136 136 h.flash(_('Error occurred during update of user password'),
137 137 category='error')
138 138 else:
139 139 instance = c.auth_user.get_instance()
140 140 self.session.setdefault('rhodecode_user', {}).update(
141 141 {'password': md5(instance.password)})
142 142 self.session.save()
143 143 h.flash(_("Successfully updated password"), category='success')
144 144
145 145 raise HTTPFound(self.request.route_path('my_account_password'))
146 146
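The `md5(instance.password)` session update above refreshes the session's password stamp so the current session survives the change. A hedged sketch of that idea, detached from the session machinery (the function names are hypothetical; the real check lives elsewhere in RhodeCode's auth layer):

```python
import hashlib

def password_stamp(password_hash):
    # store only a digest of the stored password hash in the session
    return hashlib.md5(password_hash.encode('utf-8')).hexdigest()

def session_still_valid(session, current_password_hash):
    # sessions created before a password change no longer match the stamp
    return session.get('password') == password_stamp(current_password_hash)
```

Sessions whose stamp no longer matches the stored hash can then be treated as stale.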
147 147 @LoginRequired()
148 148 @NotAnonymous()
149 149 @view_config(
150 150 route_name='my_account_auth_tokens', request_method='GET',
151 151 renderer='rhodecode:templates/admin/my_account/my_account.mako')
152 152 def my_account_auth_tokens(self):
153 153 _ = self.request.translate
154 154
155 155 c = self.load_default_context()
156 156 c.active = 'auth_tokens'
157 157 c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
158 158 c.role_values = [
159 159 (x, AuthTokenModel.cls._get_role_name(x))
160 160 for x in AuthTokenModel.cls.ROLES]
161 161 c.role_options = [(c.role_values, _("Role"))]
162 162 c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
163 163 c.user.user_id, show_expired=True)
164 164 c.role_vcs = AuthTokenModel.cls.ROLE_VCS
165 165 return self._get_template_context(c)
166 166
167 167 @LoginRequired()
168 168 @NotAnonymous()
169 169 @CSRFRequired()
170 170 @view_config(
171 171 route_name='my_account_auth_tokens_view', request_method='POST', xhr=True,
172 172 renderer='json_ext')
173 173 def my_account_auth_tokens_view(self):
174 174 _ = self.request.translate
175 175 c = self.load_default_context()
176 176
177 177 auth_token_id = self.request.POST.get('auth_token_id')
178 178
179 179 if auth_token_id:
180 180 token = UserApiKeys.get_or_404(auth_token_id)
181 181 if token.user.user_id != c.user.user_id:
182 182 raise HTTPNotFound()
183 183
184 184 return {
185 185 'auth_token': token.api_key
186 186 }
187 187
188 188 def maybe_attach_token_scope(self, token):
189 189 # implemented in the EE edition
190 190 pass
191 191
192 192 @LoginRequired()
193 193 @NotAnonymous()
194 194 @CSRFRequired()
195 195 @view_config(
196 196 route_name='my_account_auth_tokens_add', request_method='POST',)
197 197 def my_account_auth_tokens_add(self):
198 198 _ = self.request.translate
199 199 c = self.load_default_context()
200 200
201 201 lifetime = safe_int(self.request.POST.get('lifetime'), -1)
202 202 description = self.request.POST.get('description')
203 203 role = self.request.POST.get('role')
204 204
205 205 token = UserModel().add_auth_token(
206 206 user=c.user.user_id,
207 207 lifetime_minutes=lifetime, role=role, description=description,
208 208 scope_callback=self.maybe_attach_token_scope)
209 209 token_data = token.get_api_data()
210 210
211 211 audit_logger.store_web(
212 212 'user.edit.token.add', action_data={
213 213 'data': {'token': token_data, 'user': 'self'}},
214 214 user=self._rhodecode_user, )
215 215 Session().commit()
216 216
217 217 h.flash(_("Auth token successfully created"), category='success')
218 218 return HTTPFound(h.route_path('my_account_auth_tokens'))
219 219
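The `safe_int(self.request.POST.get('lifetime'), -1)` call above tolerates missing or malformed form values. A minimal sketch of what such a helper does (assumption: the real `rhodecode.lib.utils2.safe_int` behaves roughly like this):

```python
def safe_int(val, default=None):
    """Coerce val to int, returning default on None or bad input."""
    try:
        return int(val)
    except (TypeError, ValueError):
        return default
```

So an absent `lifetime` field yields `-1` (no expiry) instead of raising.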
220 220 @LoginRequired()
221 221 @NotAnonymous()
222 222 @CSRFRequired()
223 223 @view_config(
224 224 route_name='my_account_auth_tokens_delete', request_method='POST')
225 225 def my_account_auth_tokens_delete(self):
226 226 _ = self.request.translate
227 227 c = self.load_default_context()
228 228
229 229 del_auth_token = self.request.POST.get('del_auth_token')
230 230
231 231 if del_auth_token:
232 232 token = UserApiKeys.get_or_404(del_auth_token)
233 233 token_data = token.get_api_data()
234 234
235 235 AuthTokenModel().delete(del_auth_token, c.user.user_id)
236 236 audit_logger.store_web(
237 237 'user.edit.token.delete', action_data={
238 238 'data': {'token': token_data, 'user': 'self'}},
239 239 user=self._rhodecode_user,)
240 240 Session().commit()
241 241 h.flash(_("Auth token successfully deleted"), category='success')
242 242
243 243 return HTTPFound(h.route_path('my_account_auth_tokens'))
244 244
245 245 @LoginRequired()
246 246 @NotAnonymous()
247 247 @view_config(
248 248 route_name='my_account_emails', request_method='GET',
249 249 renderer='rhodecode:templates/admin/my_account/my_account.mako')
250 250 def my_account_emails(self):
251 251 _ = self.request.translate
252 252
253 253 c = self.load_default_context()
254 254 c.active = 'emails'
255 255
256 256 c.user_email_map = UserEmailMap.query()\
257 257 .filter(UserEmailMap.user == c.user).all()
258 258
259 259 schema = user_schema.AddEmailSchema().bind(
260 260 username=c.user.username, user_emails=c.user.emails)
261 261
262 262 form = forms.RcForm(schema,
263 263 action=h.route_path('my_account_emails_add'),
264 264 buttons=(forms.buttons.save, forms.buttons.reset))
265 265
266 266 c.form = form
267 267 return self._get_template_context(c)
268 268
269 269 @LoginRequired()
270 270 @NotAnonymous()
271 271 @CSRFRequired()
272 272 @view_config(
273 273 route_name='my_account_emails_add', request_method='POST',
274 274 renderer='rhodecode:templates/admin/my_account/my_account.mako')
275 275 def my_account_emails_add(self):
276 276 _ = self.request.translate
277 277 c = self.load_default_context()
278 278 c.active = 'emails'
279 279
280 280 schema = user_schema.AddEmailSchema().bind(
281 281 username=c.user.username, user_emails=c.user.emails)
282 282
283 283 form = forms.RcForm(
284 284 schema, action=h.route_path('my_account_emails_add'),
285 285 buttons=(forms.buttons.save, forms.buttons.reset))
286 286
287 287 controls = self.request.POST.items()
288 288 try:
289 289 valid_data = form.validate(controls)
290 290 UserModel().add_extra_email(c.user.user_id, valid_data['email'])
291 291 audit_logger.store_web(
292 292 'user.edit.email.add', action_data={
293 293 'data': {'email': valid_data['email'], 'user': 'self'}},
294 294 user=self._rhodecode_user,)
295 295 Session().commit()
296 296 except formencode.Invalid as error:
297 297 h.flash(h.escape(error.error_dict['email']), category='error')
298 298 except forms.ValidationFailure as e:
299 299 c.user_email_map = UserEmailMap.query() \
300 300 .filter(UserEmailMap.user == c.user).all()
301 301 c.form = e
302 302 return self._get_template_context(c)
303 303 except Exception:
304 304 log.exception("Exception adding email")
305 305 h.flash(_('Error occurred during adding email'),
306 306 category='error')
307 307 else:
308 308 h.flash(_("Successfully added email"), category='success')
309 309
310 310 raise HTTPFound(self.request.route_path('my_account_emails'))
311 311
312 312 @LoginRequired()
313 313 @NotAnonymous()
314 314 @CSRFRequired()
315 315 @view_config(
316 316 route_name='my_account_emails_delete', request_method='POST')
317 317 def my_account_emails_delete(self):
318 318 _ = self.request.translate
319 319 c = self.load_default_context()
320 320
321 321 del_email_id = self.request.POST.get('del_email_id')
322 322 if del_email_id:
323 323 email = UserEmailMap.get_or_404(del_email_id).email
324 324 UserModel().delete_extra_email(c.user.user_id, del_email_id)
325 325 audit_logger.store_web(
326 326 'user.edit.email.delete', action_data={
327 327 'data': {'email': email, 'user': 'self'}},
328 328 user=self._rhodecode_user,)
329 329 Session().commit()
330 330 h.flash(_("Email successfully deleted"),
331 331 category='success')
332 332 return HTTPFound(h.route_path('my_account_emails'))
333 333
334 334 @LoginRequired()
335 335 @NotAnonymous()
336 336 @CSRFRequired()
337 337 @view_config(
338 338 route_name='my_account_notifications_test_channelstream',
339 339 request_method='POST', renderer='json_ext')
340 340 def my_account_notifications_test_channelstream(self):
341 341 message = 'Test message sent via Channelstream by user: {}, on {}'.format(
342 342 self._rhodecode_user.username, datetime.datetime.now())
343 343 payload = {
344 344 # 'channel': 'broadcast',
345 345 'type': 'message',
346 346 'timestamp': datetime.datetime.utcnow(),
347 347 'user': 'system',
348 348 'pm_users': [self._rhodecode_user.username],
349 349 'message': {
350 350 'message': message,
351 351 'level': 'info',
352 352 'topic': '/notifications'
353 353 }
354 354 }
355 355
356 356 registry = self.request.registry
357 357 rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {})
358 358 channelstream_config = rhodecode_plugins.get('channelstream', {})
359 359
360 360 try:
361 361 channelstream_request(channelstream_config, [payload], '/message')
362 362 except ChannelstreamException as e:
363 363 log.exception('Failed to send channelstream data')
364 364 return {"response": 'ERROR: {}'.format(e.__class__.__name__)}
365 365 return {"response": 'Channelstream data sent. '
366 366 'You should see a new live message now.'}
367 367
368 368 def _load_my_repos_data(self, watched=False):
369 369
370 370 allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(AuthUser.repo_read_perms)
371 371
372 372 if watched:
373 373 # repos user watch
374 374 repo_list = Session().query(
375 375 Repository
376 376 ) \
377 377 .join(
378 378 (UserFollowing, UserFollowing.follows_repo_id == Repository.repo_id)
379 379 ) \
380 380 .filter(
381 381 UserFollowing.user_id == self._rhodecode_user.user_id
382 382 ) \
383 383 .filter(or_(
384 384 # generate multiple IN clauses to work around DB parameter-count limits
385 385 *in_filter_generator(Repository.repo_id, allowed_ids))
386 386 ) \
387 387 .order_by(Repository.repo_name) \
388 388 .all()
389 389
390 390 else:
391 391 # repos user is owner of
392 392 repo_list = Session().query(
393 393 Repository
394 394 ) \
395 395 .filter(
396 396 Repository.user_id == self._rhodecode_user.user_id
397 397 ) \
398 398 .filter(or_(
399 399 # generate multiple IN clauses to work around DB parameter-count limits
400 400 *in_filter_generator(Repository.repo_id, allowed_ids))
401 401 ) \
402 402 .order_by(Repository.repo_name) \
403 403 .all()
404 404
405 405 _render = self.request.get_partial_renderer(
406 406 'rhodecode:templates/data_table/_dt_elements.mako')
407 407
408 408 def repo_lnk(name, rtype, rstate, private, archived, fork_of):
409 409 return _render('repo_name', name, rtype, rstate, private, archived, fork_of,
410 410 short_name=False, admin=False)
411 411
412 412 repos_data = []
413 413 for repo in repo_list:
414 414 row = {
415 415 "name": repo_lnk(repo.repo_name, repo.repo_type, repo.repo_state,
416 416 repo.private, repo.archived, repo.fork),
417 417 "name_raw": repo.repo_name.lower(),
418 418 }
419 419
420 420 repos_data.append(row)
421 421
422 422 # json used to render the grid
423 423 return json.dumps(repos_data)
424 424
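The `in_filter_generator(Repository.repo_id, allowed_ids)` calls above exist because databases cap how many bound parameters a single `IN (...)` clause may carry. A minimal sketch of the chunking idea (assumption: the real helper yields SQLAlchemy `in_()` criteria; here we only show the id chunking, and the `-1` sentinel mirrors the `[-1] + ...` seen above):

```python
def chunked(ids, size=300):
    """Split ids into lists no longer than size; never yield empty SQL."""
    ids = list(ids) or [-1]  # a non-matching sentinel keeps IN () valid
    for i in range(0, len(ids), size):
        yield ids[i:i + size]
```

Each chunk would become one `IN` clause, OR-ed together by `or_(...)`.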
425 425 @LoginRequired()
426 426 @NotAnonymous()
427 427 @view_config(
428 428 route_name='my_account_repos', request_method='GET',
429 429 renderer='rhodecode:templates/admin/my_account/my_account.mako')
430 430 def my_account_repos(self):
431 431 c = self.load_default_context()
432 432 c.active = 'repos'
433 433
434 434 # json used to render the grid
435 435 c.data = self._load_my_repos_data()
436 436 return self._get_template_context(c)
437 437
438 438 @LoginRequired()
439 439 @NotAnonymous()
440 440 @view_config(
441 441 route_name='my_account_watched', request_method='GET',
442 442 renderer='rhodecode:templates/admin/my_account/my_account.mako')
443 443 def my_account_watched(self):
444 444 c = self.load_default_context()
445 445 c.active = 'watched'
446 446
447 447 # json used to render the grid
448 448 c.data = self._load_my_repos_data(watched=True)
449 449 return self._get_template_context(c)
450 450
451 451 @LoginRequired()
452 452 @NotAnonymous()
453 453 @view_config(
454 454 route_name='my_account_bookmarks', request_method='GET',
455 455 renderer='rhodecode:templates/admin/my_account/my_account.mako')
456 456 def my_account_bookmarks(self):
457 457 c = self.load_default_context()
458 458 c.active = 'bookmarks'
459 459 c.bookmark_items = UserBookmark.get_bookmarks_for_user(
460 460 self._rhodecode_db_user.user_id, cache=False)
461 461 return self._get_template_context(c)
462 462
463 463 def _process_bookmark_entry(self, entry, user_id):
464 464 position = safe_int(entry.get('position'))
465 465 cur_position = safe_int(entry.get('cur_position'))
466 466 if position is None:
467 467 return
468 468
469 469 # check if this is an existing entry
470 470 is_new = False
471 471 db_entry = UserBookmark().get_by_position_for_user(cur_position, user_id)
472 472
473 473 if db_entry and str2bool(entry.get('remove')):
474 474 log.debug('Marked bookmark %s for deletion', db_entry)
475 475 Session().delete(db_entry)
476 476 return
477 477
478 478 if not db_entry:
479 479 # new
480 480 db_entry = UserBookmark()
481 481 is_new = True
482 482
483 483 should_save = False
484 484 default_redirect_url = ''
485 485
486 486 # save repo
487 487 if entry.get('bookmark_repo') and safe_int(entry.get('bookmark_repo')):
488 488 repo = Repository.get(entry['bookmark_repo'])
489 489 perm_check = HasRepoPermissionAny(
490 490 'repository.read', 'repository.write', 'repository.admin')
491 491 if repo and perm_check(repo_name=repo.repo_name):
492 492 db_entry.repository = repo
493 493 should_save = True
494 494 default_redirect_url = '${repo_url}'
495 495 # save repo group
496 496 elif entry.get('bookmark_repo_group') and safe_int(entry.get('bookmark_repo_group')):
497 497 repo_group = RepoGroup.get(entry['bookmark_repo_group'])
498 498 perm_check = HasRepoGroupPermissionAny(
499 499 'group.read', 'group.write', 'group.admin')
500 500
501 501 if repo_group and perm_check(group_name=repo_group.group_name):
502 502 db_entry.repository_group = repo_group
503 503 should_save = True
504 504 default_redirect_url = '${repo_group_url}'
505 505 # save generic info
506 506 elif entry.get('title') and entry.get('redirect_url'):
507 507 should_save = True
508 508
509 509 if should_save:
510 510 # mark user and position
511 511 db_entry.user_id = user_id
512 512 db_entry.position = position
513 513 db_entry.title = entry.get('title')
514 514 db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url
515 515 log.debug('Saving bookmark %s, new:%s', db_entry, is_new)
516 516
517 517 Session().add(db_entry)
518 518
519 519 @LoginRequired()
520 520 @NotAnonymous()
521 521 @CSRFRequired()
522 522 @view_config(
523 523 route_name='my_account_bookmarks_update', request_method='POST')
524 524 def my_account_bookmarks_update(self):
525 525 _ = self.request.translate
526 526 c = self.load_default_context()
527 527 c.active = 'bookmarks'
528 528
529 529 controls = peppercorn.parse(self.request.POST.items())
530 530 user_id = c.user.user_id
531 531
532 532 # validate positions
533 533 positions = {}
534 534 for entry in controls.get('bookmarks', []):
535 535 position = safe_int(entry['position'])
536 536 if position is None:
537 537 continue
538 538
539 539 if position in positions:
540 540 h.flash(_("Position {} is defined twice. "
541 541 "Please correct this error.").format(position), category='error')
542 542 return HTTPFound(h.route_path('my_account_bookmarks'))
543 543
544 544 entry['position'] = position
545 545 entry['cur_position'] = safe_int(entry.get('cur_position'))
546 546 positions[position] = entry
547 547
548 548 try:
549 549 for entry in positions.values():
550 550 self._process_bookmark_entry(entry, user_id)
551 551
552 552 Session().commit()
553 553 h.flash(_("Bookmarks updated"), category='success')
554 554 except IntegrityError:
555 555 h.flash(_("Failed to update bookmarks. "
556 556 "Make sure an unique position is used."), category='error')
557 557
558 558 return HTTPFound(h.route_path('my_account_bookmarks'))
559 559
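The position-validation loop above can be summarized as pure logic, detached from the request/flash machinery. A sketch (function name is illustrative only):

```python
def validate_positions(entries):
    """Collect entries by position; report the first duplicate, if any."""
    positions = {}
    for entry in entries:
        pos = entry.get('position')
        if pos is None:
            continue  # entries without a position are skipped, as above
        if pos in positions:
            return positions, pos  # duplicate position found
        positions[pos] = entry
    return positions, None
```

A duplicate aborts the update before any bookmark entry is processed.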
560 560 @LoginRequired()
561 561 @NotAnonymous()
562 562 @view_config(
563 563 route_name='my_account_goto_bookmark', request_method='GET',
564 564 renderer='rhodecode:templates/admin/my_account/my_account.mako')
565 565 def my_account_goto_bookmark(self):
566 566
567 567 bookmark_id = self.request.matchdict['bookmark_id']
568 568 user_bookmark = UserBookmark().query()\
569 569 .filter(UserBookmark.user_id == self.request.user.user_id) \
570 570 .filter(UserBookmark.position == bookmark_id).scalar()
571 571
572 572 redirect_url = h.route_path('my_account_bookmarks')
573 573 if not user_bookmark:
574 574 raise HTTPFound(redirect_url)
575 575
576 576 # repository set
577 577 if user_bookmark.repository:
578 578 repo_name = user_bookmark.repository.repo_name
579 579 base_redirect_url = h.route_path(
580 580 'repo_summary', repo_name=repo_name)
581 581 if user_bookmark.redirect_url and \
582 582 '${repo_url}' in user_bookmark.redirect_url:
583 583 redirect_url = string.Template(user_bookmark.redirect_url)\
584 584 .safe_substitute({'repo_url': base_redirect_url})
585 585 else:
586 586 redirect_url = base_redirect_url
587 587 # repository group set
588 588 elif user_bookmark.repository_group:
589 589 repo_group_name = user_bookmark.repository_group.group_name
590 590 base_redirect_url = h.route_path(
591 591 'repo_group_home', repo_group_name=repo_group_name)
592 592 if user_bookmark.redirect_url and \
593 593 '${repo_group_url}' in user_bookmark.redirect_url:
594 594 redirect_url = string.Template(user_bookmark.redirect_url)\
595 595 .safe_substitute({'repo_group_url': base_redirect_url})
596 596 else:
597 597 redirect_url = base_redirect_url
598 598 # custom URL set
599 599 elif user_bookmark.redirect_url:
600 600 server_url = h.route_url('home').rstrip('/')
601 601 redirect_url = string.Template(user_bookmark.redirect_url) \
602 602 .safe_substitute({'server_url': server_url})
603 603
604 604 log.debug('Redirecting bookmark %s to %s', user_bookmark, redirect_url)
605 605 raise HTTPFound(redirect_url)
606 606
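The redirect logic above relies on `string.Template.safe_substitute`, which, unlike `substitute`, leaves unknown `${...}` placeholders untouched instead of raising `KeyError` (the URLs below are sample values):

```python
from string import Template

# a bookmark template with a known placeholder gets expanded
url = Template('${repo_url}/changelog').safe_substitute(
    {'repo_url': 'http://localhost/test-repo'})

# unknown placeholders survive unchanged rather than raising
partial = Template('${server_url}/${unknown}').safe_substitute(
    {'server_url': 'http://localhost'})
```

That tolerance matters here because users author the redirect templates themselves.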
607 607 @LoginRequired()
608 608 @NotAnonymous()
609 609 @view_config(
610 610 route_name='my_account_perms', request_method='GET',
611 611 renderer='rhodecode:templates/admin/my_account/my_account.mako')
612 612 def my_account_perms(self):
613 613 c = self.load_default_context()
614 614 c.active = 'perms'
615 615
616 616 c.perm_user = c.auth_user
617 617 return self._get_template_context(c)
618 618
619 619 @LoginRequired()
620 620 @NotAnonymous()
621 621 @view_config(
622 622 route_name='my_account_notifications', request_method='GET',
623 623 renderer='rhodecode:templates/admin/my_account/my_account.mako')
624 624 def my_notifications(self):
625 625 c = self.load_default_context()
626 626 c.active = 'notifications'
627 627
628 628 return self._get_template_context(c)
629 629
630 630 @LoginRequired()
631 631 @NotAnonymous()
632 632 @CSRFRequired()
633 633 @view_config(
634 634 route_name='my_account_notifications_toggle_visibility',
635 635 request_method='POST', renderer='json_ext')
636 636 def my_notifications_toggle_visibility(self):
637 637 user = self._rhodecode_db_user
638 638 new_status = not user.user_data.get('notification_status', True)
639 639 user.update_userdata(notification_status=new_status)
640 640 Session().commit()
641 641 return user.user_data['notification_status']
642 642
643 643 @LoginRequired()
644 644 @NotAnonymous()
645 645 @view_config(
646 646 route_name='my_account_edit',
647 647 request_method='GET',
648 648 renderer='rhodecode:templates/admin/my_account/my_account.mako')
649 649 def my_account_edit(self):
650 650 c = self.load_default_context()
651 651 c.active = 'profile_edit'
652 652 c.extern_type = c.user.extern_type
653 653 c.extern_name = c.user.extern_name
654 654
655 655 schema = user_schema.UserProfileSchema().bind(
656 656 username=c.user.username, user_emails=c.user.emails)
657 657 appstruct = {
658 658 'username': c.user.username,
659 659 'email': c.user.email,
660 660 'firstname': c.user.firstname,
661 661 'lastname': c.user.lastname,
662 662 'description': c.user.description,
663 663 }
664 664 c.form = forms.RcForm(
665 665 schema, appstruct=appstruct,
666 666 action=h.route_path('my_account_update'),
667 667 buttons=(forms.buttons.save, forms.buttons.reset))
668 668
669 669 return self._get_template_context(c)
670 670
671 671 @LoginRequired()
672 672 @NotAnonymous()
673 673 @CSRFRequired()
674 674 @view_config(
675 675 route_name='my_account_update',
676 676 request_method='POST',
677 677 renderer='rhodecode:templates/admin/my_account/my_account.mako')
678 678 def my_account_update(self):
679 679 _ = self.request.translate
680 680 c = self.load_default_context()
681 681 c.active = 'profile_edit'
682 682 c.perm_user = c.auth_user
683 683 c.extern_type = c.user.extern_type
684 684 c.extern_name = c.user.extern_name
685 685
686 686 schema = user_schema.UserProfileSchema().bind(
687 687 username=c.user.username, user_emails=c.user.emails)
688 688 form = forms.RcForm(
689 689 schema, buttons=(forms.buttons.save, forms.buttons.reset))
690 690
691 691 controls = self.request.POST.items()
692 692 try:
693 693 valid_data = form.validate(controls)
694 694 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
695 695 'new_password', 'password_confirmation']
696 696 if c.extern_type != "rhodecode":
697 697 # forbid updating username for external accounts
698 698 skip_attrs.append('username')
699 699 old_email = c.user.email
700 700 UserModel().update_user(
701 701 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
702 702 **valid_data)
703 703 if old_email != valid_data['email']:
704 704 old = UserEmailMap.query() \
705 .filter(UserEmailMap.user == c.user).filter(UserEmailMap.email == valid_data['email']).first()
705 .filter(UserEmailMap.user == c.user)\
706 .filter(UserEmailMap.email == valid_data['email'])\
707 .first()
706 708 old.email = old_email
707 709 h.flash(_('Your account was updated successfully'), category='success')
708 710 Session().commit()
709 711 except forms.ValidationFailure as e:
710 712 c.form = e
711 713 return self._get_template_context(c)
712 714 except Exception:
713 715 log.exception("Exception updating user")
714 716 h.flash(_('Error occurred during update of user'),
715 717 category='error')
716 718 raise HTTPFound(h.route_path('my_account_profile'))
717 719
718 720 def _get_pull_requests_list(self, statuses):
719 721 draw, start, limit = self._extract_chunk(self.request)
720 722 search_q, order_by, order_dir = self._extract_ordering(self.request)
723
721 724 _render = self.request.get_partial_renderer(
722 725 'rhodecode:templates/data_table/_dt_elements.mako')
723 726
724 727 pull_requests = PullRequestModel().get_im_participating_in(
725 728 user_id=self._rhodecode_user.user_id,
726 729 statuses=statuses, query=search_q,
727 730 offset=start, length=limit, order_by=order_by,
728 731 order_dir=order_dir)
729 732
730 733 pull_requests_total_count = PullRequestModel().count_im_participating_in(
731 734 user_id=self._rhodecode_user.user_id, statuses=statuses, query=search_q)
732 735
733 736 data = []
734 737 comments_model = CommentsModel()
735 738 for pr in pull_requests:
736 739 repo_id = pr.target_repo_id
737 740 comments_count = comments_model.get_all_comments(
738 repo_id, pull_request=pr, count_only=True)
741 repo_id, pull_request=pr, include_drafts=False, count_only=True)
739 742 owned = pr.user_id == self._rhodecode_user.user_id
740 743
741 744 data.append({
742 745 'target_repo': _render('pullrequest_target_repo',
743 746 pr.target_repo.repo_name),
744 747 'name': _render('pullrequest_name',
745 748 pr.pull_request_id, pr.pull_request_state,
746 749 pr.work_in_progress, pr.target_repo.repo_name,
747 750 short=True),
748 751 'name_raw': pr.pull_request_id,
749 752 'status': _render('pullrequest_status',
750 753 pr.calculated_review_status()),
751 754 'title': _render('pullrequest_title', pr.title, pr.description),
752 755 'description': h.escape(pr.description),
753 756 'updated_on': _render('pullrequest_updated_on',
754 h.datetime_to_time(pr.updated_on)),
757 h.datetime_to_time(pr.updated_on),
758 pr.versions_count),
755 759 'updated_on_raw': h.datetime_to_time(pr.updated_on),
756 760 'created_on': _render('pullrequest_updated_on',
757 761 h.datetime_to_time(pr.created_on)),
758 762 'created_on_raw': h.datetime_to_time(pr.created_on),
759 763 'state': pr.pull_request_state,
760 764 'author': _render('pullrequest_author',
761 765 pr.author.full_contact, ),
762 766 'author_raw': pr.author.full_name,
763 767 'comments': _render('pullrequest_comments', comments_count),
764 768 'comments_raw': comments_count,
765 769 'closed': pr.is_closed(),
766 770 'owned': owned
767 771 })
768 772
769 773 # JSON payload used to render the grid
770 774 data = ({
771 775 'draw': draw,
772 776 'data': data,
773 777 'recordsTotal': pull_requests_total_count,
774 778 'recordsFiltered': pull_requests_total_count,
775 779 })
776 780 return data
777 781
778 782 @LoginRequired()
779 783 @NotAnonymous()
780 784 @view_config(
781 785 route_name='my_account_pullrequests',
782 786 request_method='GET',
783 787 renderer='rhodecode:templates/admin/my_account/my_account.mako')
784 788 def my_account_pullrequests(self):
785 789 c = self.load_default_context()
786 790 c.active = 'pullrequests'
787 791 req_get = self.request.GET
788 792
789 793 c.closed = str2bool(req_get.get('pr_show_closed'))
790 794
791 795 return self._get_template_context(c)
792 796
793 797 @LoginRequired()
794 798 @NotAnonymous()
795 799 @view_config(
796 800 route_name='my_account_pullrequests_data',
797 801 request_method='GET', renderer='json_ext')
798 802 def my_account_pullrequests_data(self):
799 803 self.load_default_context()
800 804 req_get = self.request.GET
801 805 closed = str2bool(req_get.get('closed'))
802 806
803 807 statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN]
804 808 if closed:
805 809 statuses += [PullRequest.STATUS_CLOSED]
806 810
807 811 data = self._get_pull_requests_list(statuses=statuses)
808 812 return data
809 813
810 814 @LoginRequired()
811 815 @NotAnonymous()
812 816 @view_config(
813 817 route_name='my_account_user_group_membership',
814 818 request_method='GET',
815 819 renderer='rhodecode:templates/admin/my_account/my_account.mako')
816 820 def my_account_user_group_membership(self):
817 821 c = self.load_default_context()
818 822 c.active = 'user_group_membership'
819 823 groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
820 824 for group in self._rhodecode_db_user.group_member]
821 825 c.user_groups = json.dumps(groups)
822 826 return self._get_template_context(c)
@@ -1,543 +1,548 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20 from rhodecode.apps._base import add_route_with_slash
21 21
22 22
23 23 def includeme(config):
24 24
25 25 # repo creating checks, special cases that aren't repo routes
26 26 config.add_route(
27 27 name='repo_creating',
28 28 pattern='/{repo_name:.*?[^/]}/repo_creating')
29 29
30 30 config.add_route(
31 31 name='repo_creating_check',
32 32 pattern='/{repo_name:.*?[^/]}/repo_creating_check')
33 33
34 34 # Summary
34 34 # NOTE(marcink): one additional catch-all route is defined at the
35 35 # very bottom
37 37 config.add_route(
38 38 name='repo_summary_explicit',
39 39 pattern='/{repo_name:.*?[^/]}/summary', repo_route=True)
40 40 config.add_route(
41 41 name='repo_summary_commits',
42 42 pattern='/{repo_name:.*?[^/]}/summary-commits', repo_route=True)
43 43
44 44 # Commits
45 45 config.add_route(
46 46 name='repo_commit',
47 47 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}', repo_route=True)
48 48
49 49 config.add_route(
50 50 name='repo_commit_children',
51 51 pattern='/{repo_name:.*?[^/]}/changeset_children/{commit_id}', repo_route=True)
52 52
53 53 config.add_route(
54 54 name='repo_commit_parents',
55 55 pattern='/{repo_name:.*?[^/]}/changeset_parents/{commit_id}', repo_route=True)
56 56
57 57 config.add_route(
58 58 name='repo_commit_raw',
59 59 pattern='/{repo_name:.*?[^/]}/changeset-diff/{commit_id}', repo_route=True)
60 60
61 61 config.add_route(
62 62 name='repo_commit_patch',
63 63 pattern='/{repo_name:.*?[^/]}/changeset-patch/{commit_id}', repo_route=True)
64 64
65 65 config.add_route(
66 66 name='repo_commit_download',
67 67 pattern='/{repo_name:.*?[^/]}/changeset-download/{commit_id}', repo_route=True)
68 68
69 69 config.add_route(
70 70 name='repo_commit_data',
71 71 pattern='/{repo_name:.*?[^/]}/changeset-data/{commit_id}', repo_route=True)
72 72
73 73 config.add_route(
74 74 name='repo_commit_comment_create',
75 75 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/create', repo_route=True)
76 76
77 77 config.add_route(
78 78 name='repo_commit_comment_preview',
79 79 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True)
80 80
81 81 config.add_route(
82 82 name='repo_commit_comment_history_view',
83 83 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_history_id}/history_view', repo_route=True)
84 84
85 85 config.add_route(
86 86 name='repo_commit_comment_attachment_upload',
87 87 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True)
88 88
89 89 config.add_route(
90 90 name='repo_commit_comment_delete',
91 91 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True)
92 92
93 93 config.add_route(
94 94 name='repo_commit_comment_edit',
95 95 pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/edit', repo_route=True)
96 96
97 97 # still-working URL kept for backward compat.
98 98 config.add_route(
99 99 name='repo_commit_raw_deprecated',
100 100 pattern='/{repo_name:.*?[^/]}/raw-changeset/{commit_id}', repo_route=True)
101 101
102 102 # Files
103 103 config.add_route(
104 104 name='repo_archivefile',
105 105 pattern='/{repo_name:.*?[^/]}/archive/{fname:.*}', repo_route=True)
106 106
107 107 config.add_route(
108 108 name='repo_files_diff',
109 109 pattern='/{repo_name:.*?[^/]}/diff/{f_path:.*}', repo_route=True)
110 110 config.add_route( # legacy route to make old links work
111 111 name='repo_files_diff_2way_redirect',
112 112 pattern='/{repo_name:.*?[^/]}/diff-2way/{f_path:.*}', repo_route=True)
113 113
114 114 config.add_route(
115 115 name='repo_files',
116 116 pattern='/{repo_name:.*?[^/]}/files/{commit_id}/{f_path:.*}', repo_route=True)
117 117 config.add_route(
118 118 name='repo_files:default_path',
119 119 pattern='/{repo_name:.*?[^/]}/files/{commit_id}/', repo_route=True)
120 120 config.add_route(
121 121 name='repo_files:default_commit',
122 122 pattern='/{repo_name:.*?[^/]}/files', repo_route=True)
123 123
124 124 config.add_route(
125 125 name='repo_files:rendered',
126 126 pattern='/{repo_name:.*?[^/]}/render/{commit_id}/{f_path:.*}', repo_route=True)
127 127
128 128 config.add_route(
129 129 name='repo_files:annotated',
130 130 pattern='/{repo_name:.*?[^/]}/annotate/{commit_id}/{f_path:.*}', repo_route=True)
131 131 config.add_route(
132 132 name='repo_files:annotated_previous',
133 133 pattern='/{repo_name:.*?[^/]}/annotate-previous/{commit_id}/{f_path:.*}', repo_route=True)
134 134
135 135 config.add_route(
136 136 name='repo_nodetree_full',
137 137 pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/{f_path:.*}', repo_route=True)
138 138 config.add_route(
139 139 name='repo_nodetree_full:default_path',
140 140 pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/', repo_route=True)
141 141
142 142 config.add_route(
143 143 name='repo_files_nodelist',
144 144 pattern='/{repo_name:.*?[^/]}/nodelist/{commit_id}/{f_path:.*}', repo_route=True)
145 145
146 146 config.add_route(
147 147 name='repo_file_raw',
148 148 pattern='/{repo_name:.*?[^/]}/raw/{commit_id}/{f_path:.*}', repo_route=True)
149 149
150 150 config.add_route(
151 151 name='repo_file_download',
152 152 pattern='/{repo_name:.*?[^/]}/download/{commit_id}/{f_path:.*}', repo_route=True)
153 153 config.add_route( # backward compat to keep old links working
154 154 name='repo_file_download:legacy',
155 155 pattern='/{repo_name:.*?[^/]}/rawfile/{commit_id}/{f_path:.*}',
156 156 repo_route=True)
157 157
158 158 config.add_route(
159 159 name='repo_file_history',
160 160 pattern='/{repo_name:.*?[^/]}/history/{commit_id}/{f_path:.*}', repo_route=True)
161 161
162 162 config.add_route(
163 163 name='repo_file_authors',
164 164 pattern='/{repo_name:.*?[^/]}/authors/{commit_id}/{f_path:.*}', repo_route=True)
165 165
166 166 config.add_route(
167 167 name='repo_files_check_head',
168 168 pattern='/{repo_name:.*?[^/]}/check_head/{commit_id}/{f_path:.*}',
169 169 repo_route=True)
170 170 config.add_route(
171 171 name='repo_files_remove_file',
172 172 pattern='/{repo_name:.*?[^/]}/remove_file/{commit_id}/{f_path:.*}',
173 173 repo_route=True)
174 174 config.add_route(
175 175 name='repo_files_delete_file',
176 176 pattern='/{repo_name:.*?[^/]}/delete_file/{commit_id}/{f_path:.*}',
177 177 repo_route=True)
178 178 config.add_route(
179 179 name='repo_files_edit_file',
180 180 pattern='/{repo_name:.*?[^/]}/edit_file/{commit_id}/{f_path:.*}',
181 181 repo_route=True)
182 182 config.add_route(
183 183 name='repo_files_update_file',
184 184 pattern='/{repo_name:.*?[^/]}/update_file/{commit_id}/{f_path:.*}',
185 185 repo_route=True)
186 186 config.add_route(
187 187 name='repo_files_add_file',
188 188 pattern='/{repo_name:.*?[^/]}/add_file/{commit_id}/{f_path:.*}',
189 189 repo_route=True)
190 190 config.add_route(
191 191 name='repo_files_upload_file',
192 192 pattern='/{repo_name:.*?[^/]}/upload_file/{commit_id}/{f_path:.*}',
193 193 repo_route=True)
194 194 config.add_route(
195 195 name='repo_files_create_file',
196 196 pattern='/{repo_name:.*?[^/]}/create_file/{commit_id}/{f_path:.*}',
197 197 repo_route=True)
198 198
199 199 # Refs data
200 200 config.add_route(
201 201 name='repo_refs_data',
202 202 pattern='/{repo_name:.*?[^/]}/refs-data', repo_route=True)
203 203
204 204 config.add_route(
205 205 name='repo_refs_changelog_data',
206 206 pattern='/{repo_name:.*?[^/]}/refs-data-changelog', repo_route=True)
207 207
208 208 config.add_route(
209 209 name='repo_stats',
210 210 pattern='/{repo_name:.*?[^/]}/repo_stats/{commit_id}', repo_route=True)
211 211
212 212 # Commits
213 213 config.add_route(
214 214 name='repo_commits',
215 215 pattern='/{repo_name:.*?[^/]}/commits', repo_route=True)
216 216 config.add_route(
217 217 name='repo_commits_file',
218 218 pattern='/{repo_name:.*?[^/]}/commits/{commit_id}/{f_path:.*}', repo_route=True)
219 219 config.add_route(
220 220 name='repo_commits_elements',
221 221 pattern='/{repo_name:.*?[^/]}/commits_elements', repo_route=True)
222 222 config.add_route(
223 223 name='repo_commits_elements_file',
224 224 pattern='/{repo_name:.*?[^/]}/commits_elements/{commit_id}/{f_path:.*}', repo_route=True)
225 225
226 226 # Changelog (old deprecated name for commits page)
227 227 config.add_route(
228 228 name='repo_changelog',
229 229 pattern='/{repo_name:.*?[^/]}/changelog', repo_route=True)
230 230 config.add_route(
231 231 name='repo_changelog_file',
232 232 pattern='/{repo_name:.*?[^/]}/changelog/{commit_id}/{f_path:.*}', repo_route=True)
233 233
234 234 # Compare
235 235 config.add_route(
236 236 name='repo_compare_select',
237 237 pattern='/{repo_name:.*?[^/]}/compare', repo_route=True)
238 238
239 239 config.add_route(
240 240 name='repo_compare',
241 241 pattern='/{repo_name:.*?[^/]}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', repo_route=True)
242 242
243 243 # Tags
244 244 config.add_route(
245 245 name='tags_home',
246 246 pattern='/{repo_name:.*?[^/]}/tags', repo_route=True)
247 247
248 248 # Branches
249 249 config.add_route(
250 250 name='branches_home',
251 251 pattern='/{repo_name:.*?[^/]}/branches', repo_route=True)
252 252
253 253 # Bookmarks
254 254 config.add_route(
255 255 name='bookmarks_home',
256 256 pattern='/{repo_name:.*?[^/]}/bookmarks', repo_route=True)
257 257
258 258 # Forks
259 259 config.add_route(
260 260 name='repo_fork_new',
261 261 pattern='/{repo_name:.*?[^/]}/fork', repo_route=True,
262 262 repo_forbid_when_archived=True,
263 263 repo_accepted_types=['hg', 'git'])
264 264
265 265 config.add_route(
266 266 name='repo_fork_create',
267 267 pattern='/{repo_name:.*?[^/]}/fork/create', repo_route=True,
268 268 repo_forbid_when_archived=True,
269 269 repo_accepted_types=['hg', 'git'])
270 270
271 271 config.add_route(
272 272 name='repo_forks_show_all',
273 273 pattern='/{repo_name:.*?[^/]}/forks', repo_route=True,
274 274 repo_accepted_types=['hg', 'git'])
275 275 config.add_route(
276 276 name='repo_forks_data',
277 277 pattern='/{repo_name:.*?[^/]}/forks/data', repo_route=True,
278 278 repo_accepted_types=['hg', 'git'])
279 279
280 280 # Pull Requests
281 281 config.add_route(
282 282 name='pullrequest_show',
283 283 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}',
284 284 repo_route=True)
285 285
286 286 config.add_route(
287 287 name='pullrequest_show_all',
288 288 pattern='/{repo_name:.*?[^/]}/pull-request',
289 289 repo_route=True, repo_accepted_types=['hg', 'git'])
290 290
291 291 config.add_route(
292 292 name='pullrequest_show_all_data',
293 293 pattern='/{repo_name:.*?[^/]}/pull-request-data',
294 294 repo_route=True, repo_accepted_types=['hg', 'git'])
295 295
296 296 config.add_route(
297 297 name='pullrequest_repo_refs',
298 298 pattern='/{repo_name:.*?[^/]}/pull-request/refs/{target_repo_name:.*?[^/]}',
299 299 repo_route=True)
300 300
301 301 config.add_route(
302 302 name='pullrequest_repo_targets',
303 303 pattern='/{repo_name:.*?[^/]}/pull-request/repo-targets',
304 304 repo_route=True)
305 305
306 306 config.add_route(
307 307 name='pullrequest_new',
308 308 pattern='/{repo_name:.*?[^/]}/pull-request/new',
309 309 repo_route=True, repo_accepted_types=['hg', 'git'],
310 310 repo_forbid_when_archived=True)
311 311
312 312 config.add_route(
313 313 name='pullrequest_create',
314 314 pattern='/{repo_name:.*?[^/]}/pull-request/create',
315 315 repo_route=True, repo_accepted_types=['hg', 'git'],
316 316 repo_forbid_when_archived=True)
317 317
318 318 config.add_route(
319 319 name='pullrequest_update',
320 320 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/update',
321 321 repo_route=True, repo_forbid_when_archived=True)
322 322
323 323 config.add_route(
324 324 name='pullrequest_merge',
325 325 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/merge',
326 326 repo_route=True, repo_forbid_when_archived=True)
327 327
328 328 config.add_route(
329 329 name='pullrequest_delete',
330 330 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/delete',
331 331 repo_route=True, repo_forbid_when_archived=True)
332 332
333 333 config.add_route(
334 334 name='pullrequest_comment_create',
335 335 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment',
336 336 repo_route=True)
337 337
338 338 config.add_route(
339 339 name='pullrequest_comment_edit',
340 340 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/edit',
341 341 repo_route=True, repo_accepted_types=['hg', 'git'])
342 342
343 343 config.add_route(
344 344 name='pullrequest_comment_delete',
345 345 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete',
346 346 repo_route=True, repo_accepted_types=['hg', 'git'])
347 347
348 348 config.add_route(
349 349 name='pullrequest_comments',
350 350 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comments',
351 351 repo_route=True)
352 352
353 353 config.add_route(
354 354 name='pullrequest_todos',
355 355 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/todos',
356 356 repo_route=True)
357 357
358 config.add_route(
359 name='pullrequest_drafts',
360 pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/drafts',
361 repo_route=True)
362
358 363 # Artifacts, (EE feature)
359 364 config.add_route(
360 365 name='repo_artifacts_list',
361 366 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
362 367
363 368 # Settings
364 369 config.add_route(
365 370 name='edit_repo',
366 371 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
367 372 # update is POST on edit_repo
368 373
369 374 # Settings advanced
370 375 config.add_route(
371 376 name='edit_repo_advanced',
372 377 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
373 378 config.add_route(
374 379 name='edit_repo_advanced_archive',
375 380 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
376 381 config.add_route(
377 382 name='edit_repo_advanced_delete',
378 383 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
379 384 config.add_route(
380 385 name='edit_repo_advanced_locking',
381 386 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
382 387 config.add_route(
383 388 name='edit_repo_advanced_journal',
384 389 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
385 390 config.add_route(
386 391 name='edit_repo_advanced_fork',
387 392 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
388 393
389 394 config.add_route(
390 395 name='edit_repo_advanced_hooks',
391 396 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
392 397
393 398 # Caches
394 399 config.add_route(
395 400 name='edit_repo_caches',
396 401 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
397 402
398 403 # Permissions
399 404 config.add_route(
400 405 name='edit_repo_perms',
401 406 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
402 407
403 408 config.add_route(
404 409 name='edit_repo_perms_set_private',
405 410 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
406 411
407 412 # Permissions Branch (EE feature)
408 413 config.add_route(
409 414 name='edit_repo_perms_branch',
410 415 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
411 416 config.add_route(
412 417 name='edit_repo_perms_branch_delete',
413 418 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
414 419 repo_route=True)
415 420
416 421 # Maintenance
417 422 config.add_route(
418 423 name='edit_repo_maintenance',
419 424 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
420 425
421 426 config.add_route(
422 427 name='edit_repo_maintenance_execute',
423 428 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
424 429
425 430 # Fields
426 431 config.add_route(
427 432 name='edit_repo_fields',
428 433 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
429 434 config.add_route(
430 435 name='edit_repo_fields_create',
431 436 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
432 437 config.add_route(
433 438 name='edit_repo_fields_delete',
434 439 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
435 440
436 441 # Locking
437 442 config.add_route(
438 443 name='repo_edit_toggle_locking',
439 444 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
440 445
441 446 # Remote
442 447 config.add_route(
443 448 name='edit_repo_remote',
444 449 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
445 450 config.add_route(
446 451 name='edit_repo_remote_pull',
447 452 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
448 453 config.add_route(
449 454 name='edit_repo_remote_push',
450 455 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
451 456
452 457 # Statistics
453 458 config.add_route(
454 459 name='edit_repo_statistics',
455 460 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
456 461 config.add_route(
457 462 name='edit_repo_statistics_reset',
458 463 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
459 464
460 465 # Issue trackers
461 466 config.add_route(
462 467 name='edit_repo_issuetracker',
463 468 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers', repo_route=True)
464 469 config.add_route(
465 470 name='edit_repo_issuetracker_test',
466 471 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/test', repo_route=True)
467 472 config.add_route(
468 473 name='edit_repo_issuetracker_delete',
469 474 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/delete', repo_route=True)
470 475 config.add_route(
471 476 name='edit_repo_issuetracker_update',
472 477 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/update', repo_route=True)
473 478
474 479 # VCS Settings
475 480 config.add_route(
476 481 name='edit_repo_vcs',
477 482 pattern='/{repo_name:.*?[^/]}/settings/vcs', repo_route=True)
478 483 config.add_route(
479 484 name='edit_repo_vcs_update',
480 485 pattern='/{repo_name:.*?[^/]}/settings/vcs/update', repo_route=True)
481 486
482 487 # svn pattern
483 488 config.add_route(
484 489 name='edit_repo_vcs_svn_pattern_delete',
485 490 pattern='/{repo_name:.*?[^/]}/settings/vcs/svn_pattern/delete', repo_route=True)
486 491
487 492 # Repo Review Rules (EE feature)
488 493 config.add_route(
489 494 name='repo_reviewers',
490 495 pattern='/{repo_name:.*?[^/]}/settings/review/rules', repo_route=True)
491 496
492 497 config.add_route(
493 498 name='repo_default_reviewers_data',
494 499 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
495 500
496 501 # Repo Automation (EE feature)
497 502 config.add_route(
498 503 name='repo_automation',
499 504 pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
500 505
501 506 # Strip
502 507 config.add_route(
503 508 name='edit_repo_strip',
504 509 pattern='/{repo_name:.*?[^/]}/settings/strip', repo_route=True)
505 510
506 511 config.add_route(
507 512 name='strip_check',
508 513 pattern='/{repo_name:.*?[^/]}/settings/strip_check', repo_route=True)
509 514
510 515 config.add_route(
511 516 name='strip_execute',
512 517 pattern='/{repo_name:.*?[^/]}/settings/strip_execute', repo_route=True)
513 518
514 519 # Audit logs
515 520 config.add_route(
516 521 name='edit_repo_audit_logs',
517 522 pattern='/{repo_name:.*?[^/]}/settings/audit_logs', repo_route=True)
518 523
519 524 # ATOM/RSS Feed, shouldn't contain slashes for outlook compatibility
520 525 config.add_route(
521 526 name='rss_feed_home',
522 527 pattern='/{repo_name:.*?[^/]}/feed-rss', repo_route=True)
523 528
524 529 config.add_route(
525 530 name='atom_feed_home',
526 531 pattern='/{repo_name:.*?[^/]}/feed-atom', repo_route=True)
527 532
528 533 config.add_route(
529 534 name='rss_feed_home_old',
530 535 pattern='/{repo_name:.*?[^/]}/feed/rss', repo_route=True)
531 536
532 537 config.add_route(
533 538 name='atom_feed_home_old',
534 539 pattern='/{repo_name:.*?[^/]}/feed/atom', repo_route=True)
535 540
536 541 # NOTE(marcink): needs to be at the end for catch-all
537 542 add_route_with_slash(
538 543 config,
539 544 name='repo_summary',
540 545 pattern='/{repo_name:.*?[^/]}', repo_route=True)
541 546
542 547 # Scan module for configuration decorators.
543 548 config.scan('.views', ignore='.tests')
@@ -1,1661 +1,1658 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20 import mock
21 21 import pytest
22 22
23 23 import rhodecode
24 24 from rhodecode.lib.vcs.backends.base import MergeResponse, MergeFailureReason
25 25 from rhodecode.lib.vcs.nodes import FileNode
26 26 from rhodecode.lib import helpers as h
27 27 from rhodecode.model.changeset_status import ChangesetStatusModel
28 28 from rhodecode.model.db import (
29 29 PullRequest, ChangesetStatus, UserLog, Notification, ChangesetComment, Repository)
30 30 from rhodecode.model.meta import Session
31 31 from rhodecode.model.pull_request import PullRequestModel
32 32 from rhodecode.model.user import UserModel
33 33 from rhodecode.model.comment import CommentsModel
34 34 from rhodecode.tests import (
35 35 assert_session_flash, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN)
36 36
37 37
38 38 def route_path(name, params=None, **kwargs):
39 39 import urllib
40 40
41 41 base_url = {
42 42 'repo_changelog': '/{repo_name}/changelog',
43 43 'repo_changelog_file': '/{repo_name}/changelog/{commit_id}/{f_path}',
44 44 'repo_commits': '/{repo_name}/commits',
45 45 'repo_commits_file': '/{repo_name}/commits/{commit_id}/{f_path}',
46 46 'pullrequest_show': '/{repo_name}/pull-request/{pull_request_id}',
47 47 'pullrequest_show_all': '/{repo_name}/pull-request',
48 48 'pullrequest_show_all_data': '/{repo_name}/pull-request-data',
49 49 'pullrequest_repo_refs': '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}',
50 50 'pullrequest_repo_targets': '/{repo_name}/pull-request/repo-targets',
51 51 'pullrequest_new': '/{repo_name}/pull-request/new',
52 52 'pullrequest_create': '/{repo_name}/pull-request/create',
53 53 'pullrequest_update': '/{repo_name}/pull-request/{pull_request_id}/update',
54 54 'pullrequest_merge': '/{repo_name}/pull-request/{pull_request_id}/merge',
55 55 'pullrequest_delete': '/{repo_name}/pull-request/{pull_request_id}/delete',
56 56 'pullrequest_comment_create': '/{repo_name}/pull-request/{pull_request_id}/comment',
57 57 'pullrequest_comment_delete': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/delete',
58 58 'pullrequest_comment_edit': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/edit',
59 59 }[name].format(**kwargs)
60 60
61 61 if params:
62 62 base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
63 63 return base_url
64 64
65 65
66 66 @pytest.mark.usefixtures('app', 'autologin_user')
67 67 @pytest.mark.backends("git", "hg")
68 68 class TestPullrequestsView(object):
69 69
70 70 def test_index(self, backend):
71 71 self.app.get(route_path(
72 72 'pullrequest_new',
73 73 repo_name=backend.repo_name))
74 74
75 75 def test_option_menu_create_pull_request_exists(self, backend):
76 76 repo_name = backend.repo_name
77 77 response = self.app.get(h.route_path('repo_summary', repo_name=repo_name))
78 78
79 79 create_pr_link = '<a href="%s">Create Pull Request</a>' % route_path(
80 80 'pullrequest_new', repo_name=repo_name)
81 81 response.mustcontain(create_pr_link)
82 82
83 83 def test_create_pr_form_with_raw_commit_id(self, backend):
84 84 repo = backend.repo
85 85
86 86 self.app.get(
87 87 route_path('pullrequest_new', repo_name=repo.repo_name,
88 88 commit=repo.get_commit().raw_id),
89 89 status=200)
90 90
91 91 @pytest.mark.parametrize('pr_merge_enabled', [True, False])
92 92 @pytest.mark.parametrize('range_diff', ["0", "1"])
93 93 def test_show(self, pr_util, pr_merge_enabled, range_diff):
94 94 pull_request = pr_util.create_pull_request(
95 95 mergeable=pr_merge_enabled, enable_notifications=False)
96 96
97 97 response = self.app.get(route_path(
98 98 'pullrequest_show',
99 99 repo_name=pull_request.target_repo.scm_instance().name,
100 100 pull_request_id=pull_request.pull_request_id,
101 101 params={'range-diff': range_diff}))
102 102
103 103 for commit_id in pull_request.revisions:
104 104 response.mustcontain(commit_id)
105 105
106 106 response.mustcontain(pull_request.target_ref_parts.type)
107 107 response.mustcontain(pull_request.target_ref_parts.name)
108 108
109 109 response.mustcontain('class="pull-request-merge"')
110 110
111 111 if pr_merge_enabled:
112 112 response.mustcontain('Pull request reviewer approval is pending')
113 113 else:
114 114 response.mustcontain('Server-side pull request merging is disabled.')
115 115
116 116 if range_diff == "1":
117 117 response.mustcontain('Turn off: Show the diff as commit range')
118 118
119 119 def test_show_versions_of_pr(self, backend, csrf_token):
120 120 commits = [
121 121 {'message': 'initial-commit',
122 122 'added': [FileNode('test-file.txt', 'LINE1\n')]},
123 123
124 124 {'message': 'commit-1',
125 125 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\n')]},
126 126 # Above is the initial version of the PR, which changes a single line
127 127
128 128 # from now on we'll add 3 commits, each adding another line
129 129 {'message': 'commit-2',
130 130 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\n')]},
131 131
132 132 {'message': 'commit-3',
133 133 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\n')]},
134 134
135 135 {'message': 'commit-4',
136 136 'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\nLINE5\n')]},
137 137 ]
138 138
139 139 commit_ids = backend.create_master_repo(commits)
140 140 target = backend.create_repo(heads=['initial-commit'])
141 141 source = backend.create_repo(heads=['commit-1'])
142 142 source_repo_name = source.repo_name
143 143 target_repo_name = target.repo_name
144 144
145 145 target_ref = 'branch:{branch}:{commit_id}'.format(
146 146 branch=backend.default_branch_name, commit_id=commit_ids['initial-commit'])
147 147 source_ref = 'branch:{branch}:{commit_id}'.format(
148 148 branch=backend.default_branch_name, commit_id=commit_ids['commit-1'])
149 149
150 150 response = self.app.post(
151 151 route_path('pullrequest_create', repo_name=source.repo_name),
152 152 [
153 153 ('source_repo', source_repo_name),
154 154 ('source_ref', source_ref),
155 155 ('target_repo', target_repo_name),
156 156 ('target_ref', target_ref),
157 157 ('common_ancestor', commit_ids['initial-commit']),
158 158 ('pullrequest_title', 'Title'),
159 159 ('pullrequest_desc', 'Description'),
160 160 ('description_renderer', 'markdown'),
161 161 ('__start__', 'review_members:sequence'),
162 162 ('__start__', 'reviewer:mapping'),
163 163 ('user_id', '1'),
164 164 ('__start__', 'reasons:sequence'),
165 165 ('reason', 'Some reason'),
166 166 ('__end__', 'reasons:sequence'),
167 167 ('__start__', 'rules:sequence'),
168 168 ('__end__', 'rules:sequence'),
169 169 ('mandatory', 'False'),
170 170 ('__end__', 'reviewer:mapping'),
171 171 ('__end__', 'review_members:sequence'),
172 172 ('__start__', 'revisions:sequence'),
173 173 ('revisions', commit_ids['commit-1']),
174 174 ('__end__', 'revisions:sequence'),
175 175 ('user', ''),
176 176 ('csrf_token', csrf_token),
177 177 ],
178 178 status=302)
179 179
180 180 location = response.headers['Location']
181 181
182 182 pull_request_id = location.rsplit('/', 1)[1]
183 183 assert pull_request_id != 'new'
184 184 pull_request = PullRequest.get(int(pull_request_id))
185 185
186 186 pull_request_id = pull_request.pull_request_id
187 187
188 188 # Show initial version of PR
189 189 response = self.app.get(
190 190 route_path('pullrequest_show',
191 191 repo_name=target_repo_name,
192 192 pull_request_id=pull_request_id))
193 193
194 194 response.mustcontain('commit-1')
195 195 response.mustcontain(no=['commit-2'])
196 196 response.mustcontain(no=['commit-3'])
197 197 response.mustcontain(no=['commit-4'])
198 198
199 199 response.mustcontain('cb-addition"></span><span>LINE2</span>')
200 200 response.mustcontain(no=['LINE3'])
201 201 response.mustcontain(no=['LINE4'])
202 202 response.mustcontain(no=['LINE5'])
203 203
204 204 # update PR #1
205 205 source_repo = Repository.get_by_repo_name(source_repo_name)
206 206 backend.pull_heads(source_repo, heads=['commit-2'])
207 207 response = self.app.post(
208 208 route_path('pullrequest_update',
209 209 repo_name=target_repo_name, pull_request_id=pull_request_id),
210 210 params={'update_commits': 'true', 'csrf_token': csrf_token})
211 211
212 212 # update PR #2
213 213 source_repo = Repository.get_by_repo_name(source_repo_name)
214 214 backend.pull_heads(source_repo, heads=['commit-3'])
215 215 response = self.app.post(
216 216 route_path('pullrequest_update',
217 217 repo_name=target_repo_name, pull_request_id=pull_request_id),
218 218 params={'update_commits': 'true', 'csrf_token': csrf_token})
219 219
220 220 # update PR #3
221 221 source_repo = Repository.get_by_repo_name(source_repo_name)
222 222 backend.pull_heads(source_repo, heads=['commit-4'])
223 223 response = self.app.post(
224 224 route_path('pullrequest_update',
225 225 repo_name=target_repo_name, pull_request_id=pull_request_id),
226 226 params={'update_commits': 'true', 'csrf_token': csrf_token})
227 227
228 228 # Show final version!
229 229 response = self.app.get(
230 230 route_path('pullrequest_show',
231 231 repo_name=target_repo_name,
232 232 pull_request_id=pull_request_id))
233 233
234 234 # 3 updates, and the latest == 4
235 235 response.mustcontain('4 versions available for this pull request')
236 236 response.mustcontain(no=['rhodecode diff rendering error'])
237 237
238 238 # the final show must have 4 commits, and 4 adds
239 239 response.mustcontain('commit-1')
240 240 response.mustcontain('commit-2')
241 241 response.mustcontain('commit-3')
242 242 response.mustcontain('commit-4')
243 243
244 244 response.mustcontain('cb-addition"></span><span>LINE2</span>')
245 245 response.mustcontain('cb-addition"></span><span>LINE3</span>')
246 246 response.mustcontain('cb-addition"></span><span>LINE4</span>')
247 247 response.mustcontain('cb-addition"></span><span>LINE5</span>')
248 248
249 249 # fetch versions
250 250 pr = PullRequest.get(pull_request_id)
251 251 versions = [x.pull_request_version_id for x in pr.versions.all()]
252 252 assert len(versions) == 3
253 253
254 254 # show v1, v2, v3, v4
255 255 def cb_line(text):
256 256 return 'cb-addition"></span><span>{}</span>'.format(text)
257 257
258 258 def cb_context(text):
259 259 return '<span class="cb-code"><span class="cb-action cb-context">' \
260 260 '</span><span>{}</span></span>'.format(text)
261 261
262 262 commit_tests = {
263 263 # (commits in response, commits not in response)
264 264 1: (['commit-1'], ['commit-2', 'commit-3', 'commit-4']),
265 265 2: (['commit-1', 'commit-2'], ['commit-3', 'commit-4']),
266 266 3: (['commit-1', 'commit-2', 'commit-3'], ['commit-4']),
267 267 4: (['commit-1', 'commit-2', 'commit-3', 'commit-4'], []),
268 268 }
269 269 diff_tests = {
270 270 1: (['LINE2'], ['LINE3', 'LINE4', 'LINE5']),
271 271 2: (['LINE2', 'LINE3'], ['LINE4', 'LINE5']),
272 272 3: (['LINE2', 'LINE3', 'LINE4'], ['LINE5']),
273 273 4: (['LINE2', 'LINE3', 'LINE4', 'LINE5'], []),
274 274 }
275 275 for idx, ver in enumerate(versions, 1):
276 276
277 277 response = self.app.get(
278 278 route_path('pullrequest_show',
279 279 repo_name=target_repo_name,
280 280 pull_request_id=pull_request_id,
281 281 params={'version': ver}))
282 282
283 283 response.mustcontain(no=['rhodecode diff rendering error'])
284 284 response.mustcontain('Showing changes at v{}'.format(idx))
285 285
286 286 yes, no = commit_tests[idx]
287 287 for y in yes:
288 288 response.mustcontain(y)
289 289 for n in no:
290 290 response.mustcontain(no=n)
291 291
292 292 yes, no = diff_tests[idx]
293 293 for y in yes:
294 294 response.mustcontain(cb_line(y))
295 295 for n in no:
296 296 response.mustcontain(no=n)
297 297
298 298 # show diff between versions
299 299 diff_compare_tests = {
300 300 1: (['LINE3'], ['LINE1', 'LINE2']),
301 301 2: (['LINE3', 'LINE4'], ['LINE1', 'LINE2']),
302 302 3: (['LINE3', 'LINE4', 'LINE5'], ['LINE1', 'LINE2']),
303 303 }
304 304 for idx, ver in enumerate(versions, 1):
305 305 adds, context = diff_compare_tests[idx]
306 306
307 307 to_ver = ver+1
308 308 if idx == 3:
309 309 to_ver = 'latest'
310 310
311 311 response = self.app.get(
312 312 route_path('pullrequest_show',
313 313 repo_name=target_repo_name,
314 314 pull_request_id=pull_request_id,
315 315 params={'from_version': versions[0], 'version': to_ver}))
316 316
317 317 response.mustcontain(no=['rhodecode diff rendering error'])
318 318
319 319 for a in adds:
320 320 response.mustcontain(cb_line(a))
321 321 for c in context:
322 322 response.mustcontain(cb_context(c))
323 323
324 324 # test version v2 -> v3
325 325 response = self.app.get(
326 326 route_path('pullrequest_show',
327 327 repo_name=target_repo_name,
328 328 pull_request_id=pull_request_id,
329 329 params={'from_version': versions[1], 'version': versions[2]}))
330 330
331 331 response.mustcontain(cb_context('LINE1'))
332 332 response.mustcontain(cb_context('LINE2'))
333 333 response.mustcontain(cb_context('LINE3'))
334 334 response.mustcontain(cb_line('LINE4'))
335 335
336 336 def test_close_status_visibility(self, pr_util, user_util, csrf_token):
337 337 # Logout
338 338 response = self.app.post(
339 339 h.route_path('logout'),
340 340 params={'csrf_token': csrf_token})
341 341 # Login as regular user
342 342 response = self.app.post(h.route_path('login'),
343 343 {'username': TEST_USER_REGULAR_LOGIN,
344 344 'password': 'test12'})
345 345
346 346 pull_request = pr_util.create_pull_request(
347 347 author=TEST_USER_REGULAR_LOGIN)
348 348
349 349 response = self.app.get(route_path(
350 350 'pullrequest_show',
351 351 repo_name=pull_request.target_repo.scm_instance().name,
352 352 pull_request_id=pull_request.pull_request_id))
353 353
354 354 response.mustcontain('Server-side pull request merging is disabled.')
355 355
356 356 assert_response = response.assert_response()
357 357 # for a regular user without merge permissions, we don't see it
358 358 assert_response.no_element_exists('#close-pull-request-action')
359 359
360 360 user_util.grant_user_permission_to_repo(
361 361 pull_request.target_repo,
362 362 UserModel().get_by_username(TEST_USER_REGULAR_LOGIN),
363 363 'repository.write')
364 364 response = self.app.get(route_path(
365 365 'pullrequest_show',
366 366 repo_name=pull_request.target_repo.scm_instance().name,
367 367 pull_request_id=pull_request.pull_request_id))
368 368
369 369 response.mustcontain('Server-side pull request merging is disabled.')
370 370
371 371 assert_response = response.assert_response()
372 372 # now the regular user has merge permissions, so we have the CLOSE button
373 373 assert_response.one_element_exists('#close-pull-request-action')
374 374
375 375 def test_show_invalid_commit_id(self, pr_util):
376 376 # Simulating invalid revisions which will cause a lookup error
377 377 pull_request = pr_util.create_pull_request()
378 378 pull_request.revisions = ['invalid']
379 379 Session().add(pull_request)
380 380 Session().commit()
381 381
382 382 response = self.app.get(route_path(
383 383 'pullrequest_show',
384 384 repo_name=pull_request.target_repo.scm_instance().name,
385 385 pull_request_id=pull_request.pull_request_id))
386 386
387 387 for commit_id in pull_request.revisions:
388 388 response.mustcontain(commit_id)
389 389
390 390 def test_show_invalid_source_reference(self, pr_util):
391 391 pull_request = pr_util.create_pull_request()
392 392 pull_request.source_ref = 'branch:b:invalid'
393 393 Session().add(pull_request)
394 394 Session().commit()
395 395
396 396 self.app.get(route_path(
397 397 'pullrequest_show',
398 398 repo_name=pull_request.target_repo.scm_instance().name,
399 399 pull_request_id=pull_request.pull_request_id))
400 400
401 401 def test_edit_title_description(self, pr_util, csrf_token):
402 402 pull_request = pr_util.create_pull_request()
403 403 pull_request_id = pull_request.pull_request_id
404 404
405 405 response = self.app.post(
406 406 route_path('pullrequest_update',
407 407 repo_name=pull_request.target_repo.repo_name,
408 408 pull_request_id=pull_request_id),
409 409 params={
410 410 'edit_pull_request': 'true',
411 411 'title': 'New title',
412 412 'description': 'New description',
413 413 'csrf_token': csrf_token})
414 414
415 415 assert_session_flash(
416 416 response, u'Pull request title & description updated.',
417 417 category='success')
418 418
419 419 pull_request = PullRequest.get(pull_request_id)
420 420 assert pull_request.title == 'New title'
421 421 assert pull_request.description == 'New description'
422 422
423 423 def test_edit_title_description_closed(self, pr_util, csrf_token):
424 424 pull_request = pr_util.create_pull_request()
425 425 pull_request_id = pull_request.pull_request_id
426 426 repo_name = pull_request.target_repo.repo_name
427 427 pr_util.close()
428 428
429 429 response = self.app.post(
430 430 route_path('pullrequest_update',
431 431 repo_name=repo_name, pull_request_id=pull_request_id),
432 432 params={
433 433 'edit_pull_request': 'true',
434 434 'title': 'New title',
435 435 'description': 'New description',
436 436 'csrf_token': csrf_token}, status=200)
437 437 assert_session_flash(
438 438 response, u'Cannot update closed pull requests.',
439 439 category='error')
440 440
441 441 def test_update_invalid_source_reference(self, pr_util, csrf_token):
442 442 from rhodecode.lib.vcs.backends.base import UpdateFailureReason
443 443
444 444 pull_request = pr_util.create_pull_request()
445 445 pull_request.source_ref = 'branch:invalid-branch:invalid-commit-id'
446 446 Session().add(pull_request)
447 447 Session().commit()
448 448
449 449 pull_request_id = pull_request.pull_request_id
450 450
451 451 response = self.app.post(
452 452 route_path('pullrequest_update',
453 453 repo_name=pull_request.target_repo.repo_name,
454 454 pull_request_id=pull_request_id),
455 455 params={'update_commits': 'true', 'csrf_token': csrf_token})
456 456
457 457 expected_msg = str(PullRequestModel.UPDATE_STATUS_MESSAGES[
458 458 UpdateFailureReason.MISSING_SOURCE_REF])
459 459 assert_session_flash(response, expected_msg, category='error')
460 460
461 461 def test_missing_target_reference(self, pr_util, csrf_token):
462 462 from rhodecode.lib.vcs.backends.base import MergeFailureReason
463 463 pull_request = pr_util.create_pull_request(
464 464 approved=True, mergeable=True)
465 465 unicode_reference = u'branch:invalid-branch:invalid-commit-id'
466 466 pull_request.target_ref = unicode_reference
467 467 Session().add(pull_request)
468 468 Session().commit()
469 469
470 470 pull_request_id = pull_request.pull_request_id
471 471 pull_request_url = route_path(
472 472 'pullrequest_show',
473 473 repo_name=pull_request.target_repo.repo_name,
474 474 pull_request_id=pull_request_id)
475 475
476 476 response = self.app.get(pull_request_url)
477 477 target_ref_id = 'invalid-branch'
478 478 merge_resp = MergeResponse(
479 479 True, True, '', MergeFailureReason.MISSING_TARGET_REF,
480 480 metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)})
481 481 response.assert_response().element_contains(
482 482 'div[data-role="merge-message"]', merge_resp.merge_status_message)
483 483
484 484 def test_comment_and_close_pull_request_custom_message_approved(
485 485 self, pr_util, csrf_token, xhr_header):
486 486
487 487 pull_request = pr_util.create_pull_request(approved=True)
488 488 pull_request_id = pull_request.pull_request_id
489 489 author = pull_request.user_id
490 490 repo = pull_request.target_repo.repo_id
491 491
492 492 self.app.post(
493 493 route_path('pullrequest_comment_create',
494 494 repo_name=pull_request.target_repo.scm_instance().name,
495 495 pull_request_id=pull_request_id),
496 496 params={
497 497 'close_pull_request': '1',
498 498 'text': 'Closing a PR',
499 499 'csrf_token': csrf_token},
500 500 extra_environ=xhr_header,)
501 501
502 502 journal = UserLog.query()\
503 503 .filter(UserLog.user_id == author)\
504 504 .filter(UserLog.repository_id == repo) \
505 505 .order_by(UserLog.user_log_id.asc()) \
506 506 .all()
507 507 assert journal[-1].action == 'repo.pull_request.close'
508 508
509 509 pull_request = PullRequest.get(pull_request_id)
510 510 assert pull_request.is_closed()
511 511
512 512 status = ChangesetStatusModel().get_status(
513 513 pull_request.source_repo, pull_request=pull_request)
514 514 assert status == ChangesetStatus.STATUS_APPROVED
515 515 comments = ChangesetComment().query() \
516 516 .filter(ChangesetComment.pull_request == pull_request) \
517 517 .order_by(ChangesetComment.comment_id.asc())\
518 518 .all()
519 519 assert comments[-1].text == 'Closing a PR'
520 520
521 521 def test_comment_force_close_pull_request_rejected(
522 522 self, pr_util, csrf_token, xhr_header):
523 523 pull_request = pr_util.create_pull_request()
524 524 pull_request_id = pull_request.pull_request_id
525 525 PullRequestModel().update_reviewers(
526 526 pull_request_id, [
527 527 (1, ['reason'], False, 'reviewer', []),
528 528 (2, ['reason2'], False, 'reviewer', [])],
529 529 pull_request.author)
530 530 author = pull_request.user_id
531 531 repo = pull_request.target_repo.repo_id
532 532
533 533 self.app.post(
534 534 route_path('pullrequest_comment_create',
535 535 repo_name=pull_request.target_repo.scm_instance().name,
536 536 pull_request_id=pull_request_id),
537 537 params={
538 538 'close_pull_request': '1',
539 539 'csrf_token': csrf_token},
540 540 extra_environ=xhr_header)
541 541
542 542 pull_request = PullRequest.get(pull_request_id)
543 543
544 544 journal = UserLog.query()\
545 545 .filter(UserLog.user_id == author, UserLog.repository_id == repo) \
546 546 .order_by(UserLog.user_log_id.asc()) \
547 547 .all()
548 548 assert journal[-1].action == 'repo.pull_request.close'
549 549
550 550 # check only the latest status, not the review status
551 551 status = ChangesetStatusModel().get_status(
552 552 pull_request.source_repo, pull_request=pull_request)
553 553 assert status == ChangesetStatus.STATUS_REJECTED
554 554
555 555 def test_comment_and_close_pull_request(
556 556 self, pr_util, csrf_token, xhr_header):
557 557 pull_request = pr_util.create_pull_request()
558 558 pull_request_id = pull_request.pull_request_id
559 559
560 560 response = self.app.post(
561 561 route_path('pullrequest_comment_create',
562 562 repo_name=pull_request.target_repo.scm_instance().name,
563 563 pull_request_id=pull_request.pull_request_id),
564 564 params={
565 565 'close_pull_request': 'true',
566 566 'csrf_token': csrf_token},
567 567 extra_environ=xhr_header)
568 568
569 569 assert response.json
570 570
571 571 pull_request = PullRequest.get(pull_request_id)
572 572 assert pull_request.is_closed()
573 573
574 574 # check only the latest status, not the review status
575 575 status = ChangesetStatusModel().get_status(
576 576 pull_request.source_repo, pull_request=pull_request)
577 577 assert status == ChangesetStatus.STATUS_REJECTED
578 578
579 579 def test_comment_and_close_pull_request_try_edit_comment(
580 580 self, pr_util, csrf_token, xhr_header
581 581 ):
582 582 pull_request = pr_util.create_pull_request()
583 583 pull_request_id = pull_request.pull_request_id
584 584 target_scm = pull_request.target_repo.scm_instance()
585 585 target_scm_name = target_scm.name
586 586
587 587 response = self.app.post(
588 588 route_path(
589 589 'pullrequest_comment_create',
590 590 repo_name=target_scm_name,
591 591 pull_request_id=pull_request_id,
592 592 ),
593 593 params={
594 594 'close_pull_request': 'true',
595 595 'csrf_token': csrf_token,
596 596 },
597 597 extra_environ=xhr_header)
598 598
599 599 assert response.json
600 600
601 601 pull_request = PullRequest.get(pull_request_id)
602 602 target_scm = pull_request.target_repo.scm_instance()
603 603 target_scm_name = target_scm.name
604 604 assert pull_request.is_closed()
605 605
606 606 # check only the latest status, not the review status
607 607 status = ChangesetStatusModel().get_status(
608 608 pull_request.source_repo, pull_request=pull_request)
609 609 assert status == ChangesetStatus.STATUS_REJECTED
610 610
611 comment_id = response.json.get('comment_id', None)
612 test_text = 'test'
613 response = self.app.post(
614 route_path(
615 'pullrequest_comment_edit',
616 repo_name=target_scm_name,
617 pull_request_id=pull_request_id,
618 comment_id=comment_id,
619 ),
620 extra_environ=xhr_header,
621 params={
622 'csrf_token': csrf_token,
623 'text': test_text,
624 },
625 status=403,
626 )
627 assert response.status_int == 403
611 for comment_id in response.json.keys():
612 test_text = 'test'
613 response = self.app.post(
614 route_path(
615 'pullrequest_comment_edit',
616 repo_name=target_scm_name,
617 pull_request_id=pull_request_id,
618 comment_id=comment_id,
619 ),
620 extra_environ=xhr_header,
621 params={
622 'csrf_token': csrf_token,
623 'text': test_text,
624 },
625 status=403,
626 )
627 assert response.status_int == 403
628 628
629 629 def test_comment_and_comment_edit(self, pr_util, csrf_token, xhr_header):
630 630 pull_request = pr_util.create_pull_request()
631 631 target_scm = pull_request.target_repo.scm_instance()
632 632 target_scm_name = target_scm.name
633 633
634 634 response = self.app.post(
635 635 route_path(
636 636 'pullrequest_comment_create',
637 637 repo_name=target_scm_name,
638 638 pull_request_id=pull_request.pull_request_id),
639 639 params={
640 640 'csrf_token': csrf_token,
641 641 'text': 'init',
642 642 },
643 643 extra_environ=xhr_header,
644 644 )
645 645 assert response.json
646 646
647 comment_id = response.json.get('comment_id', None)
648 assert comment_id
649 test_text = 'test'
650 self.app.post(
651 route_path(
652 'pullrequest_comment_edit',
653 repo_name=target_scm_name,
654 pull_request_id=pull_request.pull_request_id,
655 comment_id=comment_id,
656 ),
657 extra_environ=xhr_header,
658 params={
659 'csrf_token': csrf_token,
660 'text': test_text,
661 'version': '0',
662 },
647 for comment_id in response.json.keys():
648 assert comment_id
649 test_text = 'test'
650 self.app.post(
651 route_path(
652 'pullrequest_comment_edit',
653 repo_name=target_scm_name,
654 pull_request_id=pull_request.pull_request_id,
655 comment_id=comment_id,
656 ),
657 extra_environ=xhr_header,
658 params={
659 'csrf_token': csrf_token,
660 'text': test_text,
661 'version': '0',
662 },
663 663
664 )
665 text_form_db = ChangesetComment.query().filter(
666 ChangesetComment.comment_id == comment_id).first().text
667 assert test_text == text_form_db
664 )
665 text_form_db = ChangesetComment.query().filter(
666 ChangesetComment.comment_id == comment_id).first().text
667 assert test_text == text_form_db
668 668
669 669 def test_comment_and_comment_edit(self, pr_util, csrf_token, xhr_header):
670 670 pull_request = pr_util.create_pull_request()
671 671 target_scm = pull_request.target_repo.scm_instance()
672 672 target_scm_name = target_scm.name
673 673
674 674 response = self.app.post(
675 675 route_path(
676 676 'pullrequest_comment_create',
677 677 repo_name=target_scm_name,
678 678 pull_request_id=pull_request.pull_request_id),
679 679 params={
680 680 'csrf_token': csrf_token,
681 681 'text': 'init',
682 682 },
683 683 extra_environ=xhr_header,
684 684 )
685 685 assert response.json
686 686
687 comment_id = response.json.get('comment_id', None)
688 assert comment_id
689 test_text = 'init'
690 response = self.app.post(
691 route_path(
692 'pullrequest_comment_edit',
693 repo_name=target_scm_name,
694 pull_request_id=pull_request.pull_request_id,
695 comment_id=comment_id,
696 ),
697 extra_environ=xhr_header,
698 params={
699 'csrf_token': csrf_token,
700 'text': test_text,
701 'version': '0',
702 },
703 status=404,
687 for comment_id in response.json.keys():
688 test_text = 'init'
689 response = self.app.post(
690 route_path(
691 'pullrequest_comment_edit',
692 repo_name=target_scm_name,
693 pull_request_id=pull_request.pull_request_id,
694 comment_id=comment_id,
695 ),
696 extra_environ=xhr_header,
697 params={
698 'csrf_token': csrf_token,
699 'text': test_text,
700 'version': '0',
701 },
702 status=404,
704 703
705 )
706 assert response.status_int == 404
704 )
705 assert response.status_int == 404
707 706
708 707 def test_comment_and_try_edit_already_edited(self, pr_util, csrf_token, xhr_header):
709 708 pull_request = pr_util.create_pull_request()
710 709 target_scm = pull_request.target_repo.scm_instance()
711 710 target_scm_name = target_scm.name
712 711
713 712 response = self.app.post(
714 713 route_path(
715 714 'pullrequest_comment_create',
716 715 repo_name=target_scm_name,
717 716 pull_request_id=pull_request.pull_request_id),
718 717 params={
719 718 'csrf_token': csrf_token,
720 719 'text': 'init',
721 720 },
722 721 extra_environ=xhr_header,
723 722 )
724 723 assert response.json
725 comment_id = response.json.get('comment_id', None)
726 assert comment_id
727
728 test_text = 'test'
729 self.app.post(
730 route_path(
731 'pullrequest_comment_edit',
732 repo_name=target_scm_name,
733 pull_request_id=pull_request.pull_request_id,
734 comment_id=comment_id,
735 ),
736 extra_environ=xhr_header,
737 params={
738 'csrf_token': csrf_token,
739 'text': test_text,
740 'version': '0',
741 },
724 for comment_id in response.json.keys():
725 test_text = 'test'
726 self.app.post(
727 route_path(
728 'pullrequest_comment_edit',
729 repo_name=target_scm_name,
730 pull_request_id=pull_request.pull_request_id,
731 comment_id=comment_id,
732 ),
733 extra_environ=xhr_header,
734 params={
735 'csrf_token': csrf_token,
736 'text': test_text,
737 'version': '0',
738 },
742 739
743 )
744 test_text_v2 = 'test_v2'
745 response = self.app.post(
746 route_path(
747 'pullrequest_comment_edit',
748 repo_name=target_scm_name,
749 pull_request_id=pull_request.pull_request_id,
750 comment_id=comment_id,
751 ),
752 extra_environ=xhr_header,
753 params={
754 'csrf_token': csrf_token,
755 'text': test_text_v2,
756 'version': '0',
757 },
758 status=409,
759 )
760 assert response.status_int == 409
740 )
741 test_text_v2 = 'test_v2'
742 response = self.app.post(
743 route_path(
744 'pullrequest_comment_edit',
745 repo_name=target_scm_name,
746 pull_request_id=pull_request.pull_request_id,
747 comment_id=comment_id,
748 ),
749 extra_environ=xhr_header,
750 params={
751 'csrf_token': csrf_token,
752 'text': test_text_v2,
753 'version': '0',
754 },
755 status=409,
756 )
757 assert response.status_int == 409
761 758
762 text_form_db = ChangesetComment.query().filter(
763 ChangesetComment.comment_id == comment_id).first().text
759 text_form_db = ChangesetComment.query().filter(
760 ChangesetComment.comment_id == comment_id).first().text
764 761
765 assert test_text == text_form_db
766 assert test_text_v2 != text_form_db
762 assert test_text == text_form_db
763 assert test_text_v2 != text_form_db
767 764
768 765 def test_comment_and_comment_edit_permissions_forbidden(
769 766 self, autologin_regular_user, user_regular, user_admin, pr_util,
770 767 csrf_token, xhr_header):
771 768 pull_request = pr_util.create_pull_request(
772 769 author=user_admin.username, enable_notifications=False)
773 770 comment = CommentsModel().create(
774 771 text='test',
775 772 repo=pull_request.target_repo.scm_instance().name,
776 773 user=user_admin,
777 774 pull_request=pull_request,
778 775 )
779 776 response = self.app.post(
780 777 route_path(
781 778 'pullrequest_comment_edit',
782 779 repo_name=pull_request.target_repo.scm_instance().name,
783 780 pull_request_id=pull_request.pull_request_id,
784 781 comment_id=comment.comment_id,
785 782 ),
786 783 extra_environ=xhr_header,
787 784 params={
788 785 'csrf_token': csrf_token,
789 786 'text': 'test_text',
790 787 },
791 788 status=403,
792 789 )
793 790 assert response.status_int == 403
794 791
795 792 def test_create_pull_request(self, backend, csrf_token):
796 793 commits = [
797 794 {'message': 'ancestor'},
798 795 {'message': 'change'},
799 796 {'message': 'change2'},
800 797 ]
801 798 commit_ids = backend.create_master_repo(commits)
802 799 target = backend.create_repo(heads=['ancestor'])
803 800 source = backend.create_repo(heads=['change2'])
804 801
805 802 response = self.app.post(
806 803 route_path('pullrequest_create', repo_name=source.repo_name),
807 804 [
808 805 ('source_repo', source.repo_name),
809 806 ('source_ref', 'branch:default:' + commit_ids['change2']),
810 807 ('target_repo', target.repo_name),
811 808 ('target_ref', 'branch:default:' + commit_ids['ancestor']),
812 809 ('common_ancestor', commit_ids['ancestor']),
813 810 ('pullrequest_title', 'Title'),
814 811 ('pullrequest_desc', 'Description'),
815 812 ('description_renderer', 'markdown'),
816 813 ('__start__', 'review_members:sequence'),
817 814 ('__start__', 'reviewer:mapping'),
818 815 ('user_id', '1'),
819 816 ('__start__', 'reasons:sequence'),
820 817 ('reason', 'Some reason'),
821 818 ('__end__', 'reasons:sequence'),
822 819 ('__start__', 'rules:sequence'),
823 820 ('__end__', 'rules:sequence'),
824 821 ('mandatory', 'False'),
825 822 ('__end__', 'reviewer:mapping'),
826 823 ('__end__', 'review_members:sequence'),
827 824 ('__start__', 'revisions:sequence'),
828 825 ('revisions', commit_ids['change']),
829 826 ('revisions', commit_ids['change2']),
830 827 ('__end__', 'revisions:sequence'),
831 828 ('user', ''),
832 829 ('csrf_token', csrf_token),
833 830 ],
834 831 status=302)
835 832
836 833 location = response.headers['Location']
837 834 pull_request_id = location.rsplit('/', 1)[1]
838 835 assert pull_request_id != 'new'
839 836 pull_request = PullRequest.get(int(pull_request_id))
840 837
841 838 # check that we have now both revisions
842 839 assert pull_request.revisions == [commit_ids['change2'], commit_ids['change']]
843 840 assert pull_request.source_ref == 'branch:default:' + commit_ids['change2']
844 841 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
845 842 assert pull_request.target_ref == expected_target_ref
846 843
847 844 def test_reviewer_notifications(self, backend, csrf_token):
        # We have to use the app.post for this test so it will create the
        # notifications properly with the new PR
        commits = [
            {'message': 'ancestor',
             'added': [FileNode('file_A', content='content_of_ancestor')]},
            {'message': 'change',
             'added': [FileNode('file_a', content='content_of_change')]},
            {'message': 'change-child'},
            {'message': 'ancestor-child', 'parents': ['ancestor'],
             'added': [
                 FileNode('file_B', content='content_of_ancestor_child')]},
            {'message': 'ancestor-child-2'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['ancestor-child'])
        source = backend.create_repo(heads=['change'])

        response = self.app.post(
            route_path('pullrequest_create', repo_name=source.repo_name),
            [
                ('source_repo', source.repo_name),
                ('source_ref', 'branch:default:' + commit_ids['change']),
                ('target_repo', target.repo_name),
                ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
                ('common_ancestor', commit_ids['ancestor']),
                ('pullrequest_title', 'Title'),
                ('pullrequest_desc', 'Description'),
                ('description_renderer', 'markdown'),
                ('__start__', 'review_members:sequence'),
                ('__start__', 'reviewer:mapping'),
                ('user_id', '2'),
                ('__start__', 'reasons:sequence'),
                ('reason', 'Some reason'),
                ('__end__', 'reasons:sequence'),
                ('__start__', 'rules:sequence'),
                ('__end__', 'rules:sequence'),
                ('mandatory', 'False'),
                ('__end__', 'reviewer:mapping'),
                ('__end__', 'review_members:sequence'),
                ('__start__', 'revisions:sequence'),
                ('revisions', commit_ids['change']),
                ('__end__', 'revisions:sequence'),
                ('user', ''),
                ('csrf_token', csrf_token),
            ],
            status=302)

        location = response.headers['Location']

        pull_request_id = location.rsplit('/', 1)[1]
        assert pull_request_id != 'new'
        pull_request = PullRequest.get(int(pull_request_id))

        # Check that a notification was made
        notifications = Notification.query()\
            .filter(Notification.created_by == pull_request.author.user_id,
                    Notification.type_ == Notification.TYPE_PULL_REQUEST,
                    Notification.subject.contains(
                        "requested a pull request review. !%s" % pull_request_id))
        assert len(notifications.all()) == 1

        # Change reviewers and check that a notification was made
        PullRequestModel().update_reviewers(
            pull_request.pull_request_id, [
                (1, [], False, 'reviewer', [])
            ],
            pull_request.author)
        assert len(notifications.all()) == 2

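The `__start__`/`__end__` pairs in the POST body above follow the peppercorn convention used by deform/colander forms: a flat list of (name, value) pairs is folded back into nested mappings and sequences on the server. Below is a simplified, self-contained sketch of that folding for illustration only; the real parser is `peppercorn.parse()` and handles more cases (renames, nested sequences of scalars, malformed input):

```python
def fold_fields(fields):
    """Fold flat (name, value) form pairs into nested data.

    Simplified illustration of the peppercorn convention: a '__start__'
    value such as 'reviewer:mapping' opens a named sub-container, and
    '__end__' closes the most recently opened one.
    """
    root = {}
    stack = [(root, 'mapping')]  # (container, kind) currently receiving items
    for name, value in fields:
        container, kind = stack[-1]
        if name == '__start__':
            sub_name, sub_kind = value.split(':')
            sub = {} if sub_kind == 'mapping' else []
            if kind == 'mapping':
                container[sub_name] = sub
            else:
                # inside a sequence the sub-container's name is ignored
                container.append(sub)
            stack.append((sub, sub_kind))
        elif name == '__end__':
            stack.pop()
        elif kind == 'mapping':
            container[name] = value
        else:
            container.append(value)
    return root


fields = [
    ('pullrequest_title', 'Title'),
    ('__start__', 'review_members:sequence'),
    ('__start__', 'reviewer:mapping'),
    ('user_id', '2'),
    ('mandatory', 'False'),
    ('__end__', 'reviewer:mapping'),
    ('__end__', 'review_members:sequence'),
]
print(fold_fields(fields))
# → {'pullrequest_title': 'Title',
#    'review_members': [{'user_id': '2', 'mandatory': 'False'}]}
```

This is why the test can express "one reviewer with one reason and no rules" as a flat sequence of tuples.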
    def test_create_pull_request_stores_ancestor_commit_id(self, backend, csrf_token):
        commits = [
            {'message': 'ancestor',
             'added': [FileNode('file_A', content='content_of_ancestor')]},
            {'message': 'change',
             'added': [FileNode('file_a', content='content_of_change')]},
            {'message': 'change-child'},
            {'message': 'ancestor-child', 'parents': ['ancestor'],
             'added': [
                 FileNode('file_B', content='content_of_ancestor_child')]},
            {'message': 'ancestor-child-2'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['ancestor-child'])
        source = backend.create_repo(heads=['change'])

        response = self.app.post(
            route_path('pullrequest_create', repo_name=source.repo_name),
            [
                ('source_repo', source.repo_name),
                ('source_ref', 'branch:default:' + commit_ids['change']),
                ('target_repo', target.repo_name),
                ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
                ('common_ancestor', commit_ids['ancestor']),
                ('pullrequest_title', 'Title'),
                ('pullrequest_desc', 'Description'),
                ('description_renderer', 'markdown'),
                ('__start__', 'review_members:sequence'),
                ('__start__', 'reviewer:mapping'),
                ('user_id', '1'),
                ('__start__', 'reasons:sequence'),
                ('reason', 'Some reason'),
                ('__end__', 'reasons:sequence'),
                ('__start__', 'rules:sequence'),
                ('__end__', 'rules:sequence'),
                ('mandatory', 'False'),
                ('__end__', 'reviewer:mapping'),
                ('__end__', 'review_members:sequence'),
                ('__start__', 'revisions:sequence'),
                ('revisions', commit_ids['change']),
                ('__end__', 'revisions:sequence'),
                ('user', ''),
                ('csrf_token', csrf_token),
            ],
            status=302)

        location = response.headers['Location']

        pull_request_id = location.rsplit('/', 1)[1]
        assert pull_request_id != 'new'
        pull_request = PullRequest.get(int(pull_request_id))

        # target_ref has to point to the ancestor's commit_id in order to
        # show the correct diff
        expected_target_ref = 'branch:default:' + commit_ids['ancestor']
        assert pull_request.target_ref == expected_target_ref

        # Check generated diff contents
        response = response.follow()
        response.mustcontain(no=['content_of_ancestor'])
        response.mustcontain(no=['content_of_ancestor-child'])
        response.mustcontain('content_of_change')

    def test_merge_pull_request_enabled(self, pr_util, csrf_token):
        # Clear any previous calls to rcextensions
        rhodecode.EXTENSIONS.calls.clear()

        pull_request = pr_util.create_pull_request(
            approved=True, mergeable=True)
        pull_request_id = pull_request.pull_request_id
        repo_name = pull_request.target_repo.scm_instance().name

        url = route_path('pullrequest_merge',
                         repo_name=repo_name,
                         pull_request_id=pull_request_id)
        response = self.app.post(url, params={'csrf_token': csrf_token}).follow()

        pull_request = PullRequest.get(pull_request_id)

        assert response.status_int == 200
        assert pull_request.is_closed()
        assert_pull_request_status(
            pull_request, ChangesetStatus.STATUS_APPROVED)

        # Check the relevant log entries were added
        user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3)
        actions = [log.action for log in user_logs]
        pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request)
        expected_actions = [
            u'repo.pull_request.close',
            u'repo.pull_request.merge',
            u'repo.pull_request.comment.create'
        ]
        assert actions == expected_actions

        user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4)
        actions = [log for log in user_logs]
        assert actions[-1].action == 'user.push'
        assert actions[-1].action_data['commit_ids'] == pr_commit_ids

        # Check the post_push rcextension was really executed
        push_calls = rhodecode.EXTENSIONS.calls['_push_hook']
        assert len(push_calls) == 1
        unused_last_call_args, last_call_kwargs = push_calls[0]
        assert last_call_kwargs['action'] == 'push'
        assert last_call_kwargs['commit_ids'] == pr_commit_ids

    def test_merge_pull_request_disabled(self, pr_util, csrf_token):
        pull_request = pr_util.create_pull_request(mergeable=False)
        pull_request_id = pull_request.pull_request_id
        pull_request = PullRequest.get(pull_request_id)

        response = self.app.post(
            route_path('pullrequest_merge',
                       repo_name=pull_request.target_repo.scm_instance().name,
                       pull_request_id=pull_request.pull_request_id),
            params={'csrf_token': csrf_token}).follow()

        assert response.status_int == 200
        response.mustcontain(
            'Merge is not currently possible because of below failed checks.')
        response.mustcontain('Server-side pull request merging is disabled.')

    @pytest.mark.skip_backends('svn')
    def test_merge_pull_request_not_approved(self, pr_util, csrf_token):
        pull_request = pr_util.create_pull_request(mergeable=True)
        pull_request_id = pull_request.pull_request_id
        repo_name = pull_request.target_repo.scm_instance().name

        response = self.app.post(
            route_path('pullrequest_merge',
                       repo_name=repo_name, pull_request_id=pull_request_id),
            params={'csrf_token': csrf_token}).follow()

        assert response.status_int == 200

        response.mustcontain(
            'Merge is not currently possible because of below failed checks.')
        response.mustcontain('Pull request reviewer approval is pending.')

    def test_merge_pull_request_renders_failure_reason(
            self, user_regular, csrf_token, pr_util):
        pull_request = pr_util.create_pull_request(mergeable=True, approved=True)
        pull_request_id = pull_request.pull_request_id
        repo_name = pull_request.target_repo.scm_instance().name

        merge_resp = MergeResponse(True, False, 'STUB_COMMIT_ID',
                                   MergeFailureReason.PUSH_FAILED,
                                   metadata={'target': 'shadow repo',
                                             'merge_commit': 'xxx'})
        model_patcher = mock.patch.multiple(
            PullRequestModel,
            merge_repo=mock.Mock(return_value=merge_resp),
            merge_status=mock.Mock(return_value=(None, True, 'WRONG_MESSAGE')))

        with model_patcher:
            response = self.app.post(
                route_path('pullrequest_merge',
                           repo_name=repo_name,
                           pull_request_id=pull_request_id),
                params={'csrf_token': csrf_token}, status=302)

        merge_resp = MergeResponse(True, True, '', MergeFailureReason.PUSH_FAILED,
                                   metadata={'target': 'shadow repo',
                                             'merge_commit': 'xxx'})
        assert_session_flash(response, merge_resp.merge_status_message)

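`mock.patch.multiple` in the test above stubs several attributes of `PullRequestModel` at once for the duration of the `with` block and restores the originals afterwards. A self-contained sketch of the same pattern, using a hypothetical `Service` class in place of the real model:

```python
from unittest import mock


class Service(object):
    # Hypothetical stand-in for something like PullRequestModel.
    def merge_repo(self):
        return 'real merge'

    def merge_status(self):
        return 'real status'


# Patch both methods on the class in one call, as the test does.
patcher = mock.patch.multiple(
    Service,
    merge_repo=mock.Mock(return_value='stub merge'),
    merge_status=mock.Mock(return_value=(None, True, 'WRONG_MESSAGE')),
)

with patcher:
    svc = Service()
    assert svc.merge_repo() == 'stub merge'
    assert svc.merge_status() == (None, True, 'WRONG_MESSAGE')

# Outside the with block the original methods are back in place.
assert Service().merge_repo() == 'real merge'
```

Patching at the class level means every instance created inside the block sees the stubs, which is why the view under test picks them up without any injection.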
    def test_update_source_revision(self, backend, csrf_token):
        commits = [
            {'message': 'ancestor'},
            {'message': 'change'},
            {'message': 'change-2'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['ancestor'])
        source = backend.create_repo(heads=['change'])

        # create a PR from 'change' in source to 'ancestor' in target
        pull_request = PullRequest()

        pull_request.source_repo = source
        pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['change'])

        pull_request.target_repo = target
        pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])

        pull_request.revisions = [commit_ids['change']]
        pull_request.title = u"Test"
        pull_request.description = u"Description"
        pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
        pull_request.pull_request_state = PullRequest.STATE_CREATED
        Session().add(pull_request)
        Session().commit()
        pull_request_id = pull_request.pull_request_id

        # source has ancestor - change - change-2
        backend.pull_heads(source, heads=['change-2'])
        target_repo_name = target.repo_name

        # update PR
        self.app.post(
            route_path('pullrequest_update',
                       repo_name=target_repo_name, pull_request_id=pull_request_id),
            params={'update_commits': 'true', 'csrf_token': csrf_token})

        response = self.app.get(
            route_path('pullrequest_show',
                       repo_name=target_repo_name,
                       pull_request_id=pull_request.pull_request_id))

        assert response.status_int == 200
        response.mustcontain('Pull request updated to')
        response.mustcontain('with 1 added, 0 removed commits.')

        # check that we now have both revisions
        pull_request = PullRequest.get(pull_request_id)
        assert pull_request.revisions == [commit_ids['change-2'], commit_ids['change']]

    def test_update_target_revision(self, backend, csrf_token):
        commits = [
            {'message': 'ancestor'},
            {'message': 'change'},
            {'message': 'ancestor-new', 'parents': ['ancestor']},
            {'message': 'change-rebased'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['ancestor'])
        source = backend.create_repo(heads=['change'])

        # create a PR from 'change' in source to 'ancestor' in target
        pull_request = PullRequest()

        pull_request.source_repo = source
        pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['change'])

        pull_request.target_repo = target
        pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])

        pull_request.revisions = [commit_ids['change']]
        pull_request.title = u"Test"
        pull_request.description = u"Description"
        pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
        pull_request.pull_request_state = PullRequest.STATE_CREATED

        Session().add(pull_request)
        Session().commit()
        pull_request_id = pull_request.pull_request_id

        # target has ancestor - ancestor-new
        # source has ancestor - ancestor-new - change-rebased
        backend.pull_heads(target, heads=['ancestor-new'])
        backend.pull_heads(source, heads=['change-rebased'])
        target_repo_name = target.repo_name

        # update PR
        url = route_path('pullrequest_update',
                         repo_name=target_repo_name,
                         pull_request_id=pull_request_id)
        self.app.post(url,
                      params={'update_commits': 'true', 'csrf_token': csrf_token},
                      status=200)

        # check that the PR now tracks only the rebased revision and the
        # new target reference
        pull_request = PullRequest.get(pull_request_id)
        assert pull_request.revisions == [commit_ids['change-rebased']]
        assert pull_request.target_ref == 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['ancestor-new'])

        response = self.app.get(
            route_path('pullrequest_show',
                       repo_name=target_repo_name,
                       pull_request_id=pull_request.pull_request_id))
        assert response.status_int == 200
        response.mustcontain('Pull request updated to')
        response.mustcontain('with 1 added, 1 removed commits.')

    def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token):
        backend = backend_git
        commits = [
            {'message': 'master-commit-1'},
            {'message': 'master-commit-2-change-1'},
            {'message': 'master-commit-3-change-2'},

            {'message': 'feat-commit-1', 'parents': ['master-commit-1']},
            {'message': 'feat-commit-2'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['master-commit-3-change-2'])
        source = backend.create_repo(heads=['feat-commit-2'])

        # create a PR from the feature commits in source to master in target
        pull_request = PullRequest()
        pull_request.source_repo = source

        pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name,
            commit_id=commit_ids['master-commit-3-change-2'])

        pull_request.target_repo = target
        pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['feat-commit-2'])

        pull_request.revisions = [
            commit_ids['feat-commit-1'],
            commit_ids['feat-commit-2']
        ]
        pull_request.title = u"Test"
        pull_request.description = u"Description"
        pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
        pull_request.pull_request_state = PullRequest.STATE_CREATED
        Session().add(pull_request)
        Session().commit()
        pull_request_id = pull_request.pull_request_id

        # The PR is created; now simulate a force-push into target
        # that drops the last 2 commits
        vcsrepo = target.scm_instance()
        vcsrepo.config.clear_section('hooks')
        vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2'])
        target_repo_name = target.repo_name

        # update PR
        url = route_path('pullrequest_update',
                         repo_name=target_repo_name,
                         pull_request_id=pull_request_id)
        self.app.post(url,
                      params={'update_commits': 'true', 'csrf_token': csrf_token},
                      status=200)

        response = self.app.get(route_path('pullrequest_new', repo_name=target_repo_name))
        assert response.status_int == 200
        response.mustcontain('Pull request updated to')
        response.mustcontain('with 0 added, 0 removed commits.')

    def test_update_of_ancestor_reference(self, backend, csrf_token):
        commits = [
            {'message': 'ancestor'},
            {'message': 'change'},
            {'message': 'change-2'},
            {'message': 'ancestor-new', 'parents': ['ancestor']},
            {'message': 'change-rebased'},
        ]
        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['ancestor'])
        source = backend.create_repo(heads=['change'])

        # create a PR from 'change' in source to 'ancestor' in target
        pull_request = PullRequest()
        pull_request.source_repo = source

        pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['change'])
        pull_request.target_repo = target
        pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
        pull_request.revisions = [commit_ids['change']]
        pull_request.title = u"Test"
        pull_request.description = u"Description"
        pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
        pull_request.pull_request_state = PullRequest.STATE_CREATED
        Session().add(pull_request)
        Session().commit()
        pull_request_id = pull_request.pull_request_id

        # target has ancestor - ancestor-new
        # source has ancestor - ancestor-new - change-rebased
        backend.pull_heads(target, heads=['ancestor-new'])
        backend.pull_heads(source, heads=['change-rebased'])
        target_repo_name = target.repo_name

        # update PR
        self.app.post(
            route_path('pullrequest_update',
                       repo_name=target_repo_name, pull_request_id=pull_request_id),
            params={'update_commits': 'true', 'csrf_token': csrf_token},
            status=200)

        # Expect the target reference to be updated correctly
        pull_request = PullRequest.get(pull_request_id)
        assert pull_request.revisions == [commit_ids['change-rebased']]
        expected_target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name,
            commit_id=commit_ids['ancestor-new'])
        assert pull_request.target_ref == expected_target_ref

    def test_remove_pull_request_branch(self, backend_git, csrf_token):
        branch_name = 'development'
        commits = [
            {'message': 'initial-commit'},
            {'message': 'old-feature'},
            {'message': 'new-feature', 'branch': branch_name},
        ]
        repo = backend_git.create_repo(commits)
        repo_name = repo.repo_name
        commit_ids = backend_git.commit_ids

        pull_request = PullRequest()
        pull_request.source_repo = repo
        pull_request.target_repo = repo
        pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=branch_name, commit_id=commit_ids['new-feature'])
        pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend_git.default_branch_name, commit_id=commit_ids['old-feature'])
        pull_request.revisions = [commit_ids['new-feature']]
        pull_request.title = u"Test"
        pull_request.description = u"Description"
        pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
        pull_request.pull_request_state = PullRequest.STATE_CREATED
        Session().add(pull_request)
        Session().commit()

        pull_request_id = pull_request.pull_request_id

        vcs = repo.scm_instance()
        vcs.remove_ref('refs/heads/{}'.format(branch_name))
        # NOTE(marcink): run GC to ensure the commits are gone
        vcs.run_gc()

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=repo_name,
            pull_request_id=pull_request_id))

        assert response.status_int == 200

        response.assert_response().element_contains(
            '#changeset_compare_view_content .alert strong',
            'Missing commits')
        response.assert_response().element_contains(
            '#changeset_compare_view_content .alert',
            'This pull request cannot be displayed, because one or more'
            ' commits no longer exist in the source repository.')

    def test_strip_commits_from_pull_request(
            self, backend, pr_util, csrf_token):
        commits = [
            {'message': 'initial-commit'},
            {'message': 'old-feature'},
            {'message': 'new-feature', 'parents': ['initial-commit']},
        ]
        pull_request = pr_util.create_pull_request(
            commits, target_head='initial-commit', source_head='new-feature',
            revisions=['new-feature'])

        vcs = pr_util.source_repository.scm_instance()
        if backend.alias == 'git':
            vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
        else:
            vcs.strip(pr_util.commit_ids['new-feature'])

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pr_util.target_repository.repo_name,
            pull_request_id=pull_request.pull_request_id))

        assert response.status_int == 200

        response.assert_response().element_contains(
            '#changeset_compare_view_content .alert strong',
            'Missing commits')
        response.assert_response().element_contains(
            '#changeset_compare_view_content .alert',
            'This pull request cannot be displayed, because one or more'
            ' commits no longer exist in the source repository.')
        response.assert_response().element_contains(
            '#update_commits',
            'Update commits')

    def test_strip_commits_and_update(
            self, backend, pr_util, csrf_token):
        commits = [
            {'message': 'initial-commit'},
            {'message': 'old-feature'},
            {'message': 'new-feature', 'parents': ['old-feature']},
        ]
        pull_request = pr_util.create_pull_request(
            commits, target_head='old-feature', source_head='new-feature',
            revisions=['new-feature'], mergeable=True)
        pr_id = pull_request.pull_request_id
        target_repo_name = pull_request.target_repo.repo_name

        vcs = pr_util.source_repository.scm_instance()
        if backend.alias == 'git':
            vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
        else:
            vcs.strip(pr_util.commit_ids['new-feature'])

        url = route_path('pullrequest_update',
                         repo_name=target_repo_name,
                         pull_request_id=pr_id)
        response = self.app.post(url,
                                 params={'update_commits': 'true',
                                         'csrf_token': csrf_token})

        assert response.status_int == 200
        assert response.body == '{"response": true, "redirect_url": null}'

        # Make sure that after update, it won't raise 500 errors
        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=target_repo_name,
            pull_request_id=pr_id))

        assert response.status_int == 200
        response.assert_response().element_contains(
            '#changeset_compare_view_content .alert strong',
            'Missing commits')

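The byte-for-byte comparison of `response.body` in the test above depends on the server emitting keys in exactly that order; when writing similar assertions, decoding the JSON first is more robust. A small stdlib-only illustration (the `body` string is copied from the assertion above):

```python
import json

# Raw JSON body as the update endpoint returns it in the test above.
body = '{"response": true, "redirect_url": null}'

data = json.loads(body)
# Order-independent comparison of the decoded structure.
assert data == {'response': True, 'redirect_url': None}
assert data['response'] is True
assert data['redirect_url'] is None
```

Note how JSON's `true`/`null` decode to Python's `True`/`None`, so the decoded assertion also survives a server that reorders or reformats its output.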
    def test_branch_is_a_link(self, pr_util):
        pull_request = pr_util.create_pull_request()
        pull_request.source_ref = 'branch:origin:1234567890abcdef'
        pull_request.target_ref = 'branch:target:abcdef1234567890'
        Session().add(pull_request)
        Session().commit()

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))
        assert response.status_int == 200

        source = response.assert_response().get_element('.pr-source-info')
        source_parent = source.getparent()
        assert len(source_parent) == 1

        target = response.assert_response().get_element('.pr-target-info')
        target_parent = target.getparent()
        assert len(target_parent) == 1

        expected_origin_link = route_path(
            'repo_commits',
            repo_name=pull_request.source_repo.scm_instance().name,
            params=dict(branch='origin'))
        expected_target_link = route_path(
            'repo_commits',
            repo_name=pull_request.target_repo.scm_instance().name,
            params=dict(branch='target'))
        assert source_parent.attrib['href'] == expected_origin_link
        assert target_parent.attrib['href'] == expected_target_link

    def test_bookmark_is_not_a_link(self, pr_util):
        pull_request = pr_util.create_pull_request()
        pull_request.source_ref = 'bookmark:origin:1234567890abcdef'
        pull_request.target_ref = 'bookmark:target:abcdef1234567890'
        Session().add(pull_request)
        Session().commit()

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))
        assert response.status_int == 200

        source = response.assert_response().get_element('.pr-source-info')
        assert source.text.strip() == 'bookmark:origin'
        assert source.getparent().attrib.get('href') is None

        target = response.assert_response().get_element('.pr-target-info')
        assert target.text.strip() == 'bookmark:target'
        assert target.getparent().attrib.get('href') is None

    def test_tag_is_not_a_link(self, pr_util):
        pull_request = pr_util.create_pull_request()
        pull_request.source_ref = 'tag:origin:1234567890abcdef'
        pull_request.target_ref = 'tag:target:abcdef1234567890'
        Session().add(pull_request)
        Session().commit()

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))
        assert response.status_int == 200

        source = response.assert_response().get_element('.pr-source-info')
        assert source.text.strip() == 'tag:origin'
        assert source.getparent().attrib.get('href') is None

        target = response.assert_response().get_element('.pr-target-info')
        assert target.text.strip() == 'tag:target'
        assert target.getparent().attrib.get('href') is None

    @pytest.mark.parametrize('mergeable', [True, False])
    def test_shadow_repository_link(
            self, mergeable, pr_util, http_host_only_stub):
        """
        Check that the pull request summary page displays a link to the shadow
        repository if the pull request is mergeable. If it is not mergeable
        the link should not be displayed.
        """
        pull_request = pr_util.create_pull_request(
            mergeable=mergeable, enable_notifications=False)
        target_repo = pull_request.target_repo.scm_instance()
        pr_id = pull_request.pull_request_id
        shadow_url = '{host}/{repo}/pull-request/{pr_id}/repository'.format(
            host=http_host_only_stub, repo=target_repo.name, pr_id=pr_id)

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=target_repo.name,
            pull_request_id=pr_id))

        if mergeable:
            response.assert_response().element_value_contains(
                'input.pr-mergeinfo', shadow_url)
            response.assert_response().element_value_contains(
                'input.pr-mergeinfo ', 'pr-merge')
        else:
            response.assert_response().no_element_exists('.pr-mergeinfo')


@pytest.mark.usefixtures('app')
@pytest.mark.backends("git", "hg")
class TestPullrequestsControllerDelete(object):
    def test_pull_request_delete_button_permissions_admin(
            self, autologin_user, user_admin, pr_util):
        pull_request = pr_util.create_pull_request(
            author=user_admin.username, enable_notifications=False)

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))

        response.mustcontain('id="delete_pullrequest"')
        response.mustcontain('Confirm to delete this pull request')

    def test_pull_request_delete_button_permissions_owner(
            self, autologin_regular_user, user_regular, pr_util):
        pull_request = pr_util.create_pull_request(
            author=user_regular.username, enable_notifications=False)

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))

        response.mustcontain('id="delete_pullrequest"')
        response.mustcontain('Confirm to delete this pull request')

    def test_pull_request_delete_button_permissions_forbidden(
            self, autologin_regular_user, user_regular, user_admin, pr_util):
        pull_request = pr_util.create_pull_request(
            author=user_admin.username, enable_notifications=False)

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))
        response.mustcontain(no=['id="delete_pullrequest"'])
        response.mustcontain(no=['Confirm to delete this pull request'])

    def test_pull_request_delete_button_permissions_can_update_cannot_delete(
            self, autologin_regular_user, user_regular, user_admin, pr_util,
            user_util):

        pull_request = pr_util.create_pull_request(
            author=user_admin.username, enable_notifications=False)

        user_util.grant_user_permission_to_repo(
            pull_request.target_repo, user_regular,
            'repository.write')

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id))

        response.mustcontain('id="open_edit_pullrequest"')
        response.mustcontain('id="delete_pullrequest"')
        response.mustcontain(no=['Confirm to delete this pull request'])

    def test_delete_comment_returns_404_if_comment_does_not_exist(
            self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):

        pull_request = pr_util.create_pull_request(
            author=user_admin.username, enable_notifications=False)

        self.app.post(
            route_path(
                'pullrequest_comment_delete',
                repo_name=pull_request.target_repo.scm_instance().name,
                pull_request_id=pull_request.pull_request_id,
                comment_id=1024404),
            extra_environ=xhr_header,
            params={'csrf_token': csrf_token},
            status=404
        )

    def test_delete_comment(
            self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):

        pull_request = pr_util.create_pull_request(
            author=user_admin.username, enable_notifications=False)
        comment = pr_util.create_comment()
        comment_id = comment.comment_id

        response = self.app.post(
            route_path(
                'pullrequest_comment_delete',
                repo_name=pull_request.target_repo.scm_instance().name,
                pull_request_id=pull_request.pull_request_id,
                comment_id=comment_id),
            extra_environ=xhr_header,
            params={'csrf_token': csrf_token},
            status=200
        )
        assert response.body == 'true'

1630 1627 @pytest.mark.parametrize('url_type', [
1631 1628 'pullrequest_new',
1632 1629 'pullrequest_create',
1633 1630 'pullrequest_update',
1634 1631 'pullrequest_merge',
1635 1632 ])
1636 1633 def test_pull_request_is_forbidden_on_archived_repo(
1637 1634 self, autologin_user, backend, xhr_header, user_util, url_type):
1638 1635
1639 1636 # create a temporary repo
1640 1637 source = user_util.create_repo(repo_type=backend.alias)
1641 1638 repo_name = source.repo_name
1642 1639 repo = Repository.get_by_repo_name(repo_name)
1643 1640 repo.archived = True
1644 1641 Session().commit()
1645 1642
1646 1643 response = self.app.get(
1647 1644 route_path(url_type, repo_name=repo_name, pull_request_id=1), status=302)
1648 1645
1649 1646 msg = 'Action not supported for archived repository.'
1650 1647 assert_session_flash(response, msg)
1651 1648
1652 1649
1653 1650 def assert_pull_request_status(pull_request, expected_status):
1654 1651 status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request)
1655 1652 assert status == expected_status
1656 1653
1657 1654
1658 1655 @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create'])
1659 1656 @pytest.mark.usefixtures("autologin_user")
1660 1657 def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route):
1661 1658 app.get(route_path(route, repo_name=backend_svn.repo_name), status=404)
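The archived-repository tests above exercise a guard that flashes a message and redirects instead of serving the pull-request action. A minimal standalone sketch of that guard is below; the `Repo` record, `FlashRedirect` tuple, and `check_archived` name are illustrative stand-ins, not RhodeCode's actual view-layer API.

```python
from collections import namedtuple

# Illustrative stand-ins for the view-layer objects; not RhodeCode's real classes.
Repo = namedtuple('Repo', ['repo_name', 'archived'])
FlashRedirect = namedtuple('FlashRedirect', ['status', 'message'])

ARCHIVED_MSG = 'Action not supported for archived repository.'


def check_archived(repo):
    """Return a 302 redirect with a flash message for archived repos, else None."""
    if repo.archived:
        return FlashRedirect(status=302, message=ARCHIVED_MSG)
    return None


live = Repo('hg-live', archived=False)
frozen = Repo('hg-frozen', archived=True)

assert check_archived(live) is None
result = check_archived(frozen)
assert result.status == 302 and result.message == ARCHIVED_MSG
```

The tests in the hunk assert exactly this pair of observable effects: a 302 status and the flashed message text.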
@@ -1,111 +1,113 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 from rhodecode.lib import helpers as h, rc_cache
22 22 from rhodecode.lib.utils2 import safe_int
23 23 from rhodecode.model.pull_request import get_diff_info
24 24 from rhodecode.model.db import PullRequestReviewers
25 25 # V3 - Reviewers, with default rules data
26 26 # v4 - Added observers metadata
27 REVIEWER_API_VERSION = 'V4'
27 # v5 - pr_author/commit_author include/exclude logic
28 REVIEWER_API_VERSION = 'V5'
28 29
29 30
30 31 def reviewer_as_json(user, reasons=None, role=None, mandatory=False, rules=None, user_group=None):
31 32 """
32 33 Returns json struct of a reviewer for frontend
33 34
34 35 :param user: the reviewer
35 36 :param reasons: list of strings of why they are reviewers
36 37 :param mandatory: bool, to set user as mandatory
37 38 """
38 39 role = role or PullRequestReviewers.ROLE_REVIEWER
39 40 if role not in PullRequestReviewers.ROLES:
40 41 raise ValueError('role is not one of %s' % (PullRequestReviewers.ROLES,))
41 42
42 43 return {
43 44 'user_id': user.user_id,
44 45 'reasons': reasons or [],
45 46 'rules': rules or [],
46 47 'role': role,
47 48 'mandatory': mandatory,
48 49 'user_group': user_group,
49 50 'username': user.username,
50 51 'first_name': user.first_name,
51 52 'last_name': user.last_name,
52 53 'user_link': h.link_to_user(user),
53 54 'gravatar_link': h.gravatar_url(user.email, 14),
54 55 }
55 56
56 57
57 58 def to_reviewers(e):
58 59 if isinstance(e, (tuple, list)):
59 60 return map(reviewer_as_json, e)
60 61 else:
61 62 return reviewer_as_json(e)
62 63
63 64
64 65 def get_default_reviewers_data(current_user, source_repo, source_ref, target_repo, target_ref,
65 66 include_diff_info=True):
66 67 """
67 68 Return json for default reviewers of a repository
68 69 """
69 70
70 71 diff_info = {}
71 72 if include_diff_info:
72 73 diff_info = get_diff_info(
73 74 source_repo, source_ref.commit_id, target_repo, target_ref.commit_id)
74 75
75 76 reasons = ['Default reviewer', 'Repository owner']
76 77 json_reviewers = [reviewer_as_json(
77 78 user=target_repo.user, reasons=reasons, mandatory=False, rules=None, role=None)]
78 79
79 80 compute_key = rc_cache.utils.compute_key_from_params(
80 81 current_user.user_id, source_repo.repo_id, source_ref.type, source_ref.name,
81 82 source_ref.commit_id, target_repo.repo_id, target_ref.type, target_ref.name,
82 83 target_ref.commit_id)
83 84
84 85 return {
85 86 'api_ver': REVIEWER_API_VERSION, # define version for later possible schema upgrade
86 87 'compute_key': compute_key,
87 88 'diff_info': diff_info,
88 89 'reviewers': json_reviewers,
89 90 'rules': {},
90 91 'rules_data': {},
92 'rules_humanized': [],
91 93 }
92 94
93 95
94 96 def validate_default_reviewers(review_members, reviewer_rules):
95 97 """
96 98 Function to validate submitted reviewers against the saved rules
97 99 """
98 100 reviewers = []
99 101 reviewer_by_id = {}
100 102 for r in review_members:
101 103 reviewer_user_id = safe_int(r['user_id'])
102 104 entry = (reviewer_user_id, r['reasons'], r['mandatory'], r['role'], r['rules'])
103 105
104 106 reviewer_by_id[reviewer_user_id] = entry
105 107 reviewers.append(entry)
106 108
107 109 return reviewers
108 110
109 111
110 112 def validate_observers(observer_members, reviewer_rules):
111 113 return {}
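`validate_default_reviewers` above normalizes the submitted reviewer dicts into plain tuples. A self-contained sketch of that normalization follows, with a local `safe_int` standing in for `rhodecode.lib.utils2.safe_int`; the sample payload values are hypothetical.

```python
def safe_int(val, default=None):
    # Simplified stand-in for rhodecode.lib.utils2.safe_int.
    try:
        return int(val)
    except (TypeError, ValueError):
        return default


def normalize_reviewers(review_members):
    """Turn submitted reviewer dicts into (user_id, reasons, mandatory, role, rules) tuples."""
    reviewers = []
    reviewer_by_id = {}
    for r in review_members:
        user_id = safe_int(r['user_id'])
        entry = (user_id, r['reasons'], r['mandatory'], r['role'], r['rules'])
        reviewer_by_id[user_id] = entry
        reviewers.append(entry)
    return reviewers


submitted = [{'user_id': '7', 'reasons': ['Default reviewer'],
              'mandatory': False, 'role': 'reviewer', 'rules': []}]
assert normalize_reviewers(submitted) == [
    (7, ['Default reviewer'], False, 'reviewer', [])]
```

Note that `user_id` arrives from the form as a string, which is why the original code routes it through `safe_int` before building the tuple.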
@@ -1,791 +1,852 b''
1 1 # -*- coding: utf-8 -*-
2 2
3 3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 4 #
5 5 # This program is free software: you can redistribute it and/or modify
6 6 # it under the terms of the GNU Affero General Public License, version 3
7 7 # (only), as published by the Free Software Foundation.
8 8 #
9 9 # This program is distributed in the hope that it will be useful,
10 10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 12 # GNU General Public License for more details.
13 13 #
14 14 # You should have received a copy of the GNU Affero General Public License
15 15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 16 #
17 17 # This program is dual-licensed. If you wish to learn more about the
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import logging
22 22 import collections
23 23
24 24 from pyramid.httpexceptions import (
25 25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
26 26 from pyramid.view import view_config
27 27 from pyramid.renderers import render
28 28 from pyramid.response import Response
29 29
30 30 from rhodecode.apps._base import RepoAppView
31 31 from rhodecode.apps.file_store import utils as store_utils
32 32 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
33 33
34 34 from rhodecode.lib import diffs, codeblocks, channelstream
35 35 from rhodecode.lib.auth import (
36 36 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
37 37 from rhodecode.lib.ext_json import json
38 38 from rhodecode.lib.compat import OrderedDict
39 39 from rhodecode.lib.diffs import (
40 40 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
41 41 get_diff_whitespace_flag)
42 42 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
43 43 import rhodecode.lib.helpers as h
44 44 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict
45 45 from rhodecode.lib.vcs.backends.base import EmptyCommit
46 46 from rhodecode.lib.vcs.exceptions import (
47 47 RepositoryError, CommitDoesNotExistError)
48 48 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
49 49 ChangesetCommentHistory
50 50 from rhodecode.model.changeset_status import ChangesetStatusModel
51 51 from rhodecode.model.comment import CommentsModel
52 52 from rhodecode.model.meta import Session
53 53 from rhodecode.model.settings import VcsSettingsModel
54 54
55 55 log = logging.getLogger(__name__)
56 56
57 57
58 58 def _update_with_GET(params, request):
59 59 for k in ['diff1', 'diff2', 'diff']:
60 60 params[k] += request.GET.getall(k)
61 61
62 62
63 63 class RepoCommitsView(RepoAppView):
64 64 def load_default_context(self):
65 65 c = self._get_local_tmpl_context(include_app_defaults=True)
66 66 c.rhodecode_repo = self.rhodecode_vcs_repo
67 67
68 68 return c
69 69
70 70 def _is_diff_cache_enabled(self, target_repo):
71 71 caching_enabled = self._get_general_setting(
72 72 target_repo, 'rhodecode_diff_cache')
73 73 log.debug('Diff caching enabled: %s', caching_enabled)
74 74 return caching_enabled
75 75
76 76 def _commit(self, commit_id_range, method):
77 77 _ = self.request.translate
78 78 c = self.load_default_context()
79 79 c.fulldiff = self.request.GET.get('fulldiff')
80 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
80 81
81 82 # fetch global flags of ignore ws or context lines
82 83 diff_context = get_diff_context(self.request)
83 84 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
84 85
85 86 # diff_limit will cut off the whole diff if the limit is applied
86 87 # otherwise it will just hide the big files from the front-end
87 88 diff_limit = c.visual.cut_off_limit_diff
88 89 file_limit = c.visual.cut_off_limit_file
89 90
90 91 # get ranges of commit ids if preset
91 92 commit_range = commit_id_range.split('...')[:2]
92 93
93 94 try:
94 95 pre_load = ['affected_files', 'author', 'branch', 'date',
95 96 'message', 'parents']
96 97 if self.rhodecode_vcs_repo.alias == 'hg':
97 98 pre_load += ['hidden', 'obsolete', 'phase']
98 99
99 100 if len(commit_range) == 2:
100 101 commits = self.rhodecode_vcs_repo.get_commits(
101 102 start_id=commit_range[0], end_id=commit_range[1],
102 103 pre_load=pre_load, translate_tags=False)
103 104 commits = list(commits)
104 105 else:
105 106 commits = [self.rhodecode_vcs_repo.get_commit(
106 107 commit_id=commit_id_range, pre_load=pre_load)]
107 108
108 109 c.commit_ranges = commits
109 110 if not c.commit_ranges:
110 111 raise RepositoryError('The commit range returned an empty result')
111 112 except CommitDoesNotExistError as e:
112 113 msg = _('No such commit exists. Original exception: `{}`').format(e)
113 114 h.flash(msg, category='error')
114 115 raise HTTPNotFound()
115 116 except Exception:
116 117 log.exception("General failure")
117 118 raise HTTPNotFound()
118 119 single_commit = len(c.commit_ranges) == 1
119 120
121 if redirect_to_combined and not single_commit:
122 source_ref = getattr(c.commit_ranges[0].parents[0]
123 if c.commit_ranges[0].parents else EmptyCommit(), 'raw_id')
124 target_ref = c.commit_ranges[-1].raw_id
125 next_url = h.route_path(
126 'repo_compare',
127 repo_name=c.repo_name,
128 source_ref_type='rev',
129 source_ref=source_ref,
130 target_ref_type='rev',
131 target_ref=target_ref)
132 raise HTTPFound(next_url)
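The redirect block above (new in this release for combined range compare) picks the first commit's parent as the compare source, falling back to an empty commit when the range starts at a root, and the last commit in the range as the target. A standalone sketch of that ref selection; the `Commit` record and `EMPTY_ID` sentinel are illustrative, not the actual VCS backend types.

```python
from collections import namedtuple

Commit = namedtuple('Commit', ['raw_id', 'parents'])
EMPTY_ID = '0' * 40  # illustrative stand-in for EmptyCommit.raw_id


def compare_refs(commit_ranges):
    """Return (source_ref, target_ref) for a combined-range compare."""
    first = commit_ranges[0]
    source_ref = first.parents[0].raw_id if first.parents else EMPTY_ID
    target_ref = commit_ranges[-1].raw_id
    return source_ref, target_ref


root = Commit('a' * 40, parents=[])
child = Commit('b' * 40, parents=[root])
tip = Commit('c' * 40, parents=[child])

# Range child..tip compares from child's parent (root) to tip.
assert compare_refs([child, tip]) == (root.raw_id, tip.raw_id)
# A range starting at a root commit falls back to the empty sentinel.
assert compare_refs([root, child]) == (EMPTY_ID, child.raw_id)
```

Using the parent of the first commit as the source is what makes the compare show the cumulative diff of the whole range, matching what `repo_compare` expects for its `source_ref`/`target_ref` pair.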
133
120 134 c.changes = OrderedDict()
121 135 c.lines_added = 0
122 136 c.lines_deleted = 0
123 137
124 138 # auto collapse if we have more than limit
125 139 collapse_limit = diffs.DiffProcessor._collapse_commits_over
126 140 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
127 141
128 142 c.commit_statuses = ChangesetStatus.STATUSES
129 143 c.inline_comments = []
130 144 c.files = []
131 145
132 146 c.comments = []
133 147 c.unresolved_comments = []
134 148 c.resolved_comments = []
135 149
136 150 # Single commit
137 151 if single_commit:
138 152 commit = c.commit_ranges[0]
139 153 c.comments = CommentsModel().get_comments(
140 154 self.db_repo.repo_id,
141 155 revision=commit.raw_id)
142 156
143 157 # comments from PR
144 158 statuses = ChangesetStatusModel().get_statuses(
145 159 self.db_repo.repo_id, commit.raw_id,
146 160 with_revisions=True)
147 161
148 162 prs = set()
149 163 reviewers = list()
150 164 reviewers_duplicates = set() # to not have duplicates from multiple votes
151 165 for c_status in statuses:
152 166
153 167 # extract associated pull-requests from votes
154 168 if c_status.pull_request:
155 169 prs.add(c_status.pull_request)
156 170
157 171 # extract reviewers
158 172 _user_id = c_status.author.user_id
159 173 if _user_id not in reviewers_duplicates:
160 174 reviewers.append(
161 175 StrictAttributeDict({
162 176 'user': c_status.author,
163 177
164 178 # fake attributes for the commit page that we don't have,
165 179 # but which share the display with the PR page
166 180 'mandatory': False,
167 181 'reasons': [],
168 182 'rule_user_group_data': lambda: None
169 183 })
170 184 )
171 185 reviewers_duplicates.add(_user_id)
172 186
173 187 c.reviewers_count = len(reviewers)
174 188 c.observers_count = 0
175 189
176 190 # from associated statuses, check the pull requests, and
177 191 # show comments from them
178 192 for pr in prs:
179 193 c.comments.extend(pr.comments)
180 194
181 195 c.unresolved_comments = CommentsModel()\
182 196 .get_commit_unresolved_todos(commit.raw_id)
183 197 c.resolved_comments = CommentsModel()\
184 198 .get_commit_resolved_todos(commit.raw_id)
185 199
186 200 c.inline_comments_flat = CommentsModel()\
187 201 .get_commit_inline_comments(commit.raw_id)
188 202
189 203 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
190 204 statuses, reviewers)
191 205
192 206 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
193 207
194 208 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
195 209
196 210 for review_obj, member, reasons, mandatory, status in review_statuses:
197 211 member_reviewer = h.reviewer_as_json(
198 212 member, reasons=reasons, mandatory=mandatory, role=None,
199 213 user_group=None
200 214 )
201 215
202 216 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
203 217 member_reviewer['review_status'] = current_review_status
204 218 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
205 219 member_reviewer['allowed_to_update'] = False
206 220 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
207 221
208 222 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
209 223
210 224 # NOTE(marcink): this uses the same voting logic as in pull-requests
211 225 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
212 226 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
213 227
214 228 diff = None
215 229 # Iterate over ranges (default commit view is always one commit)
216 230 for commit in c.commit_ranges:
217 231 c.changes[commit.raw_id] = []
218 232
219 233 commit2 = commit
220 234 commit1 = commit.first_parent
221 235
222 236 if method == 'show':
223 237 inline_comments = CommentsModel().get_inline_comments(
224 238 self.db_repo.repo_id, revision=commit.raw_id)
225 239 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
226 240 inline_comments))
227 241 c.inline_comments = inline_comments
228 242
229 243 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
230 244 self.db_repo)
231 245 cache_file_path = diff_cache_exist(
232 246 cache_path, 'diff', commit.raw_id,
233 247 hide_whitespace_changes, diff_context, c.fulldiff)
234 248
235 249 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
236 250 force_recache = str2bool(self.request.GET.get('force_recache'))
237 251
238 252 cached_diff = None
239 253 if caching_enabled:
240 254 cached_diff = load_cached_diff(cache_file_path)
241 255
242 256 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
243 257 if not force_recache and has_proper_diff_cache:
244 258 diffset = cached_diff['diff']
245 259 else:
246 260 vcs_diff = self.rhodecode_vcs_repo.get_diff(
247 261 commit1, commit2,
248 262 ignore_whitespace=hide_whitespace_changes,
249 263 context=diff_context)
250 264
251 265 diff_processor = diffs.DiffProcessor(
252 266 vcs_diff, format='newdiff', diff_limit=diff_limit,
253 267 file_limit=file_limit, show_full_diff=c.fulldiff)
254 268
255 269 _parsed = diff_processor.prepare()
256 270
257 271 diffset = codeblocks.DiffSet(
258 272 repo_name=self.db_repo_name,
259 273 source_node_getter=codeblocks.diffset_node_getter(commit1),
260 274 target_node_getter=codeblocks.diffset_node_getter(commit2))
261 275
262 276 diffset = self.path_filter.render_patchset_filtered(
263 277 diffset, _parsed, commit1.raw_id, commit2.raw_id)
264 278
265 279 # save cached diff
266 280 if caching_enabled:
267 281 cache_diff(cache_file_path, diffset, None)
268 282
269 283 c.limited_diff = diffset.limited_diff
270 284 c.changes[commit.raw_id] = diffset
271 285 else:
272 286 # TODO(marcink): no cache usage here...
273 287 _diff = self.rhodecode_vcs_repo.get_diff(
274 288 commit1, commit2,
275 289 ignore_whitespace=hide_whitespace_changes, context=diff_context)
276 290 diff_processor = diffs.DiffProcessor(
277 291 _diff, format='newdiff', diff_limit=diff_limit,
278 292 file_limit=file_limit, show_full_diff=c.fulldiff)
279 293 # downloads/raw we only need RAW diff nothing else
280 294 diff = self.path_filter.get_raw_patch(diff_processor)
281 295 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
282 296
283 297 # sort comments by how they were generated
284 298 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
285 299 c.at_version_num = None
286 300
287 301 if len(c.commit_ranges) == 1:
288 302 c.commit = c.commit_ranges[0]
289 303 c.parent_tmpl = ''.join(
290 304 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
291 305
292 306 if method == 'download':
293 307 response = Response(diff)
294 308 response.content_type = 'text/plain'
295 309 response.content_disposition = (
296 310 'attachment; filename=%s.diff' % commit_id_range[:12])
297 311 return response
298 312 elif method == 'patch':
299 313 c.diff = safe_unicode(diff)
300 314 patch = render(
301 315 'rhodecode:templates/changeset/patch_changeset.mako',
302 316 self._get_template_context(c), self.request)
303 317 response = Response(patch)
304 318 response.content_type = 'text/plain'
305 319 return response
306 320 elif method == 'raw':
307 321 response = Response(diff)
308 322 response.content_type = 'text/plain'
309 323 return response
310 324 elif method == 'show':
311 325 if len(c.commit_ranges) == 1:
312 326 html = render(
313 327 'rhodecode:templates/changeset/changeset.mako',
314 328 self._get_template_context(c), self.request)
315 329 return Response(html)
316 330 else:
317 331 c.ancestor = None
318 332 c.target_repo = self.db_repo
319 333 html = render(
320 334 'rhodecode:templates/changeset/changeset_range.mako',
321 335 self._get_template_context(c), self.request)
322 336 return Response(html)
323 337
324 338 raise HTTPBadRequest()
325 339
326 340 @LoginRequired()
327 341 @HasRepoPermissionAnyDecorator(
328 342 'repository.read', 'repository.write', 'repository.admin')
329 343 @view_config(
330 344 route_name='repo_commit', request_method='GET',
331 345 renderer=None)
332 346 def repo_commit_show(self):
333 347 commit_id = self.request.matchdict['commit_id']
334 348 return self._commit(commit_id, method='show')
335 349
336 350 @LoginRequired()
337 351 @HasRepoPermissionAnyDecorator(
338 352 'repository.read', 'repository.write', 'repository.admin')
339 353 @view_config(
340 354 route_name='repo_commit_raw', request_method='GET',
341 355 renderer=None)
342 356 @view_config(
343 357 route_name='repo_commit_raw_deprecated', request_method='GET',
344 358 renderer=None)
345 359 def repo_commit_raw(self):
346 360 commit_id = self.request.matchdict['commit_id']
347 361 return self._commit(commit_id, method='raw')
348 362
349 363 @LoginRequired()
350 364 @HasRepoPermissionAnyDecorator(
351 365 'repository.read', 'repository.write', 'repository.admin')
352 366 @view_config(
353 367 route_name='repo_commit_patch', request_method='GET',
354 368 renderer=None)
355 369 def repo_commit_patch(self):
356 370 commit_id = self.request.matchdict['commit_id']
357 371 return self._commit(commit_id, method='patch')
358 372
359 373 @LoginRequired()
360 374 @HasRepoPermissionAnyDecorator(
361 375 'repository.read', 'repository.write', 'repository.admin')
362 376 @view_config(
363 377 route_name='repo_commit_download', request_method='GET',
364 378 renderer=None)
365 379 def repo_commit_download(self):
366 380 commit_id = self.request.matchdict['commit_id']
367 381 return self._commit(commit_id, method='download')
368 382
383 def _commit_comments_create(self, commit_id, comments):
384 _ = self.request.translate
385 data = {}
386 if not comments:
387 return
388
389 commit = self.db_repo.get_commit(commit_id)
390
391 all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
392 for entry in comments:
393 c = self.load_default_context()
394 comment_type = entry['comment_type']
395 text = entry['text']
396 status = entry['status']
397 is_draft = str2bool(entry['is_draft'])
398 resolves_comment_id = entry['resolves_comment_id']
399 f_path = entry['f_path']
400 line_no = entry['line']
401 target_elem_id = 'file-{}'.format(h.safeid(h.safe_unicode(f_path)))
402
403 if status:
404 text = text or (_('Status change %(transition_icon)s %(status)s')
405 % {'transition_icon': '>',
406 'status': ChangesetStatus.get_status_lbl(status)})
407
408 comment = CommentsModel().create(
409 text=text,
410 repo=self.db_repo.repo_id,
411 user=self._rhodecode_db_user.user_id,
412 commit_id=commit_id,
413 f_path=f_path,
414 line_no=line_no,
415 status_change=(ChangesetStatus.get_status_lbl(status)
416 if status else None),
417 status_change_type=status,
418 comment_type=comment_type,
419 is_draft=is_draft,
420 resolves_comment_id=resolves_comment_id,
421 auth_user=self._rhodecode_user,
422 send_email=not is_draft, # skip notification for draft comments
423 )
424 is_inline = comment.is_inline
425
426 # get status if set !
427 if status:
428 # `dont_allow_on_closed_pull_request = True` means:
429 # if the latest status came from a pull request that is now closed,
430 # disallow changing the status
431
432 try:
433 ChangesetStatusModel().set_status(
434 self.db_repo.repo_id,
435 status,
436 self._rhodecode_db_user.user_id,
437 comment,
438 revision=commit_id,
439 dont_allow_on_closed_pull_request=True
440 )
441 except StatusChangeOnClosedPullRequestError:
442 msg = _('Changing the status of a commit associated with '
443 'a closed pull request is not allowed')
444 log.exception(msg)
445 h.flash(msg, category='warning')
446 raise HTTPFound(h.route_path(
447 'repo_commit', repo_name=self.db_repo_name,
448 commit_id=commit_id))
449
450 Session().flush()
451 # this is somehow required to get access to some relationship
452 # loaded on comment
453 Session().refresh(comment)
454
455 # skip notifications for drafts
456 if not is_draft:
457 CommentsModel().trigger_commit_comment_hook(
458 self.db_repo, self._rhodecode_user, 'create',
459 data={'comment': comment, 'commit': commit})
460
461 comment_id = comment.comment_id
462 data[comment_id] = {
463 'target_id': target_elem_id
464 }
465 Session().flush()
466
467 c.co = comment
468 c.at_version_num = 0
469 c.is_new = True
470 rendered_comment = render(
471 'rhodecode:templates/changeset/changeset_comment_block.mako',
472 self._get_template_context(c), self.request)
473
474 data[comment_id].update(comment.get_dict())
475 data[comment_id].update({'rendered_text': rendered_comment})
476
477 # finalize, commit and redirect
478 Session().commit()
479
480 # skip channelstream for draft comments
481 if not all_drafts:
482 comment_broadcast_channel = channelstream.comment_channel(
483 self.db_repo_name, commit_obj=commit)
484
485 comment_data = data
486 posted_comment_type = 'inline' if is_inline else 'general'
487 if len(data) == 1:
488 msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
489 else:
490 msg = _('posted {} new {} comments').format(len(data), posted_comment_type)
491
492 channelstream.comment_channelstream_push(
493 self.request, comment_broadcast_channel, self._rhodecode_user, msg,
494 comment_data=comment_data)
495
496 return data
497
369 498 @LoginRequired()
370 499 @NotAnonymous()
371 500 @HasRepoPermissionAnyDecorator(
372 501 'repository.read', 'repository.write', 'repository.admin')
373 502 @CSRFRequired()
374 503 @view_config(
375 504 route_name='repo_commit_comment_create', request_method='POST',
376 505 renderer='json_ext')
377 506 def repo_commit_comment_create(self):
378 507 _ = self.request.translate
379 508 commit_id = self.request.matchdict['commit_id']
380 509
381 c = self.load_default_context()
382 status = self.request.POST.get('changeset_status', None)
383 text = self.request.POST.get('text')
384 comment_type = self.request.POST.get('comment_type')
385 resolves_comment_id = self.request.POST.get('resolves_comment_id', None)
386
387 if status:
388 text = text or (_('Status change %(transition_icon)s %(status)s')
389 % {'transition_icon': '>',
390 'status': ChangesetStatus.get_status_lbl(status)})
391
392 510 multi_commit_ids = []
393 511 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
394 512 if _commit_id not in ['', None, EmptyCommit.raw_id]:
395 513 if _commit_id not in multi_commit_ids:
396 514 multi_commit_ids.append(_commit_id)
397 515
398 516 commit_ids = multi_commit_ids or [commit_id]
399 517
400 comment = None
518 data = []
519 # Multiple comments for each passed commit id
401 520 for current_id in filter(None, commit_ids):
402 comment = CommentsModel().create(
403 text=text,
404 repo=self.db_repo.repo_id,
405 user=self._rhodecode_db_user.user_id,
406 commit_id=current_id,
407 f_path=self.request.POST.get('f_path'),
408 line_no=self.request.POST.get('line'),
409 status_change=(ChangesetStatus.get_status_lbl(status)
410 if status else None),
411 status_change_type=status,
412 comment_type=comment_type,
413 resolves_comment_id=resolves_comment_id,
414 auth_user=self._rhodecode_user
415 )
416 is_inline = comment.is_inline
417
418 # get status if set !
419 if status:
420 # if latest status was from pull request and it's closed
421 # disallow changing status !
422 # dont_allow_on_closed_pull_request = True !
521 comment_data = {
522 'comment_type': self.request.POST.get('comment_type'),
523 'text': self.request.POST.get('text'),
524 'status': self.request.POST.get('changeset_status', None),
525 'is_draft': self.request.POST.get('draft'),
526 'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
527 'close_pull_request': self.request.POST.get('close_pull_request'),
528 'f_path': self.request.POST.get('f_path'),
529 'line': self.request.POST.get('line'),
530 }
531 comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
532 data.append(comment)
423 533
424 try:
425 ChangesetStatusModel().set_status(
426 self.db_repo.repo_id,
427 status,
428 self._rhodecode_db_user.user_id,
429 comment,
430 revision=current_id,
431 dont_allow_on_closed_pull_request=True
432 )
433 except StatusChangeOnClosedPullRequestError:
434 msg = _('Changing the status of a commit associated with '
435 'a closed pull request is not allowed')
436 log.exception(msg)
437 h.flash(msg, category='warning')
438 raise HTTPFound(h.route_path(
439 'repo_commit', repo_name=self.db_repo_name,
440 commit_id=current_id))
441
442 commit = self.db_repo.get_commit(current_id)
443 CommentsModel().trigger_commit_comment_hook(
444 self.db_repo, self._rhodecode_user, 'create',
445 data={'comment': comment, 'commit': commit})
446
447 # finalize, commit and redirect
448 Session().commit()
449
450 data = {
451 'target_id': h.safeid(h.safe_unicode(
452 self.request.POST.get('f_path'))),
453 }
454 if comment:
455 c.co = comment
456 c.at_version_num = 0
457 rendered_comment = render(
458 'rhodecode:templates/changeset/changeset_comment_block.mako',
459 self._get_template_context(c), self.request)
460
461 data.update(comment.get_dict())
462 data.update({'rendered_text': rendered_comment})
463
464 comment_broadcast_channel = channelstream.comment_channel(
465 self.db_repo_name, commit_obj=commit)
466
467 comment_data = data
468 comment_type = 'inline' if is_inline else 'general'
469 channelstream.comment_channelstream_push(
470 self.request, comment_broadcast_channel, self._rhodecode_user,
471 _('posted a new {} comment').format(comment_type),
472 comment_data=comment_data)
473
474 return data
534 return data if len(data) > 1 else data[0]
475 535
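The refactored `repo_commit_comment_create` above returns a single payload when one commit is commented on and a list when comments span a commit range (`return data if len(data) > 1 else data[0]`). A toy sketch of that return-shape convention; `post_comment` here is a placeholder, not the real `CommentsModel` call.

```python
def post_comment(commit_id, text):
    # Placeholder for the real per-commit comment creation.
    return {'commit_id': commit_id, 'text': text}


def create_comments(commit_ids, text):
    """Mirror the view's convention: one dict for a single commit, a list otherwise."""
    data = [post_comment(cid, text) for cid in commit_ids if cid]
    return data if len(data) > 1 else data[0]


single = create_comments(['abc123'], 'looks good')
assert isinstance(single, dict) and single['commit_id'] == 'abc123'

multi = create_comments(['abc123', 'def456'], 'range note')
assert isinstance(multi, list) and len(multi) == 2
```

Callers of the JSON endpoint therefore have to branch on the response type, which keeps the common single-commit case backward compatible.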
476 536 @LoginRequired()
477 537 @NotAnonymous()
478 538 @HasRepoPermissionAnyDecorator(
479 539 'repository.read', 'repository.write', 'repository.admin')
480 540 @CSRFRequired()
481 541 @view_config(
482 542 route_name='repo_commit_comment_preview', request_method='POST',
483 543 renderer='string', xhr=True)
484 544 def repo_commit_comment_preview(self):
485 545 # Technically a CSRF token is not needed as no state changes with this
486 546 # call. However, as this is a POST is better to have it, so automated
487 547 # tools don't flag it as potential CSRF.
488 548 # Post is required because the payload could be bigger than the maximum
489 549 # allowed by GET.
490 550
491 551 text = self.request.POST.get('text')
492 552 renderer = self.request.POST.get('renderer') or 'rst'
493 553 if text:
494 554 return h.render(text, renderer=renderer, mentions=True,
495 555 repo_name=self.db_repo_name)
496 556 return ''
497 557
498 558 @LoginRequired()
499 559 @HasRepoPermissionAnyDecorator(
500 560 'repository.read', 'repository.write', 'repository.admin')
501 561 @CSRFRequired()
502 562 @view_config(
503 563 route_name='repo_commit_comment_history_view', request_method='POST',
504 564 renderer='string', xhr=True)
505 565 def repo_commit_comment_history_view(self):
506 566 c = self.load_default_context()
507 567
508 568 comment_history_id = self.request.matchdict['comment_history_id']
509 569 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
510 570 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
511 571
512 572 if is_repo_comment:
513 573 c.comment_history = comment_history
514 574
515 575 rendered_comment = render(
516 576 'rhodecode:templates/changeset/comment_history.mako',
517 577 self._get_template_context(c)
518 578 , self.request)
519 579 return rendered_comment
520 580 else:
521 581 log.warning('No permissions for user %s to show comment_history_id: %s',
522 582 self._rhodecode_db_user, comment_history_id)
523 583 raise HTTPNotFound()
524 584
    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    @view_config(
        route_name='repo_commit_comment_attachment_upload', request_method='POST',
        renderer='json_ext', xhr=True)
    def repo_commit_comment_attachment_upload(self):
        c = self.load_default_context()
        upload_key = 'attachment'

        file_obj = self.request.POST.get(upload_key)

        if file_obj is None:
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': '{} data field is missing'.format(upload_key)}

        if not hasattr(file_obj, 'filename'):
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'filename cannot be read from the data field'}

        filename = file_obj.filename
        file_display_name = filename

        metadata = {
            'user_uploaded': {'username': self._rhodecode_user.username,
                              'user_id': self._rhodecode_user.user_id,
                              'ip': self._rhodecode_user.ip_addr}}

        # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
        allowed_extensions = [
            '.gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
            '.pptx', '.txt', '.xlsx', '.zip']
        max_file_size = 10 * 1024 * 1024  # 10MB, also validated via dropzone.js

        try:
            storage = store_utils.get_file_storage(self.request.registry.settings)
            store_uid, metadata = storage.save_file(
                file_obj.file, filename, extra_metadata=metadata,
                extensions=allowed_extensions, max_filesize=max_file_size)
        except FileNotAllowedException:
            self.request.response.status = 400
            permitted_extensions = ', '.join(allowed_extensions)
            error_msg = 'File `{}` is not allowed. ' \
                        'Only the following extensions are permitted: {}'.format(
                            filename, permitted_extensions)
            return {'store_fid': None,
                    'access_path': None,
                    'error': error_msg}
        except FileOverSizeException:
            self.request.response.status = 400
            limit_mb = h.format_byte_size_binary(max_file_size)
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'File {} exceeds the allowed limit of {}.'.format(
                        filename, limit_mb)}

        try:
            entry = FileStore.create(
                file_uid=store_uid, filename=metadata["filename"],
                file_hash=metadata["sha256"], file_size=metadata["size"],
                file_display_name=file_display_name,
                file_description=u'comment attachment `{}`'.format(safe_unicode(filename)),
                hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
                scope_repo_id=self.db_repo.repo_id
            )
            Session().add(entry)
            Session().commit()
            log.debug('Stored upload in DB as %s', entry)
        except Exception:
            log.exception('Failed to store file %s', filename)
            self.request.response.status = 400
            return {'store_fid': None,
                    'access_path': None,
                    'error': 'File {} failed to store in DB.'.format(filename)}

        Session().commit()

        return {
            'store_fid': store_uid,
            'access_path': h.route_path(
                'download_file', fid=store_uid),
            'fqn_access_path': h.route_url(
                'download_file', fid=store_uid),
            'repo_access_path': h.route_path(
                'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
            'repo_fqn_access_path': h.route_url(
                'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
        }

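The upload view above rejects a file by extension or size before it is handed to storage. A minimal standalone sketch of that validation logic (the `validate_upload` helper is hypothetical, not part of RhodeCode; it mirrors the `allowed_extensions` list and `max_file_size` limit used in the view):

```python
import os

# Mirrors the view's allow-list and 10MB cap (assumed values from above).
ALLOWED_EXTENSIONS = [
    '.gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
    '.pptx', '.txt', '.xlsx', '.zip']
MAX_FILE_SIZE = 10 * 1024 * 1024  # 10MB


def validate_upload(filename, size):
    """Return an error string shaped like the view's JSON 'error' field,
    or None when the file would be accepted."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return 'File `{}` is not allowed. Only the following extensions ' \
               'are permitted: {}'.format(filename, ', '.join(ALLOWED_EXTENSIONS))
    if size > MAX_FILE_SIZE:
        return 'File {} exceeds the allowed limit of {} bytes.'.format(
            filename, MAX_FILE_SIZE)
    return None
```

In the actual view these two failure modes surface as `FileNotAllowedException` and `FileOverSizeException` raised by the storage backend, both translated into HTTP 400 JSON responses.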
    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    @view_config(
        route_name='repo_commit_comment_delete', request_method='POST',
        renderer='json_ext')
    def repo_commit_comment_delete(self):
        commit_id = self.request.matchdict['commit_id']
        comment_id = self.request.matchdict['comment_id']

        comment = ChangesetComment.get_or_404(comment_id)
        if not comment:
            log.debug('Comment with id:%s not found, skipping', comment_id)
            # comment was probably already deleted in another call
            return True

        if comment.immutable:
            # don't allow deleting comments that are immutable
            raise HTTPForbidden()

        is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
        super_admin = h.HasPermissionAny('hg.admin')()
        comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
        is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
        comment_repo_admin = is_repo_admin and is_repo_comment

        if super_admin or comment_owner or comment_repo_admin:
            CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
            Session().commit()
            return True
        else:
            log.warning('No permissions for user %s to delete comment_id: %s',
                        self._rhodecode_db_user, comment_id)
            raise HTTPNotFound()

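The delete view above and the edit view below gate the operation on the same predicate: super admin, comment author, or repo admin of the repository the comment belongs to. A minimal sketch of that check as a pure function (`can_modify_comment` is a hypothetical helper for illustration, not a RhodeCode API):

```python
def can_modify_comment(super_admin, is_repo_admin,
                       author_id, current_user_id,
                       comment_repo_id, current_repo_id):
    """Mirror of the views' permission check: super admins always may,
    authors may act on their own comments, and repo admins may act only
    on comments that belong to the repository they administer."""
    comment_owner = author_id == current_user_id
    comment_repo_admin = is_repo_admin and comment_repo_id == current_repo_id
    return super_admin or comment_owner or comment_repo_admin
```

Note the repo-admin branch requires both the admin permission and that the comment actually belongs to the current repository, so a repo admin cannot reach comments of other repositories through this route.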
    @LoginRequired()
    @NotAnonymous()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @CSRFRequired()
    @view_config(
        route_name='repo_commit_comment_edit', request_method='POST',
        renderer='json_ext')
    def repo_commit_comment_edit(self):
        self.load_default_context()

        commit_id = self.request.matchdict['commit_id']
        comment_id = self.request.matchdict['comment_id']
        comment = ChangesetComment.get_or_404(comment_id)

        if comment.immutable:
            # don't allow editing comments that are immutable
            raise HTTPForbidden()

        is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
        super_admin = h.HasPermissionAny('hg.admin')()
        comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
        is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
        comment_repo_admin = is_repo_admin and is_repo_comment

        if super_admin or comment_owner or comment_repo_admin:
            text = self.request.POST.get('text')
            version = self.request.POST.get('version')
            if text == comment.text:
                log.warning(
                    'Comment(repo): rejecting new version of comment %s '
                    'with an unchanged body', comment_id)
                raise HTTPNotFound()

            if version.isdigit():
                version = int(version)
            else:
                log.warning(
                    'Comment(repo): wrong version type %s %s '
                    'for comment %s', version, type(version), comment_id)
                raise HTTPNotFound()

            try:
                comment_history = CommentsModel().edit(
                    comment_id=comment_id,
                    text=text,
                    auth_user=self._rhodecode_user,
                    version=version,
                )
            except CommentVersionMismatch:
                raise HTTPConflict()

            if not comment_history:
                raise HTTPNotFound()

            if not comment.draft:
                commit = self.db_repo.get_commit(commit_id)
                CommentsModel().trigger_commit_comment_hook(
                    self.db_repo, self._rhodecode_user, 'edit',
                    data={'comment': comment, 'commit': commit})

            Session().commit()
            return {
                'comment_history_id': comment_history.comment_history_id,
                'comment_id': comment.comment_id,
                'comment_version': comment_history.version,
                'comment_author_username': comment_history.author.username,
                'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16),
                'comment_created_on': h.age_component(comment_history.created_on,
                                                      time_is_local=True),
            }
        else:
            log.warning('No permissions for user %s to edit comment_id: %s',
                        self._rhodecode_db_user, comment_id)
            raise HTTPNotFound()

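The edit view applies two guards before creating a new comment version: an unchanged body is rejected, and the submitted `version` must parse as an integer (a mismatch against the stored version later raises `HTTPConflict`). A minimal standalone sketch of those guards (`validate_edit` is a hypothetical helper for illustration):

```python
def validate_edit(new_text, current_text, version):
    """Mirror of the edit view's guards: reject an unchanged body and a
    non-integer version string; return the parsed version otherwise."""
    if new_text == current_text:
        raise ValueError('comment body unchanged')
    if not str(version).isdigit():
        raise ValueError('version must be an integer, got %r' % (version,))
    return int(version)
```

In the view itself both failures surface as `HTTPNotFound` rather than `ValueError`; the sketch only isolates the decision logic.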
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @view_config(
        route_name='repo_commit_data', request_method='GET',
        renderer='json_ext', xhr=True)
    def repo_commit_data(self):
        commit_id = self.request.matchdict['commit_id']
        self.load_default_context()

        try:
            return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
        except CommitDoesNotExistError as e:
            return EmptyCommit(message=str(e))

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @view_config(
        route_name='repo_commit_children', request_method='GET',
        renderer='json_ext', xhr=True)
    def repo_commit_children(self):
        commit_id = self.request.matchdict['commit_id']
        self.load_default_context()

        try:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
            children = commit.children
        except CommitDoesNotExistError:
            children = []

        result = {"results": children}
        return result

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    @view_config(
        route_name='repo_commit_parents', request_method='GET',
        renderer='json_ext')
    def repo_commit_parents(self):
        commit_id = self.request.matchdict['commit_id']
        self.load_default_context()

        try:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
            parents = commit.parents
        except CommitDoesNotExistError:
            parents = []
        result = {"results": parents}
        return result