release: Merge default into stable for release preparation
milka -
r4566:091769de merge stable

The requested changes are too big and content was truncated.

@@ -0,0 +1,89 @@
1 |RCE| 4.23.0 |RNS|
2 ------------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2020-11-20
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Comments: introduced new draft comments.
14
15 * drafts are private to their author
16 * drafts do not trigger any notifications
17 * the sidebar does not display draft comments
18 * they act as placeholders while a longer review is being prepared
19
20 - Comments: when channelstream is enabled, comments are pushed live, so there's no
21 need to refresh the page to see other participants' comments.
22 New comments are marked in the sidebar.
23
24 - Comments: multiple changes to the comment navigation/display logic.
25
26 * the toggle icon is smarter and opens/hides comment windows according to the action, e.g. commenting opens the thread
27 * toggles are more explicit
28 * it's possible to hide/show a single thread using the toggle icon
29 * new UI for showing thread comments
30
31 - Reviewers: new logic for author/commit-author rules.
32 It's now possible to define whether the author or the commit author should be excluded from, or always included in, a review.
33 - Reviewers: having no reviewers now allows a PR to be merged, unless review rules require some.
34 The use case is that a PR can be created without a required review, e.g. just for sharing or for CI checks (a minimal sketch of this rule follows this list).
35 - Pull requests: permanently save the state of sorting columns for pull-request grids.
36 - Commit ranges: enable combined diff compare directly from range selector.
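The Reviewers change above is the main behavioural change in this list. A minimal, hypothetical sketch of the new merge rule (not RhodeCode's actual code) could look like this:

.. code-block:: python

    def review_requirement_met(approvals, required_by_rules):
        # Sketch of the 4.23.0 behaviour: a pull request with no reviewers
        # is mergeable unless a review rule explicitly requires reviewers.
        if required_by_rules == 0:
            return True
        return approvals >= required_by_rules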
37
38
39 General
40 ^^^^^^^
41
42 - Authentication: enable custom names for auth plugins. It's now possible to name the authentication
43 buttons for SAML plugins.
44 - Login: optimized UI for login/register/password reset windows.
45 - Repo mapper: made it more resilient to errors; it's better to execute and skip certain
46 repositories rather than crash the whole mapper.
47 - Markdown: improved styling, and fixed the nl2br extension to only add line breaks on new elements, not inline.
48 - Pull requests: show the pull request version in the my-account and repository PR listing grids.
49 - Archives: allow obtaining archives without the commit short id in the name, for
50 better automation with the obtained artifacts.
51 A new url flag called `?with_hash=1` controls this (see the snippet after this list).
52 - Error document: update info about stored exception retrieval.
53 - Range diff: enable hovercards for commits in range-diff.
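The ``with_hash`` flag above is aimed at automation. A hedged illustration of fetching an archive from a script follows; the server, repository, reference and archive endpoint path are placeholders (assumptions), only the ``with_hash`` query flag comes from this release:

.. code-block:: python

    import requests

    BASE_URL = "https://code.example.com"   # hypothetical RhodeCode server
    REPO = "my-group/my-repo"               # hypothetical repository
    REF = "v4.23.0"                         # tag, branch or commit to archive

    resp = requests.get(
        # assumed endpoint shape; check the archive links of your instance
        "{}/{}/archive/{}.tar.gz".format(BASE_URL, REPO, REF),
        params={"with_hash": 0},  # controls the short commit id in the file name
        timeout=120,
    )
    resp.raise_for_status()
    with open("artifact.tar.gz", "wb") as fh:
        fh.write(resp.content)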
54
55
56 Security
57 ^^^^^^^^
58
59
60
61 Performance
62 ^^^^^^^^^^^
63
64 - Improved repo archive logic: running the archiver is now much faster, since the extra VCSServer
65 communication was removed and the job is delegated to the VCSServer itself.
66 - Improved VCSServer startup times.
67 - Notifications: skip double rendering just to generate the email title/description.
68 Those are now re-used, which makes creating notifications faster.
69 - App: improve logging, and remove DB calls on app startup.
70
71
72 Fixes
73 ^^^^^
74
75 - Login/register: fixed a header width problem on mobile devices.
76 - Exception tracker: don't fail on an empty request, e.g. in the context of a celery app.
77 - Exceptions: improved reporting of unhandled vcsserver exceptions.
78 - Sidebar: fixed refresh of TODOs url.
79 - Remap-rescan: fixes #5636 initial rescan problem.
80 - API: fixed SVN raw diff export. The API method was inconsistent and used different logic;
81 now it shares the same code as the raw diff from the web UI.
82
83
84 Upgrade notes
85 ^^^^^^^^^^^^^
86
87 - Scheduled feature release.
88 Please note that the reviewers logic has changed a bit: it's now possible to create a pull request
89 without any reviewers initially, and such a pull request doesn't need an approval to be merged.
NO CONTENT: new file 100644 (content truncated, too big to display)
NO CONTENT: new file 100644 (content truncated, too big to display)
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 4.22.0
+current_version = 4.23.0
 message = release: Bump version {current_version} to {new_version}

 [bumpversion:file:rhodecode/VERSION]
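For context, the hunk above updates the bumpversion configuration (conventionally ``.bumpversion.cfg``). A hedged sketch of driving the bump from a release script, assuming the pinned ``bumpversion`` 0.5.3 from the package set further below is on the ``PATH``:

.. code-block:: python

    import subprocess

    # Reads the bumpversion config in the current directory, updates
    # `current_version` and rewrites the files it lists (e.g. rhodecode/VERSION).
    subprocess.run(
        ["bumpversion", "--new-version", "4.23.0", "minor"],
        check=True,
    )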
@@ -1,33 +1,28 @@
 [DEFAULT]
 done = false

 [task:bump_version]
 done = true

 [task:rc_tools_pinned]
-done = true

 [task:fixes_on_stable]
-done = true

 [task:pip2nix_generated]
-done = true

 [task:changelog_updated]
-done = true

 [task:generate_api_docs]
-done = true

+[task:updated_translation]
+
 [release]
-state = prepared
-version = 4.22.0
+state = in_progress
+version = 4.23.0

-[task:updated_translation]
-
 [task:generate_js_routes]

 [task:updated_trial_license]

 [task:generate_oss_licenses]

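The hunk above tracks the release checklist in a simple INI file (its real file name is not shown here). A minimal sketch, assuming Python 3's standard ``configparser``, of listing the tasks that are still pending:

.. code-block:: python

    import configparser

    cfg = configparser.ConfigParser()
    cfg.read("release_checklist.ini")  # hypothetical name for the file shown above

    # [DEFAULT] sets done = false, so any task section without its own
    # `done = true` entry is reported as pending.
    pending = [name for name in cfg.sections()
               if name.startswith("task:") and not cfg.getboolean(name, "done")]

    print("release state:", cfg.get("release", "state"))  # "in_progress" after this change
    print("pending tasks:", pending)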
@@ -1,148 +1,149 @@
 .. _rhodecode-release-notes-ref:

 Release Notes
 =============

 |RCE| 4.x Versions
 ------------------

 .. toctree::
    :maxdepth: 1

+   release-notes-4.23.0.rst
    release-notes-4.22.0.rst
    release-notes-4.21.0.rst
    release-notes-4.20.1.rst
    release-notes-4.20.0.rst
    release-notes-4.19.3.rst
    release-notes-4.19.2.rst
    release-notes-4.19.1.rst
    release-notes-4.19.0.rst
    release-notes-4.18.3.rst
    release-notes-4.18.2.rst
    release-notes-4.18.1.rst
    release-notes-4.18.0.rst
    release-notes-4.17.4.rst
    release-notes-4.17.3.rst
    release-notes-4.17.2.rst
    release-notes-4.17.1.rst
    release-notes-4.17.0.rst
    release-notes-4.16.2.rst
    release-notes-4.16.1.rst
    release-notes-4.16.0.rst
    release-notes-4.15.2.rst
    release-notes-4.15.1.rst
    release-notes-4.15.0.rst
    release-notes-4.14.1.rst
    release-notes-4.14.0.rst
    release-notes-4.13.3.rst
    release-notes-4.13.2.rst
    release-notes-4.13.1.rst
    release-notes-4.13.0.rst
    release-notes-4.12.4.rst
    release-notes-4.12.3.rst
    release-notes-4.12.2.rst
    release-notes-4.12.1.rst
    release-notes-4.12.0.rst
    release-notes-4.11.6.rst
    release-notes-4.11.5.rst
    release-notes-4.11.4.rst
    release-notes-4.11.3.rst
    release-notes-4.11.2.rst
    release-notes-4.11.1.rst
    release-notes-4.11.0.rst
    release-notes-4.10.6.rst
    release-notes-4.10.5.rst
    release-notes-4.10.4.rst
    release-notes-4.10.3.rst
    release-notes-4.10.2.rst
    release-notes-4.10.1.rst
    release-notes-4.10.0.rst
    release-notes-4.9.1.rst
    release-notes-4.9.0.rst
    release-notes-4.8.0.rst
    release-notes-4.7.2.rst
    release-notes-4.7.1.rst
    release-notes-4.7.0.rst
    release-notes-4.6.1.rst
    release-notes-4.6.0.rst
    release-notes-4.5.2.rst
    release-notes-4.5.1.rst
    release-notes-4.5.0.rst
    release-notes-4.4.2.rst
    release-notes-4.4.1.rst
    release-notes-4.4.0.rst
    release-notes-4.3.1.rst
    release-notes-4.3.0.rst
    release-notes-4.2.1.rst
    release-notes-4.2.0.rst
    release-notes-4.1.2.rst
    release-notes-4.1.1.rst
    release-notes-4.1.0.rst
    release-notes-4.0.1.rst
    release-notes-4.0.0.rst

 |RCE| 3.x Versions
 ------------------

 .. toctree::
    :maxdepth: 1

    release-notes-3.8.4.rst
    release-notes-3.8.3.rst
    release-notes-3.8.2.rst
    release-notes-3.8.1.rst
    release-notes-3.8.0.rst
    release-notes-3.7.1.rst
    release-notes-3.7.0.rst
    release-notes-3.6.1.rst
    release-notes-3.6.0.rst
    release-notes-3.5.2.rst
    release-notes-3.5.1.rst
    release-notes-3.5.0.rst
    release-notes-3.4.1.rst
    release-notes-3.4.0.rst
    release-notes-3.3.4.rst
    release-notes-3.3.3.rst
    release-notes-3.3.2.rst
    release-notes-3.3.1.rst
    release-notes-3.3.0.rst
    release-notes-3.2.3.rst
    release-notes-3.2.2.rst
    release-notes-3.2.1.rst
    release-notes-3.2.0.rst
    release-notes-3.1.1.rst
    release-notes-3.1.0.rst
    release-notes-3.0.2.rst
    release-notes-3.0.1.rst
    release-notes-3.0.0.rst

 |RCE| 2.x Versions
 ------------------

 .. toctree::
    :maxdepth: 1

    release-notes-2.2.8.rst
    release-notes-2.2.7.rst
    release-notes-2.2.6.rst
    release-notes-2.2.5.rst
    release-notes-2.2.4.rst
    release-notes-2.2.3.rst
    release-notes-2.2.2.rst
    release-notes-2.2.1.rst
    release-notes-2.2.0.rst
    release-notes-2.1.0.rst
    release-notes-2.0.2.rst
    release-notes-2.0.1.rst
    release-notes-2.0.0.rst

 |RCE| 1.x Versions
 ------------------

 .. toctree::
    :maxdepth: 1

    release-notes-1.7.2.rst
    release-notes-1.7.1.rst
    release-notes-1.7.0.rst
    release-notes-1.6.0.rst
@@ -1,2509 +1,2509 @@
1 # Generated by pip2nix 0.8.0.dev1
1 # Generated by pip2nix 0.8.0.dev1
2 # See https://github.com/johbo/pip2nix
2 # See https://github.com/johbo/pip2nix
3
3
4 { pkgs, fetchurl, fetchgit, fetchhg }:
4 { pkgs, fetchurl, fetchgit, fetchhg }:
5
5
6 self: super: {
6 self: super: {
7 "alembic" = super.buildPythonPackage {
7 "alembic" = super.buildPythonPackage {
8 name = "alembic-1.4.2";
8 name = "alembic-1.4.2";
9 doCheck = false;
9 doCheck = false;
10 propagatedBuildInputs = [
10 propagatedBuildInputs = [
11 self."sqlalchemy"
11 self."sqlalchemy"
12 self."mako"
12 self."mako"
13 self."python-editor"
13 self."python-editor"
14 self."python-dateutil"
14 self."python-dateutil"
15 ];
15 ];
16 src = fetchurl {
16 src = fetchurl {
17 url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
17 url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
18 sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
18 sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
19 };
19 };
20 meta = {
20 meta = {
21 license = [ pkgs.lib.licenses.mit ];
21 license = [ pkgs.lib.licenses.mit ];
22 };
22 };
23 };
23 };
24 "amqp" = super.buildPythonPackage {
24 "amqp" = super.buildPythonPackage {
25 name = "amqp-2.5.2";
25 name = "amqp-2.5.2";
26 doCheck = false;
26 doCheck = false;
27 propagatedBuildInputs = [
27 propagatedBuildInputs = [
28 self."vine"
28 self."vine"
29 ];
29 ];
30 src = fetchurl {
30 src = fetchurl {
31 url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
31 url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
32 sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
32 sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
33 };
33 };
34 meta = {
34 meta = {
35 license = [ pkgs.lib.licenses.bsdOriginal ];
35 license = [ pkgs.lib.licenses.bsdOriginal ];
36 };
36 };
37 };
37 };
38 "apispec" = super.buildPythonPackage {
38 "apispec" = super.buildPythonPackage {
39 name = "apispec-1.0.0";
39 name = "apispec-1.0.0";
40 doCheck = false;
40 doCheck = false;
41 propagatedBuildInputs = [
41 propagatedBuildInputs = [
42 self."PyYAML"
42 self."PyYAML"
43 ];
43 ];
44 src = fetchurl {
44 src = fetchurl {
45 url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
45 url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
46 sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
46 sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
47 };
47 };
48 meta = {
48 meta = {
49 license = [ pkgs.lib.licenses.mit ];
49 license = [ pkgs.lib.licenses.mit ];
50 };
50 };
51 };
51 };
52 "appenlight-client" = super.buildPythonPackage {
52 "appenlight-client" = super.buildPythonPackage {
53 name = "appenlight-client-0.6.26";
53 name = "appenlight-client-0.6.26";
54 doCheck = false;
54 doCheck = false;
55 propagatedBuildInputs = [
55 propagatedBuildInputs = [
56 self."webob"
56 self."webob"
57 self."requests"
57 self."requests"
58 self."six"
58 self."six"
59 ];
59 ];
60 src = fetchurl {
60 src = fetchurl {
61 url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
61 url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
62 sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
62 sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
63 };
63 };
64 meta = {
64 meta = {
65 license = [ pkgs.lib.licenses.bsdOriginal ];
65 license = [ pkgs.lib.licenses.bsdOriginal ];
66 };
66 };
67 };
67 };
68 "asn1crypto" = super.buildPythonPackage {
68 "asn1crypto" = super.buildPythonPackage {
69 name = "asn1crypto-0.24.0";
69 name = "asn1crypto-0.24.0";
70 doCheck = false;
70 doCheck = false;
71 src = fetchurl {
71 src = fetchurl {
72 url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
72 url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
73 sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
73 sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
74 };
74 };
75 meta = {
75 meta = {
76 license = [ pkgs.lib.licenses.mit ];
76 license = [ pkgs.lib.licenses.mit ];
77 };
77 };
78 };
78 };
79 "atomicwrites" = super.buildPythonPackage {
79 "atomicwrites" = super.buildPythonPackage {
80 name = "atomicwrites-1.3.0";
80 name = "atomicwrites-1.3.0";
81 doCheck = false;
81 doCheck = false;
82 src = fetchurl {
82 src = fetchurl {
83 url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
83 url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
84 sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
84 sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
85 };
85 };
86 meta = {
86 meta = {
87 license = [ pkgs.lib.licenses.mit ];
87 license = [ pkgs.lib.licenses.mit ];
88 };
88 };
89 };
89 };
90 "attrs" = super.buildPythonPackage {
90 "attrs" = super.buildPythonPackage {
91 name = "attrs-19.3.0";
91 name = "attrs-19.3.0";
92 doCheck = false;
92 doCheck = false;
93 src = fetchurl {
93 src = fetchurl {
94 url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
94 url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
95 sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
95 sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
96 };
96 };
97 meta = {
97 meta = {
98 license = [ pkgs.lib.licenses.mit ];
98 license = [ pkgs.lib.licenses.mit ];
99 };
99 };
100 };
100 };
101 "babel" = super.buildPythonPackage {
101 "babel" = super.buildPythonPackage {
102 name = "babel-1.3";
102 name = "babel-1.3";
103 doCheck = false;
103 doCheck = false;
104 propagatedBuildInputs = [
104 propagatedBuildInputs = [
105 self."pytz"
105 self."pytz"
106 ];
106 ];
107 src = fetchurl {
107 src = fetchurl {
108 url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
108 url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
109 sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
109 sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
110 };
110 };
111 meta = {
111 meta = {
112 license = [ pkgs.lib.licenses.bsdOriginal ];
112 license = [ pkgs.lib.licenses.bsdOriginal ];
113 };
113 };
114 };
114 };
115 "backports.shutil-get-terminal-size" = super.buildPythonPackage {
115 "backports.shutil-get-terminal-size" = super.buildPythonPackage {
116 name = "backports.shutil-get-terminal-size-1.0.0";
116 name = "backports.shutil-get-terminal-size-1.0.0";
117 doCheck = false;
117 doCheck = false;
118 src = fetchurl {
118 src = fetchurl {
119 url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
119 url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
120 sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
120 sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
121 };
121 };
122 meta = {
122 meta = {
123 license = [ pkgs.lib.licenses.mit ];
123 license = [ pkgs.lib.licenses.mit ];
124 };
124 };
125 };
125 };
126 "beaker" = super.buildPythonPackage {
126 "beaker" = super.buildPythonPackage {
127 name = "beaker-1.9.1";
127 name = "beaker-1.9.1";
128 doCheck = false;
128 doCheck = false;
129 propagatedBuildInputs = [
129 propagatedBuildInputs = [
130 self."funcsigs"
130 self."funcsigs"
131 ];
131 ];
132 src = fetchurl {
132 src = fetchurl {
133 url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
133 url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
134 sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
134 sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
135 };
135 };
136 meta = {
136 meta = {
137 license = [ pkgs.lib.licenses.bsdOriginal ];
137 license = [ pkgs.lib.licenses.bsdOriginal ];
138 };
138 };
139 };
139 };
140 "beautifulsoup4" = super.buildPythonPackage {
140 "beautifulsoup4" = super.buildPythonPackage {
141 name = "beautifulsoup4-4.6.3";
141 name = "beautifulsoup4-4.6.3";
142 doCheck = false;
142 doCheck = false;
143 src = fetchurl {
143 src = fetchurl {
144 url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
144 url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
145 sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
145 sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
146 };
146 };
147 meta = {
147 meta = {
148 license = [ pkgs.lib.licenses.mit ];
148 license = [ pkgs.lib.licenses.mit ];
149 };
149 };
150 };
150 };
151 "billiard" = super.buildPythonPackage {
151 "billiard" = super.buildPythonPackage {
152 name = "billiard-3.6.1.0";
152 name = "billiard-3.6.1.0";
153 doCheck = false;
153 doCheck = false;
154 src = fetchurl {
154 src = fetchurl {
155 url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
155 url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
156 sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
156 sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
157 };
157 };
158 meta = {
158 meta = {
159 license = [ pkgs.lib.licenses.bsdOriginal ];
159 license = [ pkgs.lib.licenses.bsdOriginal ];
160 };
160 };
161 };
161 };
162 "bleach" = super.buildPythonPackage {
162 "bleach" = super.buildPythonPackage {
163 name = "bleach-3.1.3";
163 name = "bleach-3.1.3";
164 doCheck = false;
164 doCheck = false;
165 propagatedBuildInputs = [
165 propagatedBuildInputs = [
166 self."six"
166 self."six"
167 self."webencodings"
167 self."webencodings"
168 ];
168 ];
169 src = fetchurl {
169 src = fetchurl {
170 url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
170 url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
171 sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
171 sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
172 };
172 };
173 meta = {
173 meta = {
174 license = [ pkgs.lib.licenses.asl20 ];
174 license = [ pkgs.lib.licenses.asl20 ];
175 };
175 };
176 };
176 };
177 "bumpversion" = super.buildPythonPackage {
177 "bumpversion" = super.buildPythonPackage {
178 name = "bumpversion-0.5.3";
178 name = "bumpversion-0.5.3";
179 doCheck = false;
179 doCheck = false;
180 src = fetchurl {
180 src = fetchurl {
181 url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
181 url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
182 sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
182 sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
183 };
183 };
184 meta = {
184 meta = {
185 license = [ pkgs.lib.licenses.mit ];
185 license = [ pkgs.lib.licenses.mit ];
186 };
186 };
187 };
187 };
188 "cachetools" = super.buildPythonPackage {
188 "cachetools" = super.buildPythonPackage {
189 name = "cachetools-3.1.1";
189 name = "cachetools-3.1.1";
190 doCheck = false;
190 doCheck = false;
191 src = fetchurl {
191 src = fetchurl {
192 url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
192 url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
193 sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
193 sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
194 };
194 };
195 meta = {
195 meta = {
196 license = [ pkgs.lib.licenses.mit ];
196 license = [ pkgs.lib.licenses.mit ];
197 };
197 };
198 };
198 };
199 "celery" = super.buildPythonPackage {
199 "celery" = super.buildPythonPackage {
200 name = "celery-4.3.0";
200 name = "celery-4.3.0";
201 doCheck = false;
201 doCheck = false;
202 propagatedBuildInputs = [
202 propagatedBuildInputs = [
203 self."pytz"
203 self."pytz"
204 self."billiard"
204 self."billiard"
205 self."kombu"
205 self."kombu"
206 self."vine"
206 self."vine"
207 ];
207 ];
208 src = fetchurl {
208 src = fetchurl {
209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
211 };
211 };
212 meta = {
212 meta = {
213 license = [ pkgs.lib.licenses.bsdOriginal ];
213 license = [ pkgs.lib.licenses.bsdOriginal ];
214 };
214 };
215 };
215 };
216 "certifi" = super.buildPythonPackage {
216 "certifi" = super.buildPythonPackage {
217 name = "certifi-2020.4.5.1";
217 name = "certifi-2020.4.5.1";
218 doCheck = false;
218 doCheck = false;
219 src = fetchurl {
219 src = fetchurl {
220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
222 };
222 };
223 meta = {
223 meta = {
224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
225 };
225 };
226 };
226 };
227 "cffi" = super.buildPythonPackage {
227 "cffi" = super.buildPythonPackage {
228 name = "cffi-1.12.3";
228 name = "cffi-1.12.3";
229 doCheck = false;
229 doCheck = false;
230 propagatedBuildInputs = [
230 propagatedBuildInputs = [
231 self."pycparser"
231 self."pycparser"
232 ];
232 ];
233 src = fetchurl {
233 src = fetchurl {
234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
236 };
236 };
237 meta = {
237 meta = {
238 license = [ pkgs.lib.licenses.mit ];
238 license = [ pkgs.lib.licenses.mit ];
239 };
239 };
240 };
240 };
241 "chameleon" = super.buildPythonPackage {
241 "chameleon" = super.buildPythonPackage {
242 name = "chameleon-2.24";
242 name = "chameleon-2.24";
243 doCheck = false;
243 doCheck = false;
244 src = fetchurl {
244 src = fetchurl {
245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
247 };
247 };
248 meta = {
248 meta = {
249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
250 };
250 };
251 };
251 };
252 "channelstream" = super.buildPythonPackage {
252 "channelstream" = super.buildPythonPackage {
253 name = "channelstream-0.6.14";
253 name = "channelstream-0.6.14";
254 doCheck = false;
254 doCheck = false;
255 propagatedBuildInputs = [
255 propagatedBuildInputs = [
256 self."gevent"
256 self."gevent"
257 self."ws4py"
257 self."ws4py"
258 self."marshmallow"
258 self."marshmallow"
259 self."python-dateutil"
259 self."python-dateutil"
260 self."pyramid"
260 self."pyramid"
261 self."pyramid-jinja2"
261 self."pyramid-jinja2"
262 self."pyramid-apispec"
262 self."pyramid-apispec"
263 self."itsdangerous"
263 self."itsdangerous"
264 self."requests"
264 self."requests"
265 self."six"
265 self."six"
266 ];
266 ];
267 src = fetchurl {
267 src = fetchurl {
268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
270 };
270 };
271 meta = {
271 meta = {
272 license = [ pkgs.lib.licenses.bsdOriginal ];
272 license = [ pkgs.lib.licenses.bsdOriginal ];
273 };
273 };
274 };
274 };
275 "chardet" = super.buildPythonPackage {
275 "chardet" = super.buildPythonPackage {
276 name = "chardet-3.0.4";
276 name = "chardet-3.0.4";
277 doCheck = false;
277 doCheck = false;
278 src = fetchurl {
278 src = fetchurl {
279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
281 };
281 };
282 meta = {
282 meta = {
283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
284 };
284 };
285 };
285 };
286 "click" = super.buildPythonPackage {
286 "click" = super.buildPythonPackage {
287 name = "click-7.0";
287 name = "click-7.0";
288 doCheck = false;
288 doCheck = false;
289 src = fetchurl {
289 src = fetchurl {
290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
292 };
292 };
293 meta = {
293 meta = {
294 license = [ pkgs.lib.licenses.bsdOriginal ];
294 license = [ pkgs.lib.licenses.bsdOriginal ];
295 };
295 };
296 };
296 };
297 "colander" = super.buildPythonPackage {
297 "colander" = super.buildPythonPackage {
298 name = "colander-1.7.0";
298 name = "colander-1.7.0";
299 doCheck = false;
299 doCheck = false;
300 propagatedBuildInputs = [
300 propagatedBuildInputs = [
301 self."translationstring"
301 self."translationstring"
302 self."iso8601"
302 self."iso8601"
303 self."enum34"
303 self."enum34"
304 ];
304 ];
305 src = fetchurl {
305 src = fetchurl {
306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
308 };
308 };
309 meta = {
309 meta = {
310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 };
311 };
312 };
312 };
313 "configobj" = super.buildPythonPackage {
313 "configobj" = super.buildPythonPackage {
314 name = "configobj-5.0.6";
314 name = "configobj-5.0.6";
315 doCheck = false;
315 doCheck = false;
316 propagatedBuildInputs = [
316 propagatedBuildInputs = [
317 self."six"
317 self."six"
318 ];
318 ];
319 src = fetchurl {
319 src = fetchurl {
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 };
322 };
323 meta = {
323 meta = {
324 license = [ pkgs.lib.licenses.bsdOriginal ];
324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 };
325 };
326 };
326 };
327 "configparser" = super.buildPythonPackage {
327 "configparser" = super.buildPythonPackage {
328 name = "configparser-4.0.2";
328 name = "configparser-4.0.2";
329 doCheck = false;
329 doCheck = false;
330 src = fetchurl {
330 src = fetchurl {
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 };
333 };
334 meta = {
334 meta = {
335 license = [ pkgs.lib.licenses.mit ];
335 license = [ pkgs.lib.licenses.mit ];
336 };
336 };
337 };
337 };
338 "contextlib2" = super.buildPythonPackage {
338 "contextlib2" = super.buildPythonPackage {
339 name = "contextlib2-0.6.0.post1";
339 name = "contextlib2-0.6.0.post1";
340 doCheck = false;
340 doCheck = false;
341 src = fetchurl {
341 src = fetchurl {
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 };
344 };
345 meta = {
345 meta = {
346 license = [ pkgs.lib.licenses.psfl ];
346 license = [ pkgs.lib.licenses.psfl ];
347 };
347 };
348 };
348 };
349 "cov-core" = super.buildPythonPackage {
349 "cov-core" = super.buildPythonPackage {
350 name = "cov-core-1.15.0";
350 name = "cov-core-1.15.0";
351 doCheck = false;
351 doCheck = false;
352 propagatedBuildInputs = [
352 propagatedBuildInputs = [
353 self."coverage"
353 self."coverage"
354 ];
354 ];
355 src = fetchurl {
355 src = fetchurl {
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 };
358 };
359 meta = {
359 meta = {
360 license = [ pkgs.lib.licenses.mit ];
360 license = [ pkgs.lib.licenses.mit ];
361 };
361 };
362 };
362 };
363 "coverage" = super.buildPythonPackage {
363 "coverage" = super.buildPythonPackage {
364 name = "coverage-4.5.4";
364 name = "coverage-4.5.4";
365 doCheck = false;
365 doCheck = false;
366 src = fetchurl {
366 src = fetchurl {
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 };
369 };
370 meta = {
370 meta = {
371 license = [ pkgs.lib.licenses.asl20 ];
371 license = [ pkgs.lib.licenses.asl20 ];
372 };
372 };
373 };
373 };
374 "cryptography" = super.buildPythonPackage {
374 "cryptography" = super.buildPythonPackage {
375 name = "cryptography-2.6.1";
375 name = "cryptography-2.6.1";
376 doCheck = false;
376 doCheck = false;
377 propagatedBuildInputs = [
377 propagatedBuildInputs = [
378 self."asn1crypto"
378 self."asn1crypto"
379 self."six"
379 self."six"
380 self."cffi"
380 self."cffi"
381 self."enum34"
381 self."enum34"
382 self."ipaddress"
382 self."ipaddress"
383 ];
383 ];
384 src = fetchurl {
384 src = fetchurl {
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 };
387 };
388 meta = {
388 meta = {
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 };
390 };
391 };
391 };
392 "cssselect" = super.buildPythonPackage {
392 "cssselect" = super.buildPythonPackage {
393 name = "cssselect-1.0.3";
393 name = "cssselect-1.0.3";
394 doCheck = false;
394 doCheck = false;
395 src = fetchurl {
395 src = fetchurl {
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 };
398 };
399 meta = {
399 meta = {
400 license = [ pkgs.lib.licenses.bsdOriginal ];
400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 };
401 };
402 };
402 };
403 "cssutils" = super.buildPythonPackage {
403 "cssutils" = super.buildPythonPackage {
404 name = "cssutils-1.0.2";
404 name = "cssutils-1.0.2";
405 doCheck = false;
405 doCheck = false;
406 src = fetchurl {
406 src = fetchurl {
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 };
409 };
410 meta = {
410 meta = {
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 };
412 };
413 };
413 };
414 "decorator" = super.buildPythonPackage {
414 "decorator" = super.buildPythonPackage {
415 name = "decorator-4.1.2";
415 name = "decorator-4.1.2";
416 doCheck = false;
416 doCheck = false;
417 src = fetchurl {
417 src = fetchurl {
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 };
420 };
421 meta = {
421 meta = {
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 };
423 };
424 };
424 };
425 "deform" = super.buildPythonPackage {
425 "deform" = super.buildPythonPackage {
426 name = "deform-2.0.8";
426 name = "deform-2.0.8";
427 doCheck = false;
427 doCheck = false;
428 propagatedBuildInputs = [
428 propagatedBuildInputs = [
429 self."chameleon"
429 self."chameleon"
430 self."colander"
430 self."colander"
431 self."iso8601"
431 self."iso8601"
432 self."peppercorn"
432 self."peppercorn"
433 self."translationstring"
433 self."translationstring"
434 self."zope.deprecation"
434 self."zope.deprecation"
435 ];
435 ];
436 src = fetchurl {
436 src = fetchurl {
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 };
439 };
440 meta = {
440 meta = {
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 };
442 };
443 };
443 };
444 "defusedxml" = super.buildPythonPackage {
444 "defusedxml" = super.buildPythonPackage {
445 name = "defusedxml-0.6.0";
445 name = "defusedxml-0.6.0";
446 doCheck = false;
446 doCheck = false;
447 src = fetchurl {
447 src = fetchurl {
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 };
450 };
451 meta = {
451 meta = {
452 license = [ pkgs.lib.licenses.psfl ];
452 license = [ pkgs.lib.licenses.psfl ];
453 };
453 };
454 };
454 };
455 "dm.xmlsec.binding" = super.buildPythonPackage {
455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 name = "dm.xmlsec.binding-1.3.7";
456 name = "dm.xmlsec.binding-1.3.7";
457 doCheck = false;
457 doCheck = false;
458 propagatedBuildInputs = [
458 propagatedBuildInputs = [
459 self."setuptools"
459 self."setuptools"
460 self."lxml"
460 self."lxml"
461 ];
461 ];
462 src = fetchurl {
462 src = fetchurl {
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 };
465 };
466 meta = {
466 meta = {
467 license = [ pkgs.lib.licenses.bsdOriginal ];
467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 };
468 };
469 };
469 };
470 "docutils" = super.buildPythonPackage {
470 "docutils" = super.buildPythonPackage {
471 name = "docutils-0.16";
471 name = "docutils-0.16";
472 doCheck = false;
472 doCheck = false;
473 src = fetchurl {
473 src = fetchurl {
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 };
476 };
477 meta = {
477 meta = {
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 };
479 };
480 };
480 };
481 "dogpile.cache" = super.buildPythonPackage {
481 "dogpile.cache" = super.buildPythonPackage {
482 name = "dogpile.cache-0.9.0";
482 name = "dogpile.cache-0.9.0";
483 doCheck = false;
483 doCheck = false;
484 propagatedBuildInputs = [
484 propagatedBuildInputs = [
485 self."decorator"
485 self."decorator"
486 ];
486 ];
487 src = fetchurl {
487 src = fetchurl {
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 };
490 };
491 meta = {
491 meta = {
492 license = [ pkgs.lib.licenses.bsdOriginal ];
492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 };
493 };
494 };
494 };
495 "dogpile.core" = super.buildPythonPackage {
495 "dogpile.core" = super.buildPythonPackage {
496 name = "dogpile.core-0.4.1";
496 name = "dogpile.core-0.4.1";
497 doCheck = false;
497 doCheck = false;
498 src = fetchurl {
498 src = fetchurl {
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 };
501 };
502 meta = {
502 meta = {
503 license = [ pkgs.lib.licenses.bsdOriginal ];
503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 };
504 };
505 };
505 };
506 "ecdsa" = super.buildPythonPackage {
506 "ecdsa" = super.buildPythonPackage {
507 name = "ecdsa-0.13.2";
507 name = "ecdsa-0.13.2";
508 doCheck = false;
508 doCheck = false;
509 src = fetchurl {
509 src = fetchurl {
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 };
512 };
513 meta = {
513 meta = {
514 license = [ pkgs.lib.licenses.mit ];
514 license = [ pkgs.lib.licenses.mit ];
515 };
515 };
516 };
516 };
517 "elasticsearch" = super.buildPythonPackage {
517 "elasticsearch" = super.buildPythonPackage {
518 name = "elasticsearch-6.3.1";
518 name = "elasticsearch-6.3.1";
519 doCheck = false;
519 doCheck = false;
520 propagatedBuildInputs = [
520 propagatedBuildInputs = [
521 self."urllib3"
521 self."urllib3"
522 ];
522 ];
523 src = fetchurl {
523 src = fetchurl {
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 };
526 };
527 meta = {
527 meta = {
528 license = [ pkgs.lib.licenses.asl20 ];
528 license = [ pkgs.lib.licenses.asl20 ];
529 };
529 };
530 };
530 };
531 "elasticsearch-dsl" = super.buildPythonPackage {
531 "elasticsearch-dsl" = super.buildPythonPackage {
532 name = "elasticsearch-dsl-6.3.1";
532 name = "elasticsearch-dsl-6.3.1";
533 doCheck = false;
533 doCheck = false;
534 propagatedBuildInputs = [
534 propagatedBuildInputs = [
535 self."six"
535 self."six"
536 self."python-dateutil"
536 self."python-dateutil"
537 self."elasticsearch"
537 self."elasticsearch"
538 self."ipaddress"
538 self."ipaddress"
539 ];
539 ];
540 src = fetchurl {
540 src = fetchurl {
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
543 };
543 };
544 meta = {
544 meta = {
545 license = [ pkgs.lib.licenses.asl20 ];
545 license = [ pkgs.lib.licenses.asl20 ];
546 };
546 };
547 };
547 };
548 "elasticsearch1" = super.buildPythonPackage {
548 "elasticsearch1" = super.buildPythonPackage {
549 name = "elasticsearch1-1.10.0";
549 name = "elasticsearch1-1.10.0";
550 doCheck = false;
550 doCheck = false;
551 propagatedBuildInputs = [
551 propagatedBuildInputs = [
552 self."urllib3"
552 self."urllib3"
553 ];
553 ];
554 src = fetchurl {
554 src = fetchurl {
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
557 };
557 };
558 meta = {
558 meta = {
559 license = [ pkgs.lib.licenses.asl20 ];
559 license = [ pkgs.lib.licenses.asl20 ];
560 };
560 };
561 };
561 };
562 "elasticsearch1-dsl" = super.buildPythonPackage {
562 "elasticsearch1-dsl" = super.buildPythonPackage {
563 name = "elasticsearch1-dsl-0.0.12";
563 name = "elasticsearch1-dsl-0.0.12";
564 doCheck = false;
564 doCheck = false;
565 propagatedBuildInputs = [
565 propagatedBuildInputs = [
566 self."six"
566 self."six"
567 self."python-dateutil"
567 self."python-dateutil"
568 self."elasticsearch1"
568 self."elasticsearch1"
569 ];
569 ];
570 src = fetchurl {
570 src = fetchurl {
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
573 };
573 };
574 meta = {
574 meta = {
575 license = [ pkgs.lib.licenses.asl20 ];
575 license = [ pkgs.lib.licenses.asl20 ];
576 };
576 };
577 };
577 };
578 "elasticsearch2" = super.buildPythonPackage {
578 "elasticsearch2" = super.buildPythonPackage {
579 name = "elasticsearch2-2.5.1";
579 name = "elasticsearch2-2.5.1";
580 doCheck = false;
580 doCheck = false;
581 propagatedBuildInputs = [
581 propagatedBuildInputs = [
582 self."urllib3"
582 self."urllib3"
583 ];
583 ];
584 src = fetchurl {
584 src = fetchurl {
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
586 sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
587 };
587 };
588 meta = {
588 meta = {
589 license = [ pkgs.lib.licenses.asl20 ];
589 license = [ pkgs.lib.licenses.asl20 ];
590 };
590 };
591 };
591 };
592 "entrypoints" = super.buildPythonPackage {
592 "entrypoints" = super.buildPythonPackage {
593 name = "entrypoints-0.2.2";
593 name = "entrypoints-0.2.2";
594 doCheck = false;
594 doCheck = false;
595 propagatedBuildInputs = [
595 propagatedBuildInputs = [
596 self."configparser"
596 self."configparser"
597 ];
597 ];
598 src = fetchurl {
598 src = fetchurl {
599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
599 url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
600 sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
601 };
601 };
602 meta = {
602 meta = {
603 license = [ pkgs.lib.licenses.mit ];
603 license = [ pkgs.lib.licenses.mit ];
604 };
604 };
605 };
605 };
606 "enum34" = super.buildPythonPackage {
606 "enum34" = super.buildPythonPackage {
607 name = "enum34-1.1.10";
607 name = "enum34-1.1.10";
608 doCheck = false;
608 doCheck = false;
609 src = fetchurl {
609 src = fetchurl {
610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
610 url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
611 sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
612 };
612 };
613 meta = {
613 meta = {
614 license = [ pkgs.lib.licenses.bsdOriginal ];
614 license = [ pkgs.lib.licenses.bsdOriginal ];
615 };
615 };
616 };
616 };
617 "formencode" = super.buildPythonPackage {
617 "formencode" = super.buildPythonPackage {
618 name = "formencode-1.2.4";
618 name = "formencode-1.2.4";
619 doCheck = false;
619 doCheck = false;
620 src = fetchurl {
620 src = fetchurl {
621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
621 url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
622 sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
623 };
623 };
624 meta = {
624 meta = {
625 license = [ pkgs.lib.licenses.psfl ];
625 license = [ pkgs.lib.licenses.psfl ];
626 };
626 };
627 };
627 };
628 "funcsigs" = super.buildPythonPackage {
628 "funcsigs" = super.buildPythonPackage {
629 name = "funcsigs-1.0.2";
629 name = "funcsigs-1.0.2";
630 doCheck = false;
630 doCheck = false;
631 src = fetchurl {
631 src = fetchurl {
632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
632 url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
633 sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
634 };
634 };
635 meta = {
635 meta = {
636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
636 license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
637 };
637 };
638 };
638 };
639 "functools32" = super.buildPythonPackage {
639 "functools32" = super.buildPythonPackage {
640 name = "functools32-3.2.3.post2";
640 name = "functools32-3.2.3.post2";
641 doCheck = false;
641 doCheck = false;
642 src = fetchurl {
642 src = fetchurl {
643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
643 url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
644 sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
645 };
645 };
646 meta = {
646 meta = {
647 license = [ pkgs.lib.licenses.psfl ];
647 license = [ pkgs.lib.licenses.psfl ];
648 };
648 };
649 };
649 };
650 "future" = super.buildPythonPackage {
650 "future" = super.buildPythonPackage {
651 name = "future-0.14.3";
651 name = "future-0.14.3";
652 doCheck = false;
652 doCheck = false;
653 src = fetchurl {
653 src = fetchurl {
654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
654 url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
655 sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
656 };
656 };
657 meta = {
657 meta = {
658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
658 license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
659 };
659 };
660 };
660 };
661 "futures" = super.buildPythonPackage {
661 "futures" = super.buildPythonPackage {
662 name = "futures-3.0.2";
662 name = "futures-3.0.2";
663 doCheck = false;
663 doCheck = false;
664 src = fetchurl {
664 src = fetchurl {
665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
665 url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
666 sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
667 };
667 };
668 meta = {
668 meta = {
669 license = [ pkgs.lib.licenses.bsdOriginal ];
669 license = [ pkgs.lib.licenses.bsdOriginal ];
670 };
670 };
671 };
671 };
672 "gevent" = super.buildPythonPackage {
672 "gevent" = super.buildPythonPackage {
673 name = "gevent-1.5.0";
673 name = "gevent-1.5.0";
674 doCheck = false;
674 doCheck = false;
675 propagatedBuildInputs = [
675 propagatedBuildInputs = [
676 self."greenlet"
676 self."greenlet"
677 ];
677 ];
678 src = fetchurl {
678 src = fetchurl {
679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
679 url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
680 sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
681 };
681 };
682 meta = {
682 meta = {
683 license = [ pkgs.lib.licenses.mit ];
683 license = [ pkgs.lib.licenses.mit ];
684 };
684 };
685 };
685 };
686 "gnureadline" = super.buildPythonPackage {
686 "gnureadline" = super.buildPythonPackage {
687 name = "gnureadline-6.3.8";
687 name = "gnureadline-6.3.8";
688 doCheck = false;
688 doCheck = false;
689 src = fetchurl {
689 src = fetchurl {
690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
690 url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
691 sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
692 };
692 };
693 meta = {
693 meta = {
694 license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
694 license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
695 };
695 };
696 };
696 };
697 "gprof2dot" = super.buildPythonPackage {
697 "gprof2dot" = super.buildPythonPackage {
698 name = "gprof2dot-2017.9.19";
698 name = "gprof2dot-2017.9.19";
699 doCheck = false;
699 doCheck = false;
700 src = fetchurl {
700 src = fetchurl {
701 url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
701 url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
702 sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
702 sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
703 };
703 };
704 meta = {
704 meta = {
705 license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
705 license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
706 };
706 };
707 };
707 };
708 "greenlet" = super.buildPythonPackage {
708 "greenlet" = super.buildPythonPackage {
709 name = "greenlet-0.4.15";
709 name = "greenlet-0.4.15";
710 doCheck = false;
710 doCheck = false;
711 src = fetchurl {
711 src = fetchurl {
712 url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
712 url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
713 sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
713 sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
714 };
714 };
715 meta = {
715 meta = {
716 license = [ pkgs.lib.licenses.mit ];
716 license = [ pkgs.lib.licenses.mit ];
717 };
717 };
718 };
718 };
719 "gunicorn" = super.buildPythonPackage {
719 "gunicorn" = super.buildPythonPackage {
720 name = "gunicorn-19.9.0";
720 name = "gunicorn-19.9.0";
721 doCheck = false;
721 doCheck = false;
722 src = fetchurl {
722 src = fetchurl {
723 url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
723 url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
724 sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
724 sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
725 };
725 };
726 meta = {
726 meta = {
727 license = [ pkgs.lib.licenses.mit ];
727 license = [ pkgs.lib.licenses.mit ];
728 };
728 };
729 };
729 };
730 "hupper" = super.buildPythonPackage {
730 "hupper" = super.buildPythonPackage {
731 name = "hupper-1.10.2";
731 name = "hupper-1.10.2";
732 doCheck = false;
732 doCheck = false;
733 src = fetchurl {
733 src = fetchurl {
734 url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
734 url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
735 sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
735 sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
736 };
736 };
737 meta = {
737 meta = {
738 license = [ pkgs.lib.licenses.mit ];
738 license = [ pkgs.lib.licenses.mit ];
739 };
739 };
740 };
740 };
741 "idna" = super.buildPythonPackage {
741 "idna" = super.buildPythonPackage {
742 name = "idna-2.8";
742 name = "idna-2.8";
743 doCheck = false;
743 doCheck = false;
744 src = fetchurl {
744 src = fetchurl {
745 url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
745 url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
746 sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
746 sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
747 };
747 };
748 meta = {
748 meta = {
749 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
749 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
750 };
750 };
751 };
751 };
752 "importlib-metadata" = super.buildPythonPackage {
752 "importlib-metadata" = super.buildPythonPackage {
753 name = "importlib-metadata-1.6.0";
753 name = "importlib-metadata-1.6.0";
754 doCheck = false;
754 doCheck = false;
755 propagatedBuildInputs = [
755 propagatedBuildInputs = [
756 self."zipp"
756 self."zipp"
757 self."pathlib2"
757 self."pathlib2"
758 self."contextlib2"
758 self."contextlib2"
759 self."configparser"
759 self."configparser"
760 ];
760 ];
761 src = fetchurl {
761 src = fetchurl {
762 url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
762 url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
763 sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
763 sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
764 };
764 };
765 meta = {
765 meta = {
766 license = [ pkgs.lib.licenses.asl20 ];
766 license = [ pkgs.lib.licenses.asl20 ];
767 };
767 };
768 };
768 };
769 "infrae.cache" = super.buildPythonPackage {
769 "infrae.cache" = super.buildPythonPackage {
770 name = "infrae.cache-1.0.1";
770 name = "infrae.cache-1.0.1";
771 doCheck = false;
771 doCheck = false;
772 propagatedBuildInputs = [
772 propagatedBuildInputs = [
773 self."beaker"
773 self."beaker"
774 self."repoze.lru"
774 self."repoze.lru"
775 ];
775 ];
776 src = fetchurl {
776 src = fetchurl {
777 url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
777 url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
778 sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
778 sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
779 };
779 };
780 meta = {
780 meta = {
781 license = [ pkgs.lib.licenses.zpl21 ];
781 license = [ pkgs.lib.licenses.zpl21 ];
782 };
782 };
783 };
783 };
784 "invoke" = super.buildPythonPackage {
784 "invoke" = super.buildPythonPackage {
785 name = "invoke-0.13.0";
785 name = "invoke-0.13.0";
786 doCheck = false;
786 doCheck = false;
787 src = fetchurl {
787 src = fetchurl {
788 url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
788 url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
789 sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
789 sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
790 };
790 };
791 meta = {
791 meta = {
792 license = [ pkgs.lib.licenses.bsdOriginal ];
792 license = [ pkgs.lib.licenses.bsdOriginal ];
793 };
793 };
794 };
794 };
795 "ipaddress" = super.buildPythonPackage {
795 "ipaddress" = super.buildPythonPackage {
796 name = "ipaddress-1.0.23";
796 name = "ipaddress-1.0.23";
797 doCheck = false;
797 doCheck = false;
798 src = fetchurl {
798 src = fetchurl {
799 url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
799 url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
800 sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
800 sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
801 };
801 };
802 meta = {
802 meta = {
803 license = [ pkgs.lib.licenses.psfl ];
803 license = [ pkgs.lib.licenses.psfl ];
804 };
804 };
805 };
805 };
806 "ipdb" = super.buildPythonPackage {
806 "ipdb" = super.buildPythonPackage {
807 name = "ipdb-0.13.2";
807 name = "ipdb-0.13.2";
808 doCheck = false;
808 doCheck = false;
809 propagatedBuildInputs = [
809 propagatedBuildInputs = [
810 self."setuptools"
810 self."setuptools"
811 self."ipython"
811 self."ipython"
812 ];
812 ];
813 src = fetchurl {
813 src = fetchurl {
814 url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
814 url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
815 sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
815 sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
816 };
816 };
817 meta = {
817 meta = {
818 license = [ pkgs.lib.licenses.bsdOriginal ];
818 license = [ pkgs.lib.licenses.bsdOriginal ];
819 };
819 };
820 };
820 };
821 "ipython" = super.buildPythonPackage {
821 "ipython" = super.buildPythonPackage {
822 name = "ipython-5.1.0";
822 name = "ipython-5.1.0";
823 doCheck = false;
823 doCheck = false;
824 propagatedBuildInputs = [
824 propagatedBuildInputs = [
825 self."setuptools"
825 self."setuptools"
826 self."decorator"
826 self."decorator"
827 self."pickleshare"
827 self."pickleshare"
828 self."simplegeneric"
828 self."simplegeneric"
829 self."traitlets"
829 self."traitlets"
830 self."prompt-toolkit"
830 self."prompt-toolkit"
831 self."pygments"
831 self."pygments"
832 self."pexpect"
832 self."pexpect"
833 self."backports.shutil-get-terminal-size"
833 self."backports.shutil-get-terminal-size"
834 self."pathlib2"
834 self."pathlib2"
835 self."pexpect"
835 self."pexpect"
836 ];
836 ];
837 src = fetchurl {
837 src = fetchurl {
838 url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
838 url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
839 sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
839 sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
840 };
840 };
841 meta = {
841 meta = {
842 license = [ pkgs.lib.licenses.bsdOriginal ];
842 license = [ pkgs.lib.licenses.bsdOriginal ];
843 };
843 };
844 };
844 };
845 "ipython-genutils" = super.buildPythonPackage {
845 "ipython-genutils" = super.buildPythonPackage {
846 name = "ipython-genutils-0.2.0";
846 name = "ipython-genutils-0.2.0";
847 doCheck = false;
847 doCheck = false;
848 src = fetchurl {
848 src = fetchurl {
849 url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
849 url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
850 sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
850 sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
851 };
851 };
852 meta = {
852 meta = {
853 license = [ pkgs.lib.licenses.bsdOriginal ];
853 license = [ pkgs.lib.licenses.bsdOriginal ];
854 };
854 };
855 };
855 };
856 "iso8601" = super.buildPythonPackage {
856 "iso8601" = super.buildPythonPackage {
857 name = "iso8601-0.1.12";
857 name = "iso8601-0.1.12";
858 doCheck = false;
858 doCheck = false;
859 src = fetchurl {
859 src = fetchurl {
860 url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
860 url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
861 sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
861 sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
862 };
862 };
863 meta = {
863 meta = {
864 license = [ pkgs.lib.licenses.mit ];
864 license = [ pkgs.lib.licenses.mit ];
865 };
865 };
866 };
866 };
867 "isodate" = super.buildPythonPackage {
867 "isodate" = super.buildPythonPackage {
868 name = "isodate-0.6.0";
868 name = "isodate-0.6.0";
869 doCheck = false;
869 doCheck = false;
870 propagatedBuildInputs = [
870 propagatedBuildInputs = [
871 self."six"
871 self."six"
872 ];
872 ];
873 src = fetchurl {
873 src = fetchurl {
874 url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
874 url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
875 sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
875 sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
876 };
876 };
877 meta = {
877 meta = {
878 license = [ pkgs.lib.licenses.bsdOriginal ];
878 license = [ pkgs.lib.licenses.bsdOriginal ];
879 };
879 };
880 };
880 };
881 "itsdangerous" = super.buildPythonPackage {
881 "itsdangerous" = super.buildPythonPackage {
882 name = "itsdangerous-1.1.0";
882 name = "itsdangerous-1.1.0";
883 doCheck = false;
883 doCheck = false;
884 src = fetchurl {
884 src = fetchurl {
885 url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
885 url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
886 sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
886 sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
887 };
887 };
888 meta = {
888 meta = {
889 license = [ pkgs.lib.licenses.bsdOriginal ];
889 license = [ pkgs.lib.licenses.bsdOriginal ];
890 };
890 };
891 };
891 };
892 "jinja2" = super.buildPythonPackage {
892 "jinja2" = super.buildPythonPackage {
893 name = "jinja2-2.9.6";
893 name = "jinja2-2.9.6";
894 doCheck = false;
894 doCheck = false;
895 propagatedBuildInputs = [
895 propagatedBuildInputs = [
896 self."markupsafe"
896 self."markupsafe"
897 ];
897 ];
898 src = fetchurl {
898 src = fetchurl {
899 url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
899 url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
900 sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
900 sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
901 };
901 };
902 meta = {
902 meta = {
903 license = [ pkgs.lib.licenses.bsdOriginal ];
903 license = [ pkgs.lib.licenses.bsdOriginal ];
904 };
904 };
905 };
905 };
906 "jsonschema" = super.buildPythonPackage {
906 "jsonschema" = super.buildPythonPackage {
907 name = "jsonschema-2.6.0";
907 name = "jsonschema-2.6.0";
908 doCheck = false;
908 doCheck = false;
909 propagatedBuildInputs = [
909 propagatedBuildInputs = [
910 self."functools32"
910 self."functools32"
911 ];
911 ];
912 src = fetchurl {
912 src = fetchurl {
913 url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
913 url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
914 sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
914 sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
915 };
915 };
916 meta = {
916 meta = {
917 license = [ pkgs.lib.licenses.mit ];
917 license = [ pkgs.lib.licenses.mit ];
918 };
918 };
919 };
919 };
920 "jupyter-client" = super.buildPythonPackage {
920 "jupyter-client" = super.buildPythonPackage {
921 name = "jupyter-client-5.0.0";
921 name = "jupyter-client-5.0.0";
922 doCheck = false;
922 doCheck = false;
923 propagatedBuildInputs = [
923 propagatedBuildInputs = [
924 self."traitlets"
924 self."traitlets"
925 self."jupyter-core"
925 self."jupyter-core"
926 self."pyzmq"
926 self."pyzmq"
927 self."python-dateutil"
927 self."python-dateutil"
928 ];
928 ];
929 src = fetchurl {
929 src = fetchurl {
930 url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
930 url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
931 sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
931 sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
932 };
932 };
933 meta = {
933 meta = {
934 license = [ pkgs.lib.licenses.bsdOriginal ];
934 license = [ pkgs.lib.licenses.bsdOriginal ];
935 };
935 };
936 };
936 };
937 "jupyter-core" = super.buildPythonPackage {
937 "jupyter-core" = super.buildPythonPackage {
938 name = "jupyter-core-4.5.0";
938 name = "jupyter-core-4.5.0";
939 doCheck = false;
939 doCheck = false;
940 propagatedBuildInputs = [
940 propagatedBuildInputs = [
941 self."traitlets"
941 self."traitlets"
942 ];
942 ];
943 src = fetchurl {
943 src = fetchurl {
944 url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
944 url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
945 sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
945 sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
946 };
946 };
947 meta = {
947 meta = {
948 license = [ pkgs.lib.licenses.bsdOriginal ];
948 license = [ pkgs.lib.licenses.bsdOriginal ];
949 };
949 };
950 };
950 };
951 "kombu" = super.buildPythonPackage {
951 "kombu" = super.buildPythonPackage {
952 name = "kombu-4.6.6";
952 name = "kombu-4.6.6";
953 doCheck = false;
953 doCheck = false;
954 propagatedBuildInputs = [
954 propagatedBuildInputs = [
955 self."amqp"
955 self."amqp"
956 self."importlib-metadata"
956 self."importlib-metadata"
957 ];
957 ];
958 src = fetchurl {
958 src = fetchurl {
959 url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
959 url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
960 sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
960 sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
961 };
961 };
962 meta = {
962 meta = {
963 license = [ pkgs.lib.licenses.bsdOriginal ];
963 license = [ pkgs.lib.licenses.bsdOriginal ];
964 };
964 };
965 };
965 };
966 "lxml" = super.buildPythonPackage {
966 "lxml" = super.buildPythonPackage {
967 name = "lxml-4.2.5";
967 name = "lxml-4.2.5";
968 doCheck = false;
968 doCheck = false;
969 src = fetchurl {
969 src = fetchurl {
970 url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
970 url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
971 sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
971 sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
972 };
972 };
973 meta = {
973 meta = {
974 license = [ pkgs.lib.licenses.bsdOriginal ];
974 license = [ pkgs.lib.licenses.bsdOriginal ];
975 };
975 };
976 };
976 };
977 "mako" = super.buildPythonPackage {
977 "mako" = super.buildPythonPackage {
978 name = "mako-1.1.0";
978 name = "mako-1.1.0";
979 doCheck = false;
979 doCheck = false;
980 propagatedBuildInputs = [
980 propagatedBuildInputs = [
981 self."markupsafe"
981 self."markupsafe"
982 ];
982 ];
983 src = fetchurl {
983 src = fetchurl {
984 url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
984 url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
985 sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
985 sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
986 };
986 };
987 meta = {
987 meta = {
988 license = [ pkgs.lib.licenses.mit ];
988 license = [ pkgs.lib.licenses.mit ];
989 };
989 };
990 };
990 };
991 "markdown" = super.buildPythonPackage {
991 "markdown" = super.buildPythonPackage {
992 name = "markdown-2.6.11";
992 name = "markdown-2.6.11";
993 doCheck = false;
993 doCheck = false;
994 src = fetchurl {
994 src = fetchurl {
995 url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
995 url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
996 sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
996 sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
997 };
997 };
998 meta = {
998 meta = {
999 license = [ pkgs.lib.licenses.bsdOriginal ];
999 license = [ pkgs.lib.licenses.bsdOriginal ];
1000 };
1000 };
1001 };
1001 };
1002 "markupsafe" = super.buildPythonPackage {
1002 "markupsafe" = super.buildPythonPackage {
1003 name = "markupsafe-1.1.1";
1003 name = "markupsafe-1.1.1";
1004 doCheck = false;
1004 doCheck = false;
1005 src = fetchurl {
1005 src = fetchurl {
1006 url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
1006 url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
1007 sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
1007 sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
1008 };
1008 };
1009 meta = {
1009 meta = {
1010 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
1010 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
1011 };
1011 };
1012 };
1012 };
1013 "marshmallow" = super.buildPythonPackage {
1013 "marshmallow" = super.buildPythonPackage {
1014 name = "marshmallow-2.18.0";
1014 name = "marshmallow-2.18.0";
1015 doCheck = false;
1015 doCheck = false;
1016 src = fetchurl {
1016 src = fetchurl {
1017 url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
1017 url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
1018 sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
1018 sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
1019 };
1019 };
1020 meta = {
1020 meta = {
1021 license = [ pkgs.lib.licenses.mit ];
1021 license = [ pkgs.lib.licenses.mit ];
1022 };
1022 };
1023 };
1023 };
1024 "mistune" = super.buildPythonPackage {
1024 "mistune" = super.buildPythonPackage {
1025 name = "mistune-0.8.4";
1025 name = "mistune-0.8.4";
1026 doCheck = false;
1026 doCheck = false;
1027 src = fetchurl {
1027 src = fetchurl {
1028 url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
1028 url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
1029 sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
1029 sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
1030 };
1030 };
1031 meta = {
1031 meta = {
1032 license = [ pkgs.lib.licenses.bsdOriginal ];
1032 license = [ pkgs.lib.licenses.bsdOriginal ];
1033 };
1033 };
1034 };
1034 };
1035 "mock" = super.buildPythonPackage {
1035 "mock" = super.buildPythonPackage {
1036 name = "mock-3.0.5";
1036 name = "mock-3.0.5";
1037 doCheck = false;
1037 doCheck = false;
1038 propagatedBuildInputs = [
1038 propagatedBuildInputs = [
1039 self."six"
1039 self."six"
1040 self."funcsigs"
1040 self."funcsigs"
1041 ];
1041 ];
1042 src = fetchurl {
1042 src = fetchurl {
1043 url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
1043 url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
1044 sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
1044 sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
1045 };
1045 };
1046 meta = {
1046 meta = {
1047 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
1047 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
1048 };
1048 };
1049 };
1049 };
1050 "more-itertools" = super.buildPythonPackage {
1050 "more-itertools" = super.buildPythonPackage {
1051 name = "more-itertools-5.0.0";
1051 name = "more-itertools-5.0.0";
1052 doCheck = false;
1052 doCheck = false;
1053 propagatedBuildInputs = [
1053 propagatedBuildInputs = [
1054 self."six"
1054 self."six"
1055 ];
1055 ];
1056 src = fetchurl {
1056 src = fetchurl {
1057 url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
1057 url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
1058 sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
1058 sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
1059 };
1059 };
1060 meta = {
1060 meta = {
1061 license = [ pkgs.lib.licenses.mit ];
1061 license = [ pkgs.lib.licenses.mit ];
1062 };
1062 };
1063 };
1063 };
1064 "msgpack-python" = super.buildPythonPackage {
1064 "msgpack-python" = super.buildPythonPackage {
1065 name = "msgpack-python-0.5.6";
1065 name = "msgpack-python-0.5.6";
1066 doCheck = false;
1066 doCheck = false;
1067 src = fetchurl {
1067 src = fetchurl {
1068 url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
1068 url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
1069 sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
1069 sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
1070 };
1070 };
1071 meta = {
1071 meta = {
1072 license = [ pkgs.lib.licenses.asl20 ];
1072 license = [ pkgs.lib.licenses.asl20 ];
1073 };
1073 };
1074 };
1074 };
1075 "mysql-python" = super.buildPythonPackage {
1075 "mysql-python" = super.buildPythonPackage {
1076 name = "mysql-python-1.2.5";
1076 name = "mysql-python-1.2.5";
1077 doCheck = false;
1077 doCheck = false;
1078 src = fetchurl {
1078 src = fetchurl {
1079 url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
1079 url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
1080 sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
1080 sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
1081 };
1081 };
1082 meta = {
1082 meta = {
1083 license = [ pkgs.lib.licenses.gpl1 ];
1083 license = [ pkgs.lib.licenses.gpl1 ];
1084 };
1084 };
1085 };
1085 };
1086 "nbconvert" = super.buildPythonPackage {
1086 "nbconvert" = super.buildPythonPackage {
1087 name = "nbconvert-5.3.1";
1087 name = "nbconvert-5.3.1";
1088 doCheck = false;
1088 doCheck = false;
1089 propagatedBuildInputs = [
1089 propagatedBuildInputs = [
1090 self."mistune"
1090 self."mistune"
1091 self."jinja2"
1091 self."jinja2"
1092 self."pygments"
1092 self."pygments"
1093 self."traitlets"
1093 self."traitlets"
1094 self."jupyter-core"
1094 self."jupyter-core"
1095 self."nbformat"
1095 self."nbformat"
1096 self."entrypoints"
1096 self."entrypoints"
1097 self."bleach"
1097 self."bleach"
1098 self."pandocfilters"
1098 self."pandocfilters"
1099 self."testpath"
1099 self."testpath"
1100 ];
1100 ];
1101 src = fetchurl {
1101 src = fetchurl {
1102 url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
1102 url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
1103 sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
1103 sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
1104 };
1104 };
1105 meta = {
1105 meta = {
1106 license = [ pkgs.lib.licenses.bsdOriginal ];
1106 license = [ pkgs.lib.licenses.bsdOriginal ];
1107 };
1107 };
1108 };
1108 };
1109 "nbformat" = super.buildPythonPackage {
1109 "nbformat" = super.buildPythonPackage {
1110 name = "nbformat-4.4.0";
1110 name = "nbformat-4.4.0";
1111 doCheck = false;
1111 doCheck = false;
1112 propagatedBuildInputs = [
1112 propagatedBuildInputs = [
1113 self."ipython-genutils"
1113 self."ipython-genutils"
1114 self."traitlets"
1114 self."traitlets"
1115 self."jsonschema"
1115 self."jsonschema"
1116 self."jupyter-core"
1116 self."jupyter-core"
1117 ];
1117 ];
1118 src = fetchurl {
1118 src = fetchurl {
1119 url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
1119 url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
1120 sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
1120 sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
1121 };
1121 };
1122 meta = {
1122 meta = {
1123 license = [ pkgs.lib.licenses.bsdOriginal ];
1123 license = [ pkgs.lib.licenses.bsdOriginal ];
1124 };
1124 };
1125 };
1125 };
1126 "packaging" = super.buildPythonPackage {
1126 "packaging" = super.buildPythonPackage {
1127 name = "packaging-20.3";
1127 name = "packaging-20.3";
1128 doCheck = false;
1128 doCheck = false;
1129 propagatedBuildInputs = [
1129 propagatedBuildInputs = [
1130 self."pyparsing"
1130 self."pyparsing"
1131 self."six"
1131 self."six"
1132 ];
1132 ];
1133 src = fetchurl {
1133 src = fetchurl {
1134 url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
1134 url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
1135 sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
1135 sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
1136 };
1136 };
1137 meta = {
1137 meta = {
1138 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
1138 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
1139 };
1139 };
1140 };
1140 };
1141 "pandocfilters" = super.buildPythonPackage {
1141 "pandocfilters" = super.buildPythonPackage {
1142 name = "pandocfilters-1.4.2";
1142 name = "pandocfilters-1.4.2";
1143 doCheck = false;
1143 doCheck = false;
1144 src = fetchurl {
1144 src = fetchurl {
1145 url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
1145 url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
1146 sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
1146 sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
1147 };
1147 };
1148 meta = {
1148 meta = {
1149 license = [ pkgs.lib.licenses.bsdOriginal ];
1149 license = [ pkgs.lib.licenses.bsdOriginal ];
1150 };
1150 };
1151 };
1151 };
1152 "paste" = super.buildPythonPackage {
1152 "paste" = super.buildPythonPackage {
1153 name = "paste-3.4.0";
1153 name = "paste-3.4.0";
1154 doCheck = false;
1154 doCheck = false;
1155 propagatedBuildInputs = [
1155 propagatedBuildInputs = [
1156 self."six"
1156 self."six"
1157 ];
1157 ];
1158 src = fetchurl {
1158 src = fetchurl {
1159 url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
1159 url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
1160 sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
1160 sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
1161 };
1161 };
1162 meta = {
1162 meta = {
1163 license = [ pkgs.lib.licenses.mit ];
1163 license = [ pkgs.lib.licenses.mit ];
1164 };
1164 };
1165 };
1165 };
1166 "pastedeploy" = super.buildPythonPackage {
1166 "pastedeploy" = super.buildPythonPackage {
1167 name = "pastedeploy-2.1.0";
1167 name = "pastedeploy-2.1.0";
1168 doCheck = false;
1168 doCheck = false;
1169 src = fetchurl {
1169 src = fetchurl {
1170 url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
1170 url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
1171 sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
1171 sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
1172 };
1172 };
1173 meta = {
1173 meta = {
1174 license = [ pkgs.lib.licenses.mit ];
1174 license = [ pkgs.lib.licenses.mit ];
1175 };
1175 };
1176 };
1176 };
1177 "pastescript" = super.buildPythonPackage {
1177 "pastescript" = super.buildPythonPackage {
1178 name = "pastescript-3.2.0";
1178 name = "pastescript-3.2.0";
1179 doCheck = false;
1179 doCheck = false;
1180 propagatedBuildInputs = [
1180 propagatedBuildInputs = [
1181 self."paste"
1181 self."paste"
1182 self."pastedeploy"
1182 self."pastedeploy"
1183 self."six"
1183 self."six"
1184 ];
1184 ];
1185 src = fetchurl {
1185 src = fetchurl {
1186 url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
1186 url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
1187 sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
1187 sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
1188 };
1188 };
1189 meta = {
1189 meta = {
1190 license = [ pkgs.lib.licenses.mit ];
1190 license = [ pkgs.lib.licenses.mit ];
1191 };
1191 };
1192 };
1192 };
1193 "pathlib2" = super.buildPythonPackage {
1193 "pathlib2" = super.buildPythonPackage {
1194 name = "pathlib2-2.3.5";
1194 name = "pathlib2-2.3.5";
1195 doCheck = false;
1195 doCheck = false;
1196 propagatedBuildInputs = [
1196 propagatedBuildInputs = [
1197 self."six"
1197 self."six"
1198 self."scandir"
1198 self."scandir"
1199 ];
1199 ];
1200 src = fetchurl {
1200 src = fetchurl {
1201 url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
1201 url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
1202 sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
1202 sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
1203 };
1203 };
1204 meta = {
1204 meta = {
1205 license = [ pkgs.lib.licenses.mit ];
1205 license = [ pkgs.lib.licenses.mit ];
1206 };
1206 };
1207 };
1207 };
1208 "peppercorn" = super.buildPythonPackage {
1208 "peppercorn" = super.buildPythonPackage {
1209 name = "peppercorn-0.6";
1209 name = "peppercorn-0.6";
1210 doCheck = false;
1210 doCheck = false;
1211 src = fetchurl {
1211 src = fetchurl {
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1214 };
1214 };
1215 meta = {
1215 meta = {
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1217 };
1217 };
1218 };
1218 };
1219 "pexpect" = super.buildPythonPackage {
1219 "pexpect" = super.buildPythonPackage {
1220 name = "pexpect-4.8.0";
1220 name = "pexpect-4.8.0";
1221 doCheck = false;
1221 doCheck = false;
1222 propagatedBuildInputs = [
1222 propagatedBuildInputs = [
1223 self."ptyprocess"
1223 self."ptyprocess"
1224 ];
1224 ];
1225 src = fetchurl {
1225 src = fetchurl {
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1228 };
1228 };
1229 meta = {
1229 meta = {
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1231 };
1231 };
1232 };
1232 };
1233 "pickleshare" = super.buildPythonPackage {
1233 "pickleshare" = super.buildPythonPackage {
1234 name = "pickleshare-0.7.5";
1234 name = "pickleshare-0.7.5";
1235 doCheck = false;
1235 doCheck = false;
1236 propagatedBuildInputs = [
1236 propagatedBuildInputs = [
1237 self."pathlib2"
1237 self."pathlib2"
1238 ];
1238 ];
1239 src = fetchurl {
1239 src = fetchurl {
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1242 };
1242 };
1243 meta = {
1243 meta = {
1244 license = [ pkgs.lib.licenses.mit ];
1244 license = [ pkgs.lib.licenses.mit ];
1245 };
1245 };
1246 };
1246 };
1247 "plaster" = super.buildPythonPackage {
1247 "plaster" = super.buildPythonPackage {
1248 name = "plaster-1.0";
1248 name = "plaster-1.0";
1249 doCheck = false;
1249 doCheck = false;
1250 propagatedBuildInputs = [
1250 propagatedBuildInputs = [
1251 self."setuptools"
1251 self."setuptools"
1252 ];
1252 ];
1253 src = fetchurl {
1253 src = fetchurl {
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1256 };
1256 };
1257 meta = {
1257 meta = {
1258 license = [ pkgs.lib.licenses.mit ];
1258 license = [ pkgs.lib.licenses.mit ];
1259 };
1259 };
1260 };
1260 };
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1262 name = "plaster-pastedeploy-0.7";
1262 name = "plaster-pastedeploy-0.7";
1263 doCheck = false;
1263 doCheck = false;
1264 propagatedBuildInputs = [
1264 propagatedBuildInputs = [
1265 self."pastedeploy"
1265 self."pastedeploy"
1266 self."plaster"
1266 self."plaster"
1267 ];
1267 ];
1268 src = fetchurl {
1268 src = fetchurl {
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1271 };
1271 };
1272 meta = {
1272 meta = {
1273 license = [ pkgs.lib.licenses.mit ];
1273 license = [ pkgs.lib.licenses.mit ];
1274 };
1274 };
1275 };
1275 };
1276 "pluggy" = super.buildPythonPackage {
1276 "pluggy" = super.buildPythonPackage {
1277 name = "pluggy-0.13.1";
1277 name = "pluggy-0.13.1";
1278 doCheck = false;
1278 doCheck = false;
1279 propagatedBuildInputs = [
1279 propagatedBuildInputs = [
1280 self."importlib-metadata"
1280 self."importlib-metadata"
1281 ];
1281 ];
1282 src = fetchurl {
1282 src = fetchurl {
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1285 };
1285 };
1286 meta = {
1286 meta = {
1287 license = [ pkgs.lib.licenses.mit ];
1287 license = [ pkgs.lib.licenses.mit ];
1288 };
1288 };
1289 };
1289 };
1290 "premailer" = super.buildPythonPackage {
1290 "premailer" = super.buildPythonPackage {
1291 name = "premailer-3.6.1";
1291 name = "premailer-3.6.1";
1292 doCheck = false;
1292 doCheck = false;
1293 propagatedBuildInputs = [
1293 propagatedBuildInputs = [
1294 self."lxml"
1294 self."lxml"
1295 self."cssselect"
1295 self."cssselect"
1296 self."cssutils"
1296 self."cssutils"
1297 self."requests"
1297 self."requests"
1298 self."cachetools"
1298 self."cachetools"
1299 ];
1299 ];
1300 src = fetchurl {
1300 src = fetchurl {
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1303 };
1303 };
1304 meta = {
1304 meta = {
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1306 };
1306 };
1307 };
1307 };
1308 "prompt-toolkit" = super.buildPythonPackage {
1308 "prompt-toolkit" = super.buildPythonPackage {
1309 name = "prompt-toolkit-1.0.18";
1309 name = "prompt-toolkit-1.0.18";
1310 doCheck = false;
1310 doCheck = false;
1311 propagatedBuildInputs = [
1311 propagatedBuildInputs = [
1312 self."six"
1312 self."six"
1313 self."wcwidth"
1313 self."wcwidth"
1314 ];
1314 ];
1315 src = fetchurl {
1315 src = fetchurl {
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1318 };
1318 };
1319 meta = {
1319 meta = {
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1321 };
1321 };
1322 };
1322 };
1323 "psutil" = super.buildPythonPackage {
1323 "psutil" = super.buildPythonPackage {
1324 name = "psutil-5.7.0";
1324 name = "psutil-5.7.0";
1325 doCheck = false;
1325 doCheck = false;
1326 src = fetchurl {
1326 src = fetchurl {
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1329 };
1329 };
1330 meta = {
1330 meta = {
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1332 };
1332 };
1333 };
1333 };
1334 "psycopg2" = super.buildPythonPackage {
1334 "psycopg2" = super.buildPythonPackage {
1335 name = "psycopg2-2.8.4";
1335 name = "psycopg2-2.8.4";
1336 doCheck = false;
1336 doCheck = false;
1337 src = fetchurl {
1337 src = fetchurl {
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1340 };
1340 };
1341 meta = {
1341 meta = {
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1343 };
1343 };
1344 };
1344 };
1345 "ptyprocess" = super.buildPythonPackage {
1345 "ptyprocess" = super.buildPythonPackage {
1346 name = "ptyprocess-0.6.0";
1346 name = "ptyprocess-0.6.0";
1347 doCheck = false;
1347 doCheck = false;
1348 src = fetchurl {
1348 src = fetchurl {
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1351 };
1351 };
1352 meta = {
1352 meta = {
1353 license = [ ];
1353 license = [ ];
1354 };
1354 };
1355 };
1355 };
1356 "py" = super.buildPythonPackage {
1356 "py" = super.buildPythonPackage {
1357 name = "py-1.8.0";
1357 name = "py-1.8.0";
1358 doCheck = false;
1358 doCheck = false;
1359 src = fetchurl {
1359 src = fetchurl {
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1362 };
1362 };
1363 meta = {
1363 meta = {
1364 license = [ pkgs.lib.licenses.mit ];
1364 license = [ pkgs.lib.licenses.mit ];
1365 };
1365 };
1366 };
1366 };
1367 "py-bcrypt" = super.buildPythonPackage {
1367 "py-bcrypt" = super.buildPythonPackage {
1368 name = "py-bcrypt-0.4";
1368 name = "py-bcrypt-0.4";
1369 doCheck = false;
1369 doCheck = false;
1370 src = fetchurl {
1370 src = fetchurl {
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1373 };
1373 };
1374 meta = {
1374 meta = {
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1376 };
1376 };
1377 };
1377 };
1378 "py-gfm" = super.buildPythonPackage {
1378 "py-gfm" = super.buildPythonPackage {
1379 name = "py-gfm-0.1.4";
1379 name = "py-gfm-0.1.4";
1380 doCheck = false;
1380 doCheck = false;
1381 propagatedBuildInputs = [
1381 propagatedBuildInputs = [
1382 self."setuptools"
1382 self."setuptools"
1383 self."markdown"
1383 self."markdown"
1384 ];
1384 ];
1385 src = fetchurl {
1385 src = fetchurl {
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1388 };
1388 };
1389 meta = {
1389 meta = {
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1391 };
1391 };
1392 };
1392 };
1393 "pyasn1" = super.buildPythonPackage {
1393 "pyasn1" = super.buildPythonPackage {
1394 name = "pyasn1-0.4.8";
1394 name = "pyasn1-0.4.8";
1395 doCheck = false;
1395 doCheck = false;
1396 src = fetchurl {
1396 src = fetchurl {
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1399 };
1399 };
1400 meta = {
1400 meta = {
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1402 };
1402 };
1403 };
1403 };
1404 "pyasn1-modules" = super.buildPythonPackage {
1404 "pyasn1-modules" = super.buildPythonPackage {
1405 name = "pyasn1-modules-0.2.6";
1405 name = "pyasn1-modules-0.2.6";
1406 doCheck = false;
1406 doCheck = false;
1407 propagatedBuildInputs = [
1407 propagatedBuildInputs = [
1408 self."pyasn1"
1408 self."pyasn1"
1409 ];
1409 ];
1410 src = fetchurl {
1410 src = fetchurl {
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1413 };
1413 };
1414 meta = {
1414 meta = {
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1416 };
1416 };
1417 };
1417 };
1418 "pycparser" = super.buildPythonPackage {
1418 "pycparser" = super.buildPythonPackage {
1419 name = "pycparser-2.20";
1419 name = "pycparser-2.20";
1420 doCheck = false;
1420 doCheck = false;
1421 src = fetchurl {
1421 src = fetchurl {
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1424 };
1424 };
1425 meta = {
1425 meta = {
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1427 };
1427 };
1428 };
1428 };
1429 "pycrypto" = super.buildPythonPackage {
1429 "pycrypto" = super.buildPythonPackage {
1430 name = "pycrypto-2.6.1";
1430 name = "pycrypto-2.6.1";
1431 doCheck = false;
1431 doCheck = false;
1432 src = fetchurl {
1432 src = fetchurl {
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1435 };
1435 };
1436 meta = {
1436 meta = {
1437 license = [ pkgs.lib.licenses.publicDomain ];
1437 license = [ pkgs.lib.licenses.publicDomain ];
1438 };
1438 };
1439 };
1439 };
1440 "pycurl" = super.buildPythonPackage {
1440 "pycurl" = super.buildPythonPackage {
1441 name = "pycurl-7.43.0.3";
1441 name = "pycurl-7.43.0.3";
1442 doCheck = false;
1442 doCheck = false;
1443 src = fetchurl {
1443 src = fetchurl {
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1446 };
1446 };
1447 meta = {
1447 meta = {
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1449 };
1449 };
1450 };
1450 };
1451 "pygments" = super.buildPythonPackage {
1451 "pygments" = super.buildPythonPackage {
1452 name = "pygments-2.4.2";
1452 name = "pygments-2.4.2";
1453 doCheck = false;
1453 doCheck = false;
1454 src = fetchurl {
1454 src = fetchurl {
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1457 };
1457 };
1458 meta = {
1458 meta = {
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1460 };
1460 };
1461 };
1461 };
1462 "pymysql" = super.buildPythonPackage {
1462 "pymysql" = super.buildPythonPackage {
1463 name = "pymysql-0.8.1";
1463 name = "pymysql-0.8.1";
1464 doCheck = false;
1464 doCheck = false;
1465 src = fetchurl {
1465 src = fetchurl {
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1468 };
1468 };
1469 meta = {
1469 meta = {
1470 license = [ pkgs.lib.licenses.mit ];
1470 license = [ pkgs.lib.licenses.mit ];
1471 };
1471 };
1472 };
1472 };
1473 "pyotp" = super.buildPythonPackage {
1473 "pyotp" = super.buildPythonPackage {
1474 name = "pyotp-2.3.0";
1474 name = "pyotp-2.3.0";
1475 doCheck = false;
1475 doCheck = false;
1476 src = fetchurl {
1476 src = fetchurl {
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1479 };
1479 };
1480 meta = {
1480 meta = {
1481 license = [ pkgs.lib.licenses.mit ];
1481 license = [ pkgs.lib.licenses.mit ];
1482 };
1482 };
1483 };
1483 };
1484 "pyparsing" = super.buildPythonPackage {
1484 "pyparsing" = super.buildPythonPackage {
1485 name = "pyparsing-2.4.7";
1485 name = "pyparsing-2.4.7";
1486 doCheck = false;
1486 doCheck = false;
1487 src = fetchurl {
1487 src = fetchurl {
1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1488 url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1489 sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
1490 };
1490 };
1491 meta = {
1491 meta = {
1492 license = [ pkgs.lib.licenses.mit ];
1492 license = [ pkgs.lib.licenses.mit ];
1493 };
1493 };
1494 };
1494 };
1495 "pyramid" = super.buildPythonPackage {
1495 "pyramid" = super.buildPythonPackage {
1496 name = "pyramid-1.10.4";
1496 name = "pyramid-1.10.4";
1497 doCheck = false;
1497 doCheck = false;
1498 propagatedBuildInputs = [
1498 propagatedBuildInputs = [
1499 self."hupper"
1499 self."hupper"
1500 self."plaster"
1500 self."plaster"
1501 self."plaster-pastedeploy"
1501 self."plaster-pastedeploy"
1502 self."setuptools"
1502 self."setuptools"
1503 self."translationstring"
1503 self."translationstring"
1504 self."venusian"
1504 self."venusian"
1505 self."webob"
1505 self."webob"
1506 self."zope.deprecation"
1506 self."zope.deprecation"
1507 self."zope.interface"
1507 self."zope.interface"
1508 self."repoze.lru"
1508 self."repoze.lru"
1509 ];
1509 ];
1510 src = fetchurl {
1510 src = fetchurl {
1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1511 url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1512 sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
1513 };
1513 };
1514 meta = {
1514 meta = {
1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1515 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1516 };
1516 };
1517 };
1517 };
1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1518 "pyramid-debugtoolbar" = super.buildPythonPackage {
1519 name = "pyramid-debugtoolbar-4.6.1";
1519 name = "pyramid-debugtoolbar-4.6.1";
1520 doCheck = false;
1520 doCheck = false;
1521 propagatedBuildInputs = [
1521 propagatedBuildInputs = [
1522 self."pyramid"
1522 self."pyramid"
1523 self."pyramid-mako"
1523 self."pyramid-mako"
1524 self."repoze.lru"
1524 self."repoze.lru"
1525 self."pygments"
1525 self."pygments"
1526 self."ipaddress"
1526 self."ipaddress"
1527 ];
1527 ];
1528 src = fetchurl {
1528 src = fetchurl {
1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1529 url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1530 sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
1531 };
1531 };
1532 meta = {
1532 meta = {
1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1533 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
1534 };
1534 };
1535 };
1535 };
1536 "pyramid-jinja2" = super.buildPythonPackage {
1536 "pyramid-jinja2" = super.buildPythonPackage {
1537 name = "pyramid-jinja2-2.7";
1537 name = "pyramid-jinja2-2.7";
1538 doCheck = false;
1538 doCheck = false;
1539 propagatedBuildInputs = [
1539 propagatedBuildInputs = [
1540 self."pyramid"
1540 self."pyramid"
1541 self."zope.deprecation"
1541 self."zope.deprecation"
1542 self."jinja2"
1542 self."jinja2"
1543 self."markupsafe"
1543 self."markupsafe"
1544 ];
1544 ];
1545 src = fetchurl {
1545 src = fetchurl {
1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1546 url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1547 sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
1548 };
1548 };
1549 meta = {
1549 meta = {
1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1550 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1551 };
1551 };
1552 };
1552 };
1553 "pyramid-apispec" = super.buildPythonPackage {
1553 "pyramid-apispec" = super.buildPythonPackage {
1554 name = "pyramid-apispec-0.3.2";
1554 name = "pyramid-apispec-0.3.2";
1555 doCheck = false;
1555 doCheck = false;
1556 propagatedBuildInputs = [
1556 propagatedBuildInputs = [
1557 self."apispec"
1557 self."apispec"
1558 ];
1558 ];
1559 src = fetchurl {
1559 src = fetchurl {
1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1560 url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1561 sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
1562 };
1562 };
1563 meta = {
1563 meta = {
1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1564 license = [ pkgs.lib.licenses.bsdOriginal ];
1565 };
1565 };
1566 };
1566 };
1567 "pyramid-mailer" = super.buildPythonPackage {
1567 "pyramid-mailer" = super.buildPythonPackage {
1568 name = "pyramid-mailer-0.15.1";
1568 name = "pyramid-mailer-0.15.1";
1569 doCheck = false;
1569 doCheck = false;
1570 propagatedBuildInputs = [
1570 propagatedBuildInputs = [
1571 self."pyramid"
1571 self."pyramid"
1572 self."repoze.sendmail"
1572 self."repoze.sendmail"
1573 self."transaction"
1573 self."transaction"
1574 ];
1574 ];
1575 src = fetchurl {
1575 src = fetchurl {
1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1576 url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1577 sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
1578 };
1578 };
1579 meta = {
1579 meta = {
1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1580 license = [ pkgs.lib.licenses.bsdOriginal ];
1581 };
1581 };
1582 };
1582 };
1583 "pyramid-mako" = super.buildPythonPackage {
1583 "pyramid-mako" = super.buildPythonPackage {
1584 name = "pyramid-mako-1.1.0";
1584 name = "pyramid-mako-1.1.0";
1585 doCheck = false;
1585 doCheck = false;
1586 propagatedBuildInputs = [
1586 propagatedBuildInputs = [
1587 self."pyramid"
1587 self."pyramid"
1588 self."mako"
1588 self."mako"
1589 ];
1589 ];
1590 src = fetchurl {
1590 src = fetchurl {
1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1591 url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1592 sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
1593 };
1593 };
1594 meta = {
1594 meta = {
1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1595 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1596 };
1596 };
1597 };
1597 };
1598 "pysqlite" = super.buildPythonPackage {
1598 "pysqlite" = super.buildPythonPackage {
1599 name = "pysqlite-2.8.3";
1599 name = "pysqlite-2.8.3";
1600 doCheck = false;
1600 doCheck = false;
1601 src = fetchurl {
1601 src = fetchurl {
1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1602 url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1603 sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
1604 };
1604 };
1605 meta = {
1605 meta = {
1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1606 license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
1607 };
1607 };
1608 };
1608 };
1609 "pytest" = super.buildPythonPackage {
1609 "pytest" = super.buildPythonPackage {
1610 name = "pytest-4.6.5";
1610 name = "pytest-4.6.5";
1611 doCheck = false;
1611 doCheck = false;
1612 propagatedBuildInputs = [
1612 propagatedBuildInputs = [
1613 self."py"
1613 self."py"
1614 self."six"
1614 self."six"
1615 self."packaging"
1615 self."packaging"
1616 self."attrs"
1616 self."attrs"
1617 self."atomicwrites"
1617 self."atomicwrites"
1618 self."pluggy"
1618 self."pluggy"
1619 self."importlib-metadata"
1619 self."importlib-metadata"
1620 self."wcwidth"
1620 self."wcwidth"
1621 self."funcsigs"
1621 self."funcsigs"
1622 self."pathlib2"
1622 self."pathlib2"
1623 self."more-itertools"
1623 self."more-itertools"
1624 ];
1624 ];
1625 src = fetchurl {
1625 src = fetchurl {
1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1626 url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1627 sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
1628 };
1628 };
1629 meta = {
1629 meta = {
1630 license = [ pkgs.lib.licenses.mit ];
1630 license = [ pkgs.lib.licenses.mit ];
1631 };
1631 };
1632 };
1632 };
1633 "pytest-cov" = super.buildPythonPackage {
1633 "pytest-cov" = super.buildPythonPackage {
1634 name = "pytest-cov-2.7.1";
1634 name = "pytest-cov-2.7.1";
1635 doCheck = false;
1635 doCheck = false;
1636 propagatedBuildInputs = [
1636 propagatedBuildInputs = [
1637 self."pytest"
1637 self."pytest"
1638 self."coverage"
1638 self."coverage"
1639 ];
1639 ];
1640 src = fetchurl {
1640 src = fetchurl {
1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1641 url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1642 sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
1643 };
1643 };
1644 meta = {
1644 meta = {
1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1645 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
1646 };
1646 };
1647 };
1647 };
1648 "pytest-profiling" = super.buildPythonPackage {
1648 "pytest-profiling" = super.buildPythonPackage {
1649 name = "pytest-profiling-1.7.0";
1649 name = "pytest-profiling-1.7.0";
1650 doCheck = false;
1650 doCheck = false;
1651 propagatedBuildInputs = [
1651 propagatedBuildInputs = [
1652 self."six"
1652 self."six"
1653 self."pytest"
1653 self."pytest"
1654 self."gprof2dot"
1654 self."gprof2dot"
1655 ];
1655 ];
1656 src = fetchurl {
1656 src = fetchurl {
1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1657 url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1658 sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
1659 };
1659 };
1660 meta = {
1660 meta = {
1661 license = [ pkgs.lib.licenses.mit ];
1661 license = [ pkgs.lib.licenses.mit ];
1662 };
1662 };
1663 };
1663 };
1664 "pytest-runner" = super.buildPythonPackage {
1664 "pytest-runner" = super.buildPythonPackage {
1665 name = "pytest-runner-5.1";
1665 name = "pytest-runner-5.1";
1666 doCheck = false;
1666 doCheck = false;
1667 src = fetchurl {
1667 src = fetchurl {
1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1668 url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1669 sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
1670 };
1670 };
1671 meta = {
1671 meta = {
1672 license = [ pkgs.lib.licenses.mit ];
1672 license = [ pkgs.lib.licenses.mit ];
1673 };
1673 };
1674 };
1674 };
1675 "pytest-sugar" = super.buildPythonPackage {
1675 "pytest-sugar" = super.buildPythonPackage {
1676 name = "pytest-sugar-0.9.2";
1676 name = "pytest-sugar-0.9.2";
1677 doCheck = false;
1677 doCheck = false;
1678 propagatedBuildInputs = [
1678 propagatedBuildInputs = [
1679 self."pytest"
1679 self."pytest"
1680 self."termcolor"
1680 self."termcolor"
1681 self."packaging"
1681 self."packaging"
1682 ];
1682 ];
1683 src = fetchurl {
1683 src = fetchurl {
1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1684 url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1685 sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
1686 };
1686 };
1687 meta = {
1687 meta = {
1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1688 license = [ pkgs.lib.licenses.bsdOriginal ];
1689 };
1689 };
1690 };
1690 };
1691 "pytest-timeout" = super.buildPythonPackage {
1691 "pytest-timeout" = super.buildPythonPackage {
1692 name = "pytest-timeout-1.3.3";
1692 name = "pytest-timeout-1.3.3";
1693 doCheck = false;
1693 doCheck = false;
1694 propagatedBuildInputs = [
1694 propagatedBuildInputs = [
1695 self."pytest"
1695 self."pytest"
1696 ];
1696 ];
1697 src = fetchurl {
1697 src = fetchurl {
1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1698 url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1699 sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
1700 };
1700 };
1701 meta = {
1701 meta = {
1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1702 license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
1703 };
1703 };
1704 };
1704 };
1705 "python-dateutil" = super.buildPythonPackage {
1705 "python-dateutil" = super.buildPythonPackage {
1706 name = "python-dateutil-2.8.1";
1706 name = "python-dateutil-2.8.1";
1707 doCheck = false;
1707 doCheck = false;
1708 propagatedBuildInputs = [
1708 propagatedBuildInputs = [
1709 self."six"
1709 self."six"
1710 ];
1710 ];
1711 src = fetchurl {
1711 src = fetchurl {
1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1712 url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1713 sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
1714 };
1714 };
1715 meta = {
1715 meta = {
1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1716 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
1717 };
1717 };
1718 };
1718 };
1719 "python-editor" = super.buildPythonPackage {
1719 "python-editor" = super.buildPythonPackage {
1720 name = "python-editor-1.0.4";
1720 name = "python-editor-1.0.4";
1721 doCheck = false;
1721 doCheck = false;
1722 src = fetchurl {
1722 src = fetchurl {
1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1723 url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1724 sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
1725 };
1725 };
1726 meta = {
1726 meta = {
1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1727 license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
1728 };
1728 };
1729 };
1729 };
1730 "python-ldap" = super.buildPythonPackage {
1730 "python-ldap" = super.buildPythonPackage {
1731 name = "python-ldap-3.2.0";
1731 name = "python-ldap-3.2.0";
1732 doCheck = false;
1732 doCheck = false;
1733 propagatedBuildInputs = [
1733 propagatedBuildInputs = [
1734 self."pyasn1"
1734 self."pyasn1"
1735 self."pyasn1-modules"
1735 self."pyasn1-modules"
1736 ];
1736 ];
1737 src = fetchurl {
1737 src = fetchurl {
1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1738 url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1739 sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
1740 };
1740 };
1741 meta = {
1741 meta = {
1742 license = [ pkgs.lib.licenses.psfl ];
1742 license = [ pkgs.lib.licenses.psfl ];
1743 };
1743 };
1744 };
1744 };
1745 "python-memcached" = super.buildPythonPackage {
1745 "python-memcached" = super.buildPythonPackage {
1746 name = "python-memcached-1.59";
1746 name = "python-memcached-1.59";
1747 doCheck = false;
1747 doCheck = false;
1748 propagatedBuildInputs = [
1748 propagatedBuildInputs = [
1749 self."six"
1749 self."six"
1750 ];
1750 ];
1751 src = fetchurl {
1751 src = fetchurl {
1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1752 url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1753 sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
1754 };
1754 };
1755 meta = {
1755 meta = {
1756 license = [ pkgs.lib.licenses.psfl ];
1756 license = [ pkgs.lib.licenses.psfl ];
1757 };
1757 };
1758 };
1758 };
1759 "python-pam" = super.buildPythonPackage {
1759 "python-pam" = super.buildPythonPackage {
1760 name = "python-pam-1.8.4";
1760 name = "python-pam-1.8.4";
1761 doCheck = false;
1761 doCheck = false;
1762 src = fetchurl {
1762 src = fetchurl {
1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1763 url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1764 sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
1765 };
1765 };
1766 meta = {
1766 meta = {
1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1767 license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
1768 };
1768 };
1769 };
1769 };
1770 "python-saml" = super.buildPythonPackage {
1770 "python-saml" = super.buildPythonPackage {
1771 name = "python-saml-2.4.2";
1771 name = "python-saml-2.4.2";
1772 doCheck = false;
1772 doCheck = false;
1773 propagatedBuildInputs = [
1773 propagatedBuildInputs = [
1774 self."dm.xmlsec.binding"
1774 self."dm.xmlsec.binding"
1775 self."isodate"
1775 self."isodate"
1776 self."defusedxml"
1776 self."defusedxml"
1777 ];
1777 ];
1778 src = fetchurl {
1778 src = fetchurl {
1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1779 url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1780 sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
1781 };
1781 };
1782 meta = {
1782 meta = {
1783 license = [ pkgs.lib.licenses.mit ];
1783 license = [ pkgs.lib.licenses.mit ];
1784 };
1784 };
1785 };
1785 };
1786 "pytz" = super.buildPythonPackage {
1786 "pytz" = super.buildPythonPackage {
1787 name = "pytz-2019.3";
1787 name = "pytz-2019.3";
1788 doCheck = false;
1788 doCheck = false;
1789 src = fetchurl {
1789 src = fetchurl {
1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1790 url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1791 sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
1792 };
1792 };
1793 meta = {
1793 meta = {
1794 license = [ pkgs.lib.licenses.mit ];
1794 license = [ pkgs.lib.licenses.mit ];
1795 };
1795 };
1796 };
1796 };
1797 "pyzmq" = super.buildPythonPackage {
1797 "pyzmq" = super.buildPythonPackage {
1798 name = "pyzmq-14.6.0";
1798 name = "pyzmq-14.6.0";
1799 doCheck = false;
1799 doCheck = false;
1800 src = fetchurl {
1800 src = fetchurl {
1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1801 url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1802 sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
1803 };
1803 };
1804 meta = {
1804 meta = {
1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1805 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1806 };
1806 };
1807 };
1807 };
1808 "PyYAML" = super.buildPythonPackage {
1808 "PyYAML" = super.buildPythonPackage {
1809 name = "PyYAML-5.3.1";
1809 name = "PyYAML-5.3.1";
1810 doCheck = false;
1810 doCheck = false;
1811 src = fetchurl {
1811 src = fetchurl {
1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1812 url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1813 sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
1814 };
1814 };
1815 meta = {
1815 meta = {
1816 license = [ pkgs.lib.licenses.mit ];
1816 license = [ pkgs.lib.licenses.mit ];
1817 };
1817 };
1818 };
1818 };
1819 "regex" = super.buildPythonPackage {
1819 "regex" = super.buildPythonPackage {
1820 name = "regex-2020.9.27";
1820 name = "regex-2020.9.27";
1821 doCheck = false;
1821 doCheck = false;
1822 src = fetchurl {
1822 src = fetchurl {
1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1823 url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1824 sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
1825 };
1825 };
1826 meta = {
1826 meta = {
1827 license = [ pkgs.lib.licenses.psfl ];
1827 license = [ pkgs.lib.licenses.psfl ];
1828 };
1828 };
1829 };
1829 };
1830 "redis" = super.buildPythonPackage {
1830 "redis" = super.buildPythonPackage {
1831 name = "redis-3.4.1";
1831 name = "redis-3.4.1";
1832 doCheck = false;
1832 doCheck = false;
1833 src = fetchurl {
1833 src = fetchurl {
1834 url = "https://files.pythonhosted.org/packages/ef/2e/2c0f59891db7db087a7eeaa79bc7c7f2c039e71a2b5b0a41391e9d462926/redis-3.4.1.tar.gz";
1834 url = "https://files.pythonhosted.org/packages/ef/2e/2c0f59891db7db087a7eeaa79bc7c7f2c039e71a2b5b0a41391e9d462926/redis-3.4.1.tar.gz";
1835 sha256 = "07yaj0j9fs7xdkg5bg926fa990khyigjbp31si8ai20vj8sv7kqd";
1835 sha256 = "07yaj0j9fs7xdkg5bg926fa990khyigjbp31si8ai20vj8sv7kqd";
1836 };
1836 };
1837 meta = {
1837 meta = {
1838 license = [ pkgs.lib.licenses.mit ];
1838 license = [ pkgs.lib.licenses.mit ];
1839 };
1839 };
1840 };
1840 };
1841 "repoze.lru" = super.buildPythonPackage {
1841 "repoze.lru" = super.buildPythonPackage {
1842 name = "repoze.lru-0.7";
1842 name = "repoze.lru-0.7";
1843 doCheck = false;
1843 doCheck = false;
1844 src = fetchurl {
1844 src = fetchurl {
1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1845 url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1846 sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
1847 };
1847 };
1848 meta = {
1848 meta = {
1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1849 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1850 };
1850 };
1851 };
1851 };
1852 "repoze.sendmail" = super.buildPythonPackage {
1852 "repoze.sendmail" = super.buildPythonPackage {
1853 name = "repoze.sendmail-4.4.1";
1853 name = "repoze.sendmail-4.4.1";
1854 doCheck = false;
1854 doCheck = false;
1855 propagatedBuildInputs = [
1855 propagatedBuildInputs = [
1856 self."setuptools"
1856 self."setuptools"
1857 self."zope.interface"
1857 self."zope.interface"
1858 self."transaction"
1858 self."transaction"
1859 ];
1859 ];
1860 src = fetchurl {
1860 src = fetchurl {
1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1861 url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1862 sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
1863 };
1863 };
1864 meta = {
1864 meta = {
1865 license = [ pkgs.lib.licenses.zpl21 ];
1865 license = [ pkgs.lib.licenses.zpl21 ];
1866 };
1866 };
1867 };
1867 };
1868 "requests" = super.buildPythonPackage {
1868 "requests" = super.buildPythonPackage {
1869 name = "requests-2.22.0";
1869 name = "requests-2.22.0";
1870 doCheck = false;
1870 doCheck = false;
1871 propagatedBuildInputs = [
1871 propagatedBuildInputs = [
1872 self."chardet"
1872 self."chardet"
1873 self."idna"
1873 self."idna"
1874 self."urllib3"
1874 self."urllib3"
1875 self."certifi"
1875 self."certifi"
1876 ];
1876 ];
1877 src = fetchurl {
1877 src = fetchurl {
1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1878 url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1879 sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
1880 };
1880 };
1881 meta = {
1881 meta = {
1882 license = [ pkgs.lib.licenses.asl20 ];
1882 license = [ pkgs.lib.licenses.asl20 ];
1883 };
1883 };
1884 };
1884 };
1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1885 "rhodecode-enterprise-ce" = super.buildPythonPackage {
1886 name = "rhodecode-enterprise-ce-4.22.0";
1886 name = "rhodecode-enterprise-ce-4.23.0";
1887 buildInputs = [
1888 self."pytest"
1889 self."py"
1890 self."pytest-cov"
1891 self."pytest-sugar"
1892 self."pytest-runner"
1893 self."pytest-profiling"
1894 self."pytest-timeout"
1895 self."gprof2dot"
1896 self."mock"
1897 self."cov-core"
1898 self."coverage"
1899 self."webtest"
1900 self."beautifulsoup4"
1901 self."configobj"
1902 ];
1903 doCheck = true;
1904 propagatedBuildInputs = [
1905 self."amqp"
1906 self."babel"
1907 self."beaker"
1908 self."bleach"
1909 self."celery"
1910 self."channelstream"
1911 self."click"
1912 self."colander"
1913 self."configobj"
1914 self."cssselect"
1915 self."cryptography"
1916 self."decorator"
1917 self."deform"
1918 self."docutils"
1919 self."dogpile.cache"
1920 self."dogpile.core"
1921 self."formencode"
1922 self."future"
1923 self."futures"
1924 self."infrae.cache"
1925 self."iso8601"
1926 self."itsdangerous"
1927 self."kombu"
1928 self."lxml"
1929 self."mako"
1930 self."markdown"
1931 self."markupsafe"
1932 self."msgpack-python"
1933 self."pyotp"
1934 self."packaging"
1935 self."pathlib2"
1936 self."paste"
1937 self."pastedeploy"
1938 self."pastescript"
1939 self."peppercorn"
1940 self."premailer"
1941 self."psutil"
1942 self."py-bcrypt"
1943 self."pycurl"
1944 self."pycrypto"
1945 self."pygments"
1946 self."pyparsing"
1947 self."pyramid-debugtoolbar"
1948 self."pyramid-mako"
1949 self."pyramid"
1950 self."pyramid-mailer"
1951 self."python-dateutil"
1952 self."python-ldap"
1953 self."python-memcached"
1954 self."python-pam"
1955 self."python-saml"
1956 self."pytz"
1957 self."tzlocal"
1958 self."pyzmq"
1959 self."py-gfm"
1960 self."regex"
1961 self."redis"
1962 self."repoze.lru"
1963 self."requests"
1964 self."routes"
1965 self."simplejson"
1966 self."six"
1967 self."sqlalchemy"
1968 self."sshpubkeys"
1969 self."subprocess32"
1970 self."supervisor"
1971 self."translationstring"
1972 self."urllib3"
1973 self."urlobject"
1974 self."venusian"
1975 self."weberror"
1976 self."webhelpers2"
1977 self."webob"
1978 self."whoosh"
1979 self."wsgiref"
1980 self."zope.cachedescriptors"
1981 self."zope.deprecation"
1982 self."zope.event"
1983 self."zope.interface"
1984 self."mysql-python"
1985 self."pymysql"
1986 self."pysqlite"
1987 self."psycopg2"
1988 self."nbconvert"
1989 self."nbformat"
1990 self."jupyter-client"
1991 self."jupyter-core"
1992 self."alembic"
1993 self."invoke"
1994 self."bumpversion"
1995 self."gevent"
1996 self."greenlet"
1997 self."gunicorn"
1998 self."waitress"
1999 self."ipdb"
2000 self."ipython"
2001 self."rhodecode-tools"
2002 self."appenlight-client"
2003 self."pytest"
2004 self."py"
2005 self."pytest-cov"
2006 self."pytest-sugar"
2007 self."pytest-runner"
2008 self."pytest-profiling"
2009 self."pytest-timeout"
2010 self."gprof2dot"
2011 self."mock"
2012 self."cov-core"
2013 self."coverage"
2014 self."webtest"
2015 self."beautifulsoup4"
2016 ];
2017 src = ./.;
2018 meta = {
2019 license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
2020 };
2021 };
2022 "rhodecode-tools" = super.buildPythonPackage {
2022 "rhodecode-tools" = super.buildPythonPackage {
2023 name = "rhodecode-tools-1.4.0";
2023 name = "rhodecode-tools-1.4.0";
2024 doCheck = false;
2024 doCheck = false;
2025 propagatedBuildInputs = [
2025 propagatedBuildInputs = [
2026 self."click"
2026 self."click"
2027 self."future"
2027 self."future"
2028 self."six"
2028 self."six"
2029 self."mako"
2029 self."mako"
2030 self."markupsafe"
2030 self."markupsafe"
2031 self."requests"
2031 self."requests"
2032 self."urllib3"
2032 self."urllib3"
2033 self."whoosh"
2033 self."whoosh"
2034 self."elasticsearch"
2034 self."elasticsearch"
2035 self."elasticsearch-dsl"
2035 self."elasticsearch-dsl"
2036 self."elasticsearch2"
2036 self."elasticsearch2"
2037 self."elasticsearch1-dsl"
2037 self."elasticsearch1-dsl"
2038 ];
2038 ];
2039 src = fetchurl {
2039 src = fetchurl {
2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2040 url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a";
2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2041 sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n";
2042 };
2042 };
2043 meta = {
2043 meta = {
2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2044 license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
2045 };
2045 };
2046 };
2046 };
2047 "routes" = super.buildPythonPackage {
2047 "routes" = super.buildPythonPackage {
2048 name = "routes-2.4.1";
2048 name = "routes-2.4.1";
2049 doCheck = false;
2049 doCheck = false;
2050 propagatedBuildInputs = [
2050 propagatedBuildInputs = [
2051 self."six"
2051 self."six"
2052 self."repoze.lru"
2052 self."repoze.lru"
2053 ];
2053 ];
2054 src = fetchurl {
2054 src = fetchurl {
2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2055 url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2056 sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
2057 };
2057 };
2058 meta = {
2058 meta = {
2059 license = [ pkgs.lib.licenses.mit ];
2059 license = [ pkgs.lib.licenses.mit ];
2060 };
2060 };
2061 };
2061 };
2062 "scandir" = super.buildPythonPackage {
2062 "scandir" = super.buildPythonPackage {
2063 name = "scandir-1.10.0";
2063 name = "scandir-1.10.0";
2064 doCheck = false;
2064 doCheck = false;
2065 src = fetchurl {
2065 src = fetchurl {
2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2066 url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2067 sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
2068 };
2068 };
2069 meta = {
2069 meta = {
2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2070 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
2071 };
2071 };
2072 };
2072 };
2073 "setproctitle" = super.buildPythonPackage {
2073 "setproctitle" = super.buildPythonPackage {
2074 name = "setproctitle-1.1.10";
2074 name = "setproctitle-1.1.10";
2075 doCheck = false;
2075 doCheck = false;
2076 src = fetchurl {
2076 src = fetchurl {
2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2077 url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2078 sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
2079 };
2079 };
2080 meta = {
2080 meta = {
2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2081 license = [ pkgs.lib.licenses.bsdOriginal ];
2082 };
2082 };
2083 };
2083 };
2084 "setuptools" = super.buildPythonPackage {
2084 "setuptools" = super.buildPythonPackage {
2085 name = "setuptools-44.1.0";
2085 name = "setuptools-44.1.0";
2086 doCheck = false;
2086 doCheck = false;
2087 src = fetchurl {
2087 src = fetchurl {
2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2088 url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2089 sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
2090 };
2090 };
2091 meta = {
2091 meta = {
2092 license = [ pkgs.lib.licenses.mit ];
2092 license = [ pkgs.lib.licenses.mit ];
2093 };
2093 };
2094 };
2094 };
2095 "simplegeneric" = super.buildPythonPackage {
2095 "simplegeneric" = super.buildPythonPackage {
2096 name = "simplegeneric-0.8.1";
2096 name = "simplegeneric-0.8.1";
2097 doCheck = false;
2097 doCheck = false;
2098 src = fetchurl {
2098 src = fetchurl {
2099 url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
2099 url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
2100 sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
2100 sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
2101 };
2101 };
2102 meta = {
2102 meta = {
2103 license = [ pkgs.lib.licenses.zpl21 ];
2103 license = [ pkgs.lib.licenses.zpl21 ];
2104 };
2104 };
2105 };
2105 };
2106 "simplejson" = super.buildPythonPackage {
2106 "simplejson" = super.buildPythonPackage {
2107 name = "simplejson-3.16.0";
2107 name = "simplejson-3.16.0";
2108 doCheck = false;
2108 doCheck = false;
2109 src = fetchurl {
2109 src = fetchurl {
2110 url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
2110 url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
2111 sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
2111 sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
2112 };
2112 };
2113 meta = {
2113 meta = {
2114 license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
2114 license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
2115 };
2115 };
2116 };
2116 };
2117 "six" = super.buildPythonPackage {
2117 "six" = super.buildPythonPackage {
2118 name = "six-1.11.0";
2118 name = "six-1.11.0";
2119 doCheck = false;
2119 doCheck = false;
2120 src = fetchurl {
2120 src = fetchurl {
2121 url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
2121 url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
2122 sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
2122 sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
2123 };
2123 };
2124 meta = {
2124 meta = {
2125 license = [ pkgs.lib.licenses.mit ];
2125 license = [ pkgs.lib.licenses.mit ];
2126 };
2126 };
2127 };
2127 };
2128 "sqlalchemy" = super.buildPythonPackage {
2128 "sqlalchemy" = super.buildPythonPackage {
2129 name = "sqlalchemy-1.3.15";
2129 name = "sqlalchemy-1.3.15";
2130 doCheck = false;
2130 doCheck = false;
2131 src = fetchurl {
2131 src = fetchurl {
2132 url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
2132 url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
2133 sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
2133 sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
2134 };
2134 };
2135 meta = {
2135 meta = {
2136 license = [ pkgs.lib.licenses.mit ];
2136 license = [ pkgs.lib.licenses.mit ];
2137 };
2137 };
2138 };
2138 };
2139 "sshpubkeys" = super.buildPythonPackage {
2139 "sshpubkeys" = super.buildPythonPackage {
2140 name = "sshpubkeys-3.1.0";
2140 name = "sshpubkeys-3.1.0";
2141 doCheck = false;
2141 doCheck = false;
2142 propagatedBuildInputs = [
2142 propagatedBuildInputs = [
2143 self."cryptography"
2143 self."cryptography"
2144 self."ecdsa"
2144 self."ecdsa"
2145 ];
2145 ];
2146 src = fetchurl {
2146 src = fetchurl {
2147 url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
2147 url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
2148 sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
2148 sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
2149 };
2149 };
2150 meta = {
2150 meta = {
2151 license = [ pkgs.lib.licenses.bsdOriginal ];
2151 license = [ pkgs.lib.licenses.bsdOriginal ];
2152 };
2152 };
2153 };
2153 };
2154 "subprocess32" = super.buildPythonPackage {
2154 "subprocess32" = super.buildPythonPackage {
2155 name = "subprocess32-3.5.4";
2155 name = "subprocess32-3.5.4";
2156 doCheck = false;
2156 doCheck = false;
2157 src = fetchurl {
2157 src = fetchurl {
2158 url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
2158 url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
2159 sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
2159 sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
2160 };
2160 };
2161 meta = {
2161 meta = {
2162 license = [ pkgs.lib.licenses.psfl ];
2162 license = [ pkgs.lib.licenses.psfl ];
2163 };
2163 };
2164 };
2164 };
2165 "supervisor" = super.buildPythonPackage {
2165 "supervisor" = super.buildPythonPackage {
2166 name = "supervisor-4.1.0";
2166 name = "supervisor-4.1.0";
2167 doCheck = false;
2167 doCheck = false;
2168 src = fetchurl {
2168 src = fetchurl {
2169 url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
2169 url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
2170 sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
2170 sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
2171 };
2171 };
2172 meta = {
2172 meta = {
2173 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2173 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2174 };
2174 };
2175 };
2175 };
2176 "tempita" = super.buildPythonPackage {
2176 "tempita" = super.buildPythonPackage {
2177 name = "tempita-0.5.2";
2177 name = "tempita-0.5.2";
2178 doCheck = false;
2178 doCheck = false;
2179 src = fetchurl {
2179 src = fetchurl {
2180 url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
2180 url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
2181 sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
2181 sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
2182 };
2182 };
2183 meta = {
2183 meta = {
2184 license = [ pkgs.lib.licenses.mit ];
2184 license = [ pkgs.lib.licenses.mit ];
2185 };
2185 };
2186 };
2186 };
2187 "termcolor" = super.buildPythonPackage {
2187 "termcolor" = super.buildPythonPackage {
2188 name = "termcolor-1.1.0";
2188 name = "termcolor-1.1.0";
2189 doCheck = false;
2189 doCheck = false;
2190 src = fetchurl {
2190 src = fetchurl {
2191 url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
2191 url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
2192 sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
2192 sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
2193 };
2193 };
2194 meta = {
2194 meta = {
2195 license = [ pkgs.lib.licenses.mit ];
2195 license = [ pkgs.lib.licenses.mit ];
2196 };
2196 };
2197 };
2197 };
2198 "testpath" = super.buildPythonPackage {
2198 "testpath" = super.buildPythonPackage {
2199 name = "testpath-0.4.4";
2199 name = "testpath-0.4.4";
2200 doCheck = false;
2200 doCheck = false;
2201 src = fetchurl {
2201 src = fetchurl {
2202 url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
2202 url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
2203 sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
2203 sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
2204 };
2204 };
2205 meta = {
2205 meta = {
2206 license = [ ];
2206 license = [ ];
2207 };
2207 };
2208 };
2208 };
2209 "traitlets" = super.buildPythonPackage {
2209 "traitlets" = super.buildPythonPackage {
2210 name = "traitlets-4.3.3";
2210 name = "traitlets-4.3.3";
2211 doCheck = false;
2211 doCheck = false;
2212 propagatedBuildInputs = [
2212 propagatedBuildInputs = [
2213 self."ipython-genutils"
2213 self."ipython-genutils"
2214 self."six"
2214 self."six"
2215 self."decorator"
2215 self."decorator"
2216 self."enum34"
2216 self."enum34"
2217 ];
2217 ];
2218 src = fetchurl {
2218 src = fetchurl {
2219 url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
2219 url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
2220 sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
2220 sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
2221 };
2221 };
2222 meta = {
2222 meta = {
2223 license = [ pkgs.lib.licenses.bsdOriginal ];
2223 license = [ pkgs.lib.licenses.bsdOriginal ];
2224 };
2224 };
2225 };
2225 };
2226 "transaction" = super.buildPythonPackage {
2226 "transaction" = super.buildPythonPackage {
2227 name = "transaction-2.4.0";
2227 name = "transaction-2.4.0";
2228 doCheck = false;
2228 doCheck = false;
2229 propagatedBuildInputs = [
2229 propagatedBuildInputs = [
2230 self."zope.interface"
2230 self."zope.interface"
2231 ];
2231 ];
2232 src = fetchurl {
2232 src = fetchurl {
2233 url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
2233 url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
2234 sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
2234 sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
2235 };
2235 };
2236 meta = {
2236 meta = {
2237 license = [ pkgs.lib.licenses.zpl21 ];
2237 license = [ pkgs.lib.licenses.zpl21 ];
2238 };
2238 };
2239 };
2239 };
2240 "translationstring" = super.buildPythonPackage {
2240 "translationstring" = super.buildPythonPackage {
2241 name = "translationstring-1.3";
2241 name = "translationstring-1.3";
2242 doCheck = false;
2242 doCheck = false;
2243 src = fetchurl {
2243 src = fetchurl {
2244 url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
2244 url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
2245 sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
2245 sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
2246 };
2246 };
2247 meta = {
2247 meta = {
2248 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
2248 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
2249 };
2249 };
2250 };
2250 };
2251 "tzlocal" = super.buildPythonPackage {
2251 "tzlocal" = super.buildPythonPackage {
2252 name = "tzlocal-1.5.1";
2252 name = "tzlocal-1.5.1";
2253 doCheck = false;
2253 doCheck = false;
2254 propagatedBuildInputs = [
2254 propagatedBuildInputs = [
2255 self."pytz"
2255 self."pytz"
2256 ];
2256 ];
2257 src = fetchurl {
2257 src = fetchurl {
2258 url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
2258 url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
2259 sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
2259 sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
2260 };
2260 };
2261 meta = {
2261 meta = {
2262 license = [ pkgs.lib.licenses.mit ];
2262 license = [ pkgs.lib.licenses.mit ];
2263 };
2263 };
2264 };
2264 };
2265 "urllib3" = super.buildPythonPackage {
2265 "urllib3" = super.buildPythonPackage {
2266 name = "urllib3-1.25.2";
2266 name = "urllib3-1.25.2";
2267 doCheck = false;
2267 doCheck = false;
2268 src = fetchurl {
2268 src = fetchurl {
2269 url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
2269 url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
2270 sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
2270 sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
2271 };
2271 };
2272 meta = {
2272 meta = {
2273 license = [ pkgs.lib.licenses.mit ];
2273 license = [ pkgs.lib.licenses.mit ];
2274 };
2274 };
2275 };
2275 };
2276 "urlobject" = super.buildPythonPackage {
2276 "urlobject" = super.buildPythonPackage {
2277 name = "urlobject-2.4.3";
2277 name = "urlobject-2.4.3";
2278 doCheck = false;
2278 doCheck = false;
2279 src = fetchurl {
2279 src = fetchurl {
2280 url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
2280 url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
2281 sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
2281 sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
2282 };
2282 };
2283 meta = {
2283 meta = {
2284 license = [ pkgs.lib.licenses.publicDomain ];
2284 license = [ pkgs.lib.licenses.publicDomain ];
2285 };
2285 };
2286 };
2286 };
2287 "venusian" = super.buildPythonPackage {
2287 "venusian" = super.buildPythonPackage {
2288 name = "venusian-1.2.0";
2288 name = "venusian-1.2.0";
2289 doCheck = false;
2289 doCheck = false;
2290 src = fetchurl {
2290 src = fetchurl {
2291 url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
2291 url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
2292 sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
2292 sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
2293 };
2293 };
2294 meta = {
2294 meta = {
2295 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2295 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
2296 };
2296 };
2297 };
2297 };
2298 "vine" = super.buildPythonPackage {
2298 "vine" = super.buildPythonPackage {
2299 name = "vine-1.3.0";
2299 name = "vine-1.3.0";
2300 doCheck = false;
2300 doCheck = false;
2301 src = fetchurl {
2301 src = fetchurl {
2302 url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
2302 url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
2303 sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
2303 sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
2304 };
2304 };
2305 meta = {
2305 meta = {
2306 license = [ pkgs.lib.licenses.bsdOriginal ];
2306 license = [ pkgs.lib.licenses.bsdOriginal ];
2307 };
2307 };
2308 };
2308 };
2309 "waitress" = super.buildPythonPackage {
2309 "waitress" = super.buildPythonPackage {
2310 name = "waitress-1.3.1";
2310 name = "waitress-1.3.1";
2311 doCheck = false;
2311 doCheck = false;
2312 src = fetchurl {
2312 src = fetchurl {
2313 url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
2313 url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
2314 sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
2314 sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
2315 };
2315 };
2316 meta = {
2316 meta = {
2317 license = [ pkgs.lib.licenses.zpl21 ];
2317 license = [ pkgs.lib.licenses.zpl21 ];
2318 };
2318 };
2319 };
2319 };
2320 "wcwidth" = super.buildPythonPackage {
2320 "wcwidth" = super.buildPythonPackage {
2321 name = "wcwidth-0.1.9";
2321 name = "wcwidth-0.1.9";
2322 doCheck = false;
2322 doCheck = false;
2323 src = fetchurl {
2323 src = fetchurl {
2324 url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
2324 url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
2325 sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
2325 sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
2326 };
2326 };
2327 meta = {
2327 meta = {
2328 license = [ pkgs.lib.licenses.mit ];
2328 license = [ pkgs.lib.licenses.mit ];
2329 };
2329 };
2330 };
2330 };
2331 "webencodings" = super.buildPythonPackage {
2331 "webencodings" = super.buildPythonPackage {
2332 name = "webencodings-0.5.1";
2332 name = "webencodings-0.5.1";
2333 doCheck = false;
2333 doCheck = false;
2334 src = fetchurl {
2334 src = fetchurl {
2335 url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
2335 url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
2336 sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
2336 sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
2337 };
2337 };
2338 meta = {
2338 meta = {
2339 license = [ pkgs.lib.licenses.bsdOriginal ];
2339 license = [ pkgs.lib.licenses.bsdOriginal ];
2340 };
2340 };
2341 };
2341 };
2342 "weberror" = super.buildPythonPackage {
2342 "weberror" = super.buildPythonPackage {
2343 name = "weberror-0.13.1";
2343 name = "weberror-0.13.1";
2344 doCheck = false;
2344 doCheck = false;
2345 propagatedBuildInputs = [
2345 propagatedBuildInputs = [
2346 self."webob"
2346 self."webob"
2347 self."tempita"
2347 self."tempita"
2348 self."pygments"
2348 self."pygments"
2349 self."paste"
2349 self."paste"
2350 ];
2350 ];
2351 src = fetchurl {
2351 src = fetchurl {
2352 url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
2352 url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
2353 sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
2353 sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
2354 };
2354 };
2355 meta = {
2355 meta = {
2356 license = [ pkgs.lib.licenses.mit ];
2356 license = [ pkgs.lib.licenses.mit ];
2357 };
2357 };
2358 };
2358 };
2359 "webhelpers2" = super.buildPythonPackage {
2359 "webhelpers2" = super.buildPythonPackage {
2360 name = "webhelpers2-2.0";
2360 name = "webhelpers2-2.0";
2361 doCheck = false;
2361 doCheck = false;
2362 propagatedBuildInputs = [
2362 propagatedBuildInputs = [
2363 self."markupsafe"
2363 self."markupsafe"
2364 self."six"
2364 self."six"
2365 ];
2365 ];
2366 src = fetchurl {
2366 src = fetchurl {
2367 url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
2367 url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
2368 sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
2368 sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
2369 };
2369 };
2370 meta = {
2370 meta = {
2371 license = [ pkgs.lib.licenses.mit ];
2371 license = [ pkgs.lib.licenses.mit ];
2372 };
2372 };
2373 };
2373 };
2374 "webob" = super.buildPythonPackage {
2374 "webob" = super.buildPythonPackage {
2375 name = "webob-1.8.5";
2375 name = "webob-1.8.5";
2376 doCheck = false;
2376 doCheck = false;
2377 src = fetchurl {
2377 src = fetchurl {
2378 url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
2378 url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
2379 sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
2379 sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
2380 };
2380 };
2381 meta = {
2381 meta = {
2382 license = [ pkgs.lib.licenses.mit ];
2382 license = [ pkgs.lib.licenses.mit ];
2383 };
2383 };
2384 };
2384 };
2385 "webtest" = super.buildPythonPackage {
2385 "webtest" = super.buildPythonPackage {
2386 name = "webtest-2.0.34";
2386 name = "webtest-2.0.34";
2387 doCheck = false;
2387 doCheck = false;
2388 propagatedBuildInputs = [
2388 propagatedBuildInputs = [
2389 self."six"
2389 self."six"
2390 self."webob"
2390 self."webob"
2391 self."waitress"
2391 self."waitress"
2392 self."beautifulsoup4"
2392 self."beautifulsoup4"
2393 ];
2393 ];
2394 src = fetchurl {
2394 src = fetchurl {
2395 url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
2395 url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
2396 sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
2396 sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
2397 };
2397 };
2398 meta = {
2398 meta = {
2399 license = [ pkgs.lib.licenses.mit ];
2399 license = [ pkgs.lib.licenses.mit ];
2400 };
2400 };
2401 };
2401 };
2402 "whoosh" = super.buildPythonPackage {
2402 "whoosh" = super.buildPythonPackage {
2403 name = "whoosh-2.7.4";
2403 name = "whoosh-2.7.4";
2404 doCheck = false;
2404 doCheck = false;
2405 src = fetchurl {
2405 src = fetchurl {
2406 url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
2406 url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
2407 sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
2407 sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
2408 };
2408 };
2409 meta = {
2409 meta = {
2410 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
2410 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
2411 };
2411 };
2412 };
2412 };
2413 "ws4py" = super.buildPythonPackage {
2413 "ws4py" = super.buildPythonPackage {
2414 name = "ws4py-0.5.1";
2414 name = "ws4py-0.5.1";
2415 doCheck = false;
2415 doCheck = false;
2416 src = fetchurl {
2416 src = fetchurl {
2417 url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
2417 url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
2418 sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
2418 sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
2419 };
2419 };
2420 meta = {
2420 meta = {
2421 license = [ pkgs.lib.licenses.bsdOriginal ];
2421 license = [ pkgs.lib.licenses.bsdOriginal ];
2422 };
2422 };
2423 };
2423 };
2424 "wsgiref" = super.buildPythonPackage {
2424 "wsgiref" = super.buildPythonPackage {
2425 name = "wsgiref-0.1.2";
2425 name = "wsgiref-0.1.2";
2426 doCheck = false;
2426 doCheck = false;
2427 src = fetchurl {
2427 src = fetchurl {
2428 url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
2428 url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
2429 sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
2429 sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
2430 };
2430 };
2431 meta = {
2431 meta = {
2432 license = [ { fullName = "PSF or ZPL"; } ];
2432 license = [ { fullName = "PSF or ZPL"; } ];
2433 };
2433 };
2434 };
2434 };
2435 "zipp" = super.buildPythonPackage {
2435 "zipp" = super.buildPythonPackage {
2436 name = "zipp-1.2.0";
2436 name = "zipp-1.2.0";
2437 doCheck = false;
2437 doCheck = false;
2438 propagatedBuildInputs = [
2438 propagatedBuildInputs = [
2439 self."contextlib2"
2439 self."contextlib2"
2440 ];
2440 ];
2441 src = fetchurl {
2441 src = fetchurl {
2442 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2442 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2443 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2443 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2444 };
2444 };
2445 meta = {
2445 meta = {
2446 license = [ pkgs.lib.licenses.mit ];
2446 license = [ pkgs.lib.licenses.mit ];
2447 };
2447 };
2448 };
2448 };
2449 "zope.cachedescriptors" = super.buildPythonPackage {
2449 "zope.cachedescriptors" = super.buildPythonPackage {
2450 name = "zope.cachedescriptors-4.3.1";
2450 name = "zope.cachedescriptors-4.3.1";
2451 doCheck = false;
2451 doCheck = false;
2452 propagatedBuildInputs = [
2452 propagatedBuildInputs = [
2453 self."setuptools"
2453 self."setuptools"
2454 ];
2454 ];
2455 src = fetchurl {
2455 src = fetchurl {
2456 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2456 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2457 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2457 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2458 };
2458 };
2459 meta = {
2459 meta = {
2460 license = [ pkgs.lib.licenses.zpl21 ];
2460 license = [ pkgs.lib.licenses.zpl21 ];
2461 };
2461 };
2462 };
2462 };
2463 "zope.deprecation" = super.buildPythonPackage {
2463 "zope.deprecation" = super.buildPythonPackage {
2464 name = "zope.deprecation-4.4.0";
2464 name = "zope.deprecation-4.4.0";
2465 doCheck = false;
2465 doCheck = false;
2466 propagatedBuildInputs = [
2466 propagatedBuildInputs = [
2467 self."setuptools"
2467 self."setuptools"
2468 ];
2468 ];
2469 src = fetchurl {
2469 src = fetchurl {
2470 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2470 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2471 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2471 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2472 };
2472 };
2473 meta = {
2473 meta = {
2474 license = [ pkgs.lib.licenses.zpl21 ];
2474 license = [ pkgs.lib.licenses.zpl21 ];
2475 };
2475 };
2476 };
2476 };
2477 "zope.event" = super.buildPythonPackage {
2477 "zope.event" = super.buildPythonPackage {
2478 name = "zope.event-4.4";
2478 name = "zope.event-4.4";
2479 doCheck = false;
2479 doCheck = false;
2480 propagatedBuildInputs = [
2480 propagatedBuildInputs = [
2481 self."setuptools"
2481 self."setuptools"
2482 ];
2482 ];
2483 src = fetchurl {
2483 src = fetchurl {
2484 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2484 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2485 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2485 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2486 };
2486 };
2487 meta = {
2487 meta = {
2488 license = [ pkgs.lib.licenses.zpl21 ];
2488 license = [ pkgs.lib.licenses.zpl21 ];
2489 };
2489 };
2490 };
2490 };
2491 "zope.interface" = super.buildPythonPackage {
2491 "zope.interface" = super.buildPythonPackage {
2492 name = "zope.interface-4.6.0";
2492 name = "zope.interface-4.6.0";
2493 doCheck = false;
2493 doCheck = false;
2494 propagatedBuildInputs = [
2494 propagatedBuildInputs = [
2495 self."setuptools"
2495 self."setuptools"
2496 ];
2496 ];
2497 src = fetchurl {
2497 src = fetchurl {
2498 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2498 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2499 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2499 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2500 };
2500 };
2501 meta = {
2501 meta = {
2502 license = [ pkgs.lib.licenses.zpl21 ];
2502 license = [ pkgs.lib.licenses.zpl21 ];
2503 };
2503 };
2504 };
2504 };
2505
2505
2506 ### Test requirements
2506 ### Test requirements
2507
2507
2508
2508
2509 }
2509 }
@@ -1,1 +1,1 b''
1 4.22.0 No newline at end of file
1 4.23.0 No newline at end of file
@@ -1,60 +1,60 b''
1 # -*- coding: utf-8 -*-
2 
3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 
21 import os
22 from collections import OrderedDict
23 
24 import sys
25 import platform
26 
27 VERSION = tuple(open(os.path.join(
28 os.path.dirname(__file__), 'VERSION')).read().split('.'))
29 
30 BACKENDS = OrderedDict()
31 
32 BACKENDS['hg'] = 'Mercurial repository'
33 BACKENDS['git'] = 'Git repository'
34 BACKENDS['svn'] = 'Subversion repository'
35 
36 
37 CELERY_ENABLED = False
38 CELERY_EAGER = False
39 
40 # link to config for pyramid
41 CONFIG = {}
42 
43 # Populated with the settings dictionary from application init in
44 # rhodecode.conf.environment.load_pyramid_environment
45 PYRAMID_SETTINGS = {}
46 
47 # Linked module for extensions
48 EXTENSIONS = {}
49 
50 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
51 __dbversion__ = 110 # defines current db version for migrations
51 __dbversion__ = 112 # defines current db version for migrations
52 __platform__ = platform.system()
53 __license__ = 'AGPLv3, and Commercial License'
54 __author__ = 'RhodeCode GmbH'
55 __url__ = 'https://code.rhodecode.com'
56 
57 is_windows = __platform__ in ['Windows']
58 is_unix = not is_windows
59 is_test = False
60 disable_error_handler = False
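
For context, the two constants touched above follow directly from the VERSION file bump in this release. The snippet below is an illustrative sketch only, not part of the diff; the file path is simplified and the evaluated values are inferred from the code shown:

    # Illustrative sketch (not part of the diff): how rhodecode/__init__.py
    # derives its version constants from the bumped VERSION file.
    VERSION = tuple(open('VERSION').read().split('.'))          # ('4', '23', '0')
    __version__ = '.'.join(str(each) for each in VERSION[:3])   # '4.23.0'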
@@ -1,452 +1,458 b''
1 # -*- coding: utf-8 -*-
2 
3 # Copyright (C) 2014-2020 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 
21 """
22 JSON RPC utils
23 """
24 
25 import collections
26 import logging
27 
28 from rhodecode.api.exc import JSONRPCError
29 from rhodecode.lib.auth import (
30 HasPermissionAnyApi, HasRepoPermissionAnyApi, HasRepoGroupPermissionAnyApi)
31 from rhodecode.lib.utils import safe_unicode
32 from rhodecode.lib.vcs.exceptions import RepositoryError
33 from rhodecode.lib.view_utils import get_commit_from_ref_name
34 from rhodecode.lib.utils2 import str2bool
35 
36 log = logging.getLogger(__name__)
37 
38 
39 class OAttr(object):
40 """
41 Special Option that defines other attribute, and can default to them
42 
43 Example::
44 
45 def test(apiuser, userid=Optional(OAttr('apiuser')):
46 user = Optional.extract(userid, evaluate_locals=local())
47 #if we pass in userid, we get it, else it will default to apiuser
48 #attribute
49 """
50 
51 def __init__(self, attr_name):
52 self.attr_name = attr_name
53 
54 def __repr__(self):
55 return '<OptionalAttr:%s>' % self.attr_name
56 
57 def __call__(self):
58 return self
59 
60 
61 class Optional(object):
62 """
63 Defines an optional parameter::
64 
65 param = param.getval() if isinstance(param, Optional) else param
66 param = param() if isinstance(param, Optional) else param
67 
68 is equivalent of::
69 
70 param = Optional.extract(param)
71 
72 """
73 
74 def __init__(self, type_):
75 self.type_ = type_
76 
77 def __repr__(self):
78 return '<Optional:%s>' % self.type_.__repr__()
79 
80 def __call__(self):
81 return self.getval()
82 
83 def getval(self, evaluate_locals=None):
84 """
85 returns value from this Optional instance
86 """
87 if isinstance(self.type_, OAttr):
88 param_name = self.type_.attr_name
89 if evaluate_locals:
90 return evaluate_locals[param_name]
91 # use params name
92 return param_name
93 return self.type_
94 
95 @classmethod
96 def extract(cls, val, evaluate_locals=None, binary=None):
97 """
98 Extracts value from Optional() instance
99 
100 :param val:
101 :return: original value if it's not Optional instance else
102 value of instance
103 """
104 if isinstance(val, cls):
105 val = val.getval(evaluate_locals)
106 
107 if binary:
108 val = str2bool(val)
109 
110 return val
111 
112 
113 def parse_args(cli_args, key_prefix=''):
114 from rhodecode.lib.utils2 import (escape_split)
115 kwargs = collections.defaultdict(dict)
116 for el in escape_split(cli_args, ','):
117 kv = escape_split(el, '=', 1)
118 if len(kv) == 2:
119 k, v = kv
120 kwargs[key_prefix + k] = v
121 return kwargs
122 
123 
124 def get_origin(obj):
125 """
126 Get origin of permission from object.
127 
128 :param obj:
129 """
130 origin = 'permission'
131 
132 if getattr(obj, 'owner_row', '') and getattr(obj, 'admin_row', ''):
133 # admin and owner case, maybe we should use dual string ?
134 origin = 'owner'
135 elif getattr(obj, 'owner_row', ''):
136 origin = 'owner'
137 elif getattr(obj, 'admin_row', ''):
138 origin = 'super-admin'
139 return origin
140 
141 
142 def store_update(updates, attr, name):
143 """
144 Stores param in updates dict if it's not instance of Optional
145 allows easy updates of passed in params
146 """
147 if not isinstance(attr, Optional):
148 updates[name] = attr
149 
150 
151 def has_superadmin_permission(apiuser):
152 """
153 Return True if apiuser is admin or return False
154 
155 :param apiuser:
156 """
157 if HasPermissionAnyApi('hg.admin')(user=apiuser):
158 return True
159 return False
160 
161 
162 def validate_repo_permissions(apiuser, repoid, repo, perms):
163 """
164 Raise JsonRPCError if apiuser is not authorized or return True
165 
166 :param apiuser:
167 :param repoid:
168 :param repo:
169 :param perms:
170 """
171 if not HasRepoPermissionAnyApi(*perms)(
172 user=apiuser, repo_name=repo.repo_name):
173 raise JSONRPCError('repository `%s` does not exist' % repoid)
174 
175 return True
176 
177 
178 def validate_repo_group_permissions(apiuser, repogroupid, repo_group, perms):
179 """
180 Raise JsonRPCError if apiuser is not authorized or return True
181 
182 :param apiuser:
183 :param repogroupid: just the id of repository group
184 :param repo_group: instance of repo_group
185 :param perms:
186 """
187 if not HasRepoGroupPermissionAnyApi(*perms)(
188 user=apiuser, group_name=repo_group.group_name):
189 raise JSONRPCError(
190 'repository group `%s` does not exist' % repogroupid)
191 
192 return True
193 
194 
195 def validate_set_owner_permissions(apiuser, owner):
196 if isinstance(owner, Optional):
197 owner = get_user_or_error(apiuser.user_id)
198 else:
199 if has_superadmin_permission(apiuser):
200 owner = get_user_or_error(owner)
201 else:
202 # forbid setting owner for non-admins
203 raise JSONRPCError(
204 'Only RhodeCode super-admin can specify `owner` param')
205 return owner
206 
207 
208 def get_user_or_error(userid):
209 """
210 Get user by id or name or return JsonRPCError if not found
211 
212 :param userid:
213 """
214 from rhodecode.model.user import UserModel
215 user_model = UserModel()
216 
217 if isinstance(userid, (int, long)):
218 try:
219 user = user_model.get_user(userid)
220 except ValueError:
221 user = None
222 else:
223 user = user_model.get_by_username(userid)
224 
225 if user is None:
226 raise JSONRPCError(
227 'user `%s` does not exist' % (userid,))
228 return user
229 
230 
231 def get_repo_or_error(repoid):
232 """
233 Get repo by id or name or return JsonRPCError if not found
234 
235 :param repoid:
236 """
237 from rhodecode.model.repo import RepoModel
238 repo_model = RepoModel()
239 
240 if isinstance(repoid, (int, long)):
241 try:
242 repo = repo_model.get_repo(repoid)
243 except ValueError:
244 repo = None
245 else:
246 repo = repo_model.get_by_repo_name(repoid)
247 
248 if repo is None:
249 raise JSONRPCError(
250 'repository `%s` does not exist' % (repoid,))
251 return repo
252 
253 
254 def get_repo_group_or_error(repogroupid):
255 """
256 Get repo group by id or name or return JsonRPCError if not found
257 
258 :param repogroupid:
259 """
260 from rhodecode.model.repo_group import RepoGroupModel
261 repo_group_model = RepoGroupModel()
262 
263 if isinstance(repogroupid, (int, long)):
264 try:
265 repo_group = repo_group_model._get_repo_group(repogroupid)
266 except ValueError:
267 repo_group = None
268 else:
269 repo_group = repo_group_model.get_by_group_name(repogroupid)
270 
271 if repo_group is None:
272 raise JSONRPCError(
273 'repository group `%s` does not exist' % (repogroupid,))
274 return repo_group
275 
276 
277 def get_user_group_or_error(usergroupid):
278 """
279 Get user group by id or name or return JsonRPCError if not found
280 
281 :param usergroupid:
282 """
283 from rhodecode.model.user_group import UserGroupModel
284 user_group_model = UserGroupModel()
285 
286 if isinstance(usergroupid, (int, long)):
287 try:
288 user_group = user_group_model.get_group(usergroupid)
289 except ValueError:
290 user_group = None
291 else:
292 user_group = user_group_model.get_by_name(usergroupid)
293 
294 if user_group is None:
295 raise JSONRPCError(
296 'user group `%s` does not exist' % (usergroupid,))
297 return user_group
298 
299 
300 def get_perm_or_error(permid, prefix=None):
301 """
302 Get permission by id or name or return JsonRPCError if not found
303 
304 :param permid:
305 """
306 from rhodecode.model.permission import PermissionModel
307 
308 perm = PermissionModel.cls.get_by_key(permid)
309 if perm is None:
310 msg = 'permission `{}` does not exist.'.format(permid)
311 if prefix:
312 msg += ' Permission should start with prefix: `{}`'.format(prefix)
313 raise JSONRPCError(msg)
314 
315 if prefix:
316 if not perm.permission_name.startswith(prefix):
317 raise JSONRPCError('permission `%s` is invalid, '
318 'should start with %s' % (permid, prefix))
319 return perm
320 
321 
322 def get_gist_or_error(gistid):
323 """
324 Get gist by id or gist_access_id or return JsonRPCError if not found
325 
326 :param gistid:
327 """
328 from rhodecode.model.gist import GistModel
329 
330 gist = GistModel.cls.get_by_access_id(gistid)
331 if gist is None:
332 raise JSONRPCError('gist `%s` does not exist' % (gistid,))
333 return gist
334 
335 
336 def get_pull_request_or_error(pullrequestid):
337 """
338 Get pull request by id or return JsonRPCError if not found
339 
340 :param pullrequestid:
341 """
342 from rhodecode.model.pull_request import PullRequestModel
343 
344 try:
345 pull_request = PullRequestModel().get(int(pullrequestid))
346 except ValueError:
347 raise JSONRPCError('pullrequestid must be an integer')
348 if not pull_request:
349 raise JSONRPCError('pull request `%s` does not exist' % (
350 pullrequestid,))
351 return pull_request
352 
353 
354 def build_commit_data(commit, detail_level):
354 def build_commit_data(rhodecode_vcs_repo, commit, detail_level):
355 commit2 = commit
356 commit1 = commit.first_parent
357 
358 parsed_diff = []
359 if detail_level == 'extended':
360 for f_path in commit.added_paths:
361 parsed_diff.append(_get_commit_dict(filename=f_path, op='A'))
362 for f_path in commit.changed_paths:
363 parsed_diff.append(_get_commit_dict(filename=f_path, op='M'))
364 for f_path in commit.removed_paths:
365 parsed_diff.append(_get_commit_dict(filename=f_path, op='D'))
366 
367 elif detail_level == 'full':
365 from rhodecode.lib.diffs import DiffProcessor
366 diff_processor = DiffProcessor(commit.diff())
368 from rhodecode.lib import diffs
369 
370 _diff = rhodecode_vcs_repo.get_diff(commit1, commit2,)
371 diff_processor = diffs.DiffProcessor(_diff, format='newdiff', show_full_diff=True)
372 
373 for dp in diff_processor.prepare():
374 del dp['stats']['ops']
375 _stats = dp['stats']
376 parsed_diff.append(_get_commit_dict(
377 filename=dp['filename'], op=dp['operation'],
378 new_revision=dp['new_revision'],
379 old_revision=dp['old_revision'],
380 raw_diff=dp['raw_diff'], stats=_stats))
381 
382 return parsed_diff
383 
384 
385 def get_commit_or_error(ref, repo):
386 try:
387 ref_type, _, ref_hash = ref.split(':')
388 except ValueError:
389 raise JSONRPCError(
390 'Ref `{ref}` given in a wrong format. Please check the API'
391 ' documentation for more details'.format(ref=ref))
392 try:
393 # TODO: dan: refactor this to use repo.scm_instance().get_commit()
394 # once get_commit supports ref_types
395 return get_commit_from_ref_name(repo, ref_hash)
396 except RepositoryError:
397 raise JSONRPCError('Ref `{ref}` does not exist'.format(ref=ref))
398 
399 
400 def _get_ref_hash(repo, type_, name):
401 vcs_repo = repo.scm_instance()
402 if type_ in ['branch'] and vcs_repo.alias in ('hg', 'git'):
403 return vcs_repo.branches[name]
404 elif type_ in ['bookmark', 'book'] and vcs_repo.alias == 'hg':
405 return vcs_repo.bookmarks[name]
406 else:
407 raise ValueError()
408 
409 
410 def resolve_ref_or_error(ref, repo, allowed_ref_types=None):
411 allowed_ref_types = allowed_ref_types or ['bookmark', 'book', 'tag', 'branch']
412 
413 def _parse_ref(type_, name, hash_=None):
414 return type_, name, hash_
415 
416 try:
417 ref_type, ref_name, ref_hash = _parse_ref(*ref.split(':'))
418 except TypeError:
419 raise JSONRPCError(
420 'Ref `{ref}` given in a wrong format. Please check the API'
421 ' documentation for more details'.format(ref=ref))
422 
423 if ref_type not in allowed_ref_types:
424 raise JSONRPCError(
425 'Ref `{ref}` type is not allowed. '
426 'Only:{allowed_refs} are possible.'.format(
427 ref=ref, allowed_refs=allowed_ref_types))
428 
429 try:
430 ref_hash = ref_hash or _get_ref_hash(repo, ref_type, ref_name)
431 except (KeyError, ValueError):
432 raise JSONRPCError(
433 'The specified value:{type}:`{name}` does not exist, or is not allowed.'.format(
434 type=ref_type, name=ref_name))
435 
436 return ':'.join([ref_type, ref_name, ref_hash])
437 
438 
439 def _get_commit_dict(
440 filename, op, new_revision=None, old_revision=None,
441 raw_diff=None, stats=None):
442 if stats is None:
443 stats = {
444 "added": None,
445 "binary": None,
446 "deleted": None
447 }
448 return {
449 "filename": safe_unicode(filename),
450 "op": op,
451 
452 # extra details
453 "new_revision": new_revision,
454 "old_revision": old_revision,
455 
456 "raw_diff": raw_diff,
457 "stats": stats
458 }
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2 
3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 
21 import logging
22 import time
23 
24 import rhodecode
25 from rhodecode.api import (
26 jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
27 from rhodecode.api.utils import (
28 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
29 get_user_group_or_error, get_user_or_error, validate_repo_permissions,
30 get_perm_or_error, parse_args, get_origin, build_commit_data,
31 validate_set_owner_permissions)
32 from rhodecode.lib import audit_logger, rc_cache, channelstream
33 from rhodecode.lib import repo_maintenance
34 from rhodecode.lib.auth import (
35 HasPermissionAnyApi, HasUserGroupPermissionAnyApi,
36 HasRepoPermissionAnyApi)
37 from rhodecode.lib.celerylib.utils import get_task_id
38 from rhodecode.lib.utils2 import (
39 str2bool, time_to_datetime, safe_str, safe_int, safe_unicode)
40 from rhodecode.lib.ext_json import json
41 from rhodecode.lib.exceptions import (
42 StatusChangeOnClosedPullRequestError, CommentVersionMismatch)
43 from rhodecode.lib.vcs import RepositoryError
44 from rhodecode.lib.vcs.exceptions import NodeDoesNotExistError
45 from rhodecode.model.changeset_status import ChangesetStatusModel
46 from rhodecode.model.comment import CommentsModel
47 from rhodecode.model.db import (
48 Session, ChangesetStatus, RepositoryField, Repository, RepoGroup,
49 ChangesetComment)
50 from rhodecode.model.permission import PermissionModel
51 from rhodecode.model.pull_request import PullRequestModel
52 from rhodecode.model.repo import RepoModel
53 from rhodecode.model.scm import ScmModel, RepoList
54 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
55 from rhodecode.model import validation_schema
56 from rhodecode.model.validation_schema.schemas import repo_schema
57 
58 log = logging.getLogger(__name__)
59 
60 
61 @jsonrpc_method()
62 def get_repo(request, apiuser, repoid, cache=Optional(True)):
63 """
64 Gets an existing repository by its name or repository_id.
65 
66 The members section so the output returns users groups or users
67 associated with that repository.
68 
69 This command can only be run using an |authtoken| with admin rights,
70 or users with at least read rights to the |repo|.
71 
72 :param apiuser: This is filled automatically from the |authtoken|.
73 :type apiuser: AuthUser
74 :param repoid: The repository name or repository id.
75 :type repoid: str or int
76 :param cache: use the cached value for last changeset
77 :type: cache: Optional(bool)
78 
79 Example output:
80 
81 .. code-block:: bash
82 
83 {
84 "error": null,
85 "id": <repo_id>,
86 "result": {
87 "clone_uri": null,
88 "created_on": "timestamp",
89 "description": "repo description",
90 "enable_downloads": false,
91 "enable_locking": false,
92 "enable_statistics": false,
93 "followers": [
94 {
95 "active": true,
96 "admin": false,
97 "api_key": "****************************************",
98 "api_keys": [
99 "****************************************"
100 ],
101 "email": "user@example.com",
102 "emails": [
103 "user@example.com"
104 ],
105 "extern_name": "rhodecode",
106 "extern_type": "rhodecode",
107 "firstname": "username",
108 "ip_addresses": [],
109 "language": null,
110 "last_login": "2015-09-16T17:16:35.854",
111 "lastname": "surname",
112 "user_id": <user_id>,
113 "username": "name"
114 }
115 ],
116 "fork_of": "parent-repo",
117 "landing_rev": [
118 "rev",
119 "tip"
120 ],
121 "last_changeset": {
122 "author": "User <user@example.com>",
123 "branch": "default",
124 "date": "timestamp",
125 "message": "last commit message",
126 "parents": [
127 {
128 "raw_id": "commit-id"
129 }
130 ],
131 "raw_id": "commit-id",
132 "revision": <revision number>,
133 "short_id": "short id"
134 },
135 "lock_reason": null,
136 "locked_by": null,
137 "locked_date": null,
138 "owner": "owner-name",
139 "permissions": [
140 {
141 "name": "super-admin-name",
142 "origin": "super-admin",
143 "permission": "repository.admin",
144 "type": "user"
145 },
146 {
147 "name": "owner-name",
148 "origin": "owner",
149 "permission": "repository.admin",
150 "type": "user"
151 },
152 {
153 "name": "user-group-name",
154 "origin": "permission",
155 "permission": "repository.write",
156 "type": "user_group"
157 }
158 ],
159 "private": true,
160 "repo_id": 676,
161 "repo_name": "user-group/repo-name",
162 "repo_type": "hg"
163 }
164 }
165 """
166 
167 repo = get_repo_or_error(repoid)
168 cache = Optional.extract(cache)
169 
170 include_secrets = False
171 if has_superadmin_permission(apiuser):
172 include_secrets = True
173 else:
174 # check if we have at least read permission for this repo !
175 _perms = (
176 'repository.admin', 'repository.write', 'repository.read',)
177 validate_repo_permissions(apiuser, repoid, repo, _perms)
178 
179 permissions = []
180 for _user in repo.permissions():
181 user_data = {
182 'name': _user.username,
183 'permission': _user.permission,
184 'origin': get_origin(_user),
185 'type': "user",
186 }
187 permissions.append(user_data)
188 
189 for _user_group in repo.permission_user_groups():
190 user_group_data = {
191 'name': _user_group.users_group_name,
192 'permission': _user_group.permission,
193 'origin': get_origin(_user_group),
194 'type': "user_group",
195 }
196 permissions.append(user_group_data)
197 
198 following_users = [
199 user.user.get_api_data(include_secrets=include_secrets)
200 for user in repo.followers]
201 
202 if not cache:
203 repo.update_commit_cache()
204 data = repo.get_api_data(include_secrets=include_secrets)
205 data['permissions'] = permissions
206 data['followers'] = following_users
207 return data
208 
209 
209
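# Illustrative usage sketch for `get_repo` above (not part of the original
# module): assuming the standard JSON-RPC endpoint at `<server>/_admin/api`
# and a valid auth token, a call could look like the snippet below. Server
# URL, token and repository name are placeholders.
#
#   import requests  # any HTTP client will do
#
#   payload = {
#       "id": 1,
#       "auth_token": "<auth_token>",
#       "method": "get_repo",
#       "args": {"repoid": "user-group/repo-name", "cache": True},
#   }
#   response = requests.post(
#       "https://rhodecode.example.com/_admin/api", json=payload)
#   # `result` carries the repo data shown in the docstring example above,
#   # including the computed `permissions` and `followers` lists.
#   print(response.json()["result"]["repo_name"])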
@jsonrpc_method()
def get_repos(request, apiuser, root=Optional(None), traverse=Optional(True)):
    """
    Lists all existing repositories.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param root: specify root repository group to fetch repositories.
        Filters the returned repositories to be members of the given root group.
    :type root: Optional(None)
    :param traverse: traverse given root into subrepositories. With this flag
        set to False, it will only return top-level repositories from `root`.
        If root is empty it will return just top-level repositories.
    :type traverse: Optional(True)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "repo_id" : "<repo_id>",
            "repo_name" : "<reponame>",
            "repo_type" : "<repo_type>",
            "clone_uri" : "<clone_uri>",
            "private" : "<bool>",
            "created_on" : "<datetimecreated>",
            "description" : "<description>",
            "landing_rev": "<landing_rev>",
            "owner": "<repo_owner>",
            "fork_of": "<name_of_fork_parent>",
            "enable_downloads": "<bool>",
            "enable_locking": "<bool>",
            "enable_statistics": "<bool>",
          },
          ...
        ]
        error: null
    """

    include_secrets = has_superadmin_permission(apiuser)
    _perms = ('repository.read', 'repository.write', 'repository.admin',)
    extras = {'user': apiuser}

    root = Optional.extract(root)
    traverse = Optional.extract(traverse, binary=True)

    if root:
        # verify parent existence, if it doesn't exist return an error
        parent = RepoGroup.get_by_group_name(root)
        if not parent:
            raise JSONRPCError(
                'Root repository group `{}` does not exist'.format(root))

        if traverse:
            repos = RepoModel().get_repos_for_root(root=root, traverse=traverse)
        else:
            repos = RepoModel().get_repos_for_root(root=parent)
    else:
        if traverse:
            repos = RepoModel().get_all()
        else:
            # return just top-level
            repos = RepoModel().get_repos_for_root(root=None)

    repo_list = RepoList(repos, perm_set=_perms, extra_kwargs=extras)
    return [repo.get_api_data(include_secrets=include_secrets)
            for repo in repo_list]


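# Illustrative sketch for `get_repos` above: assuming the same JSON-RPC
# payload format shown earlier, the `root` and `traverse` arguments narrow the
# listing. The group name is a placeholder.
#
#   # all repositories the token can read, traversed recursively:
#   {"method": "get_repos", "args": {}}
#
#   # only top-level repositories directly inside the group `foo/bar`:
#   {"method": "get_repos", "args": {"root": "foo/bar", "traverse": False}}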
@jsonrpc_method()
def get_repo_changeset(request, apiuser, repoid, revision,
                       details=Optional('basic')):
    """
    Returns information about a changeset.

    Additional parameters define the amount of details returned by
    this function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository id
    :type repoid: str or int
    :param revision: revision for which listing should be done
    :type revision: str
    :param details: details can be 'basic', 'extended' or 'full'; 'full' gives
        diff info details like the diff itself, the number of changed files etc.
    :type details: Optional(str)

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    changes_details = Optional.extract(details)
    _changes_details_types = ['basic', 'extended', 'full']
    if changes_details not in _changes_details_types:
        raise JSONRPCError(
            'ret_type must be one of %s' % (
                ','.join(_changes_details_types)))

    vcs_repo = repo.scm_instance()
    pre_load = ['author', 'branch', 'date', 'message', 'parents',
                'status', '_commit', '_file_paths']

    try:
        commit = repo.get_commit(commit_id=revision, pre_load=pre_load)
    except TypeError as e:
        raise JSONRPCError(safe_str(e))
    _cs_json = commit.__json__()
    _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
    if changes_details == 'full':
        _cs_json['refs'] = commit._get_refs()
    return _cs_json


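# Illustrative sketch for `get_repo_changeset` above: `revision` accepts a
# commit hash, branch or tag name, and `details` controls how much is
# returned. Repository name and revision are placeholders.
#
#   {"method": "get_repo_changeset",
#    "args": {"repoid": "repo-name", "revision": "tip", "details": "extended"}}
#
#   # with "details": "full" the result additionally carries the commit refs.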
@jsonrpc_method()
def get_repo_changesets(request, apiuser, repoid, start_rev, limit,
                        details=Optional('basic')):
    """
    Returns a set of commits limited by the number starting
    from the `start_rev` option.

    Additional parameters define the amount of details returned by this
    function.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param start_rev: The starting revision from where to get changesets.
    :type start_rev: str
    :param limit: Limit the number of commits to this amount.
    :type limit: str or int
    :param details: Set the level of detail returned. Valid options are:
        ``basic``, ``extended`` and ``full``.
    :type details: Optional(str)

    .. note::

       Setting the parameter `details` to the value ``full`` is expensive,
       as it returns extensive details like the diff itself and the number
       of changed files.

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    changes_details = Optional.extract(details)
    _changes_details_types = ['basic', 'extended', 'full']
    if changes_details not in _changes_details_types:
        raise JSONRPCError(
            'ret_type must be one of %s' % (
                ','.join(_changes_details_types)))

    limit = int(limit)
    pre_load = ['author', 'branch', 'date', 'message', 'parents',
                'status', '_commit', '_file_paths']

    vcs_repo = repo.scm_instance()
    # SVN needs a special case to distinguish its index and commit id
    if vcs_repo and vcs_repo.alias == 'svn' and (start_rev == '0'):
        start_rev = vcs_repo.commit_ids[0]

    try:
        commits = vcs_repo.get_commits(
            start_id=start_rev, pre_load=pre_load, translate_tags=False)
    except TypeError as e:
        raise JSONRPCError(safe_str(e))
    except Exception:
        log.exception('Fetching of commits failed')
        raise JSONRPCError('Error occurred during commit fetching')

    ret = []
    for cnt, commit in enumerate(commits):
        if cnt >= limit != -1:
            break
        _cs_json = commit.__json__()
        _cs_json['diff'] = build_commit_data(vcs_repo, commit, changes_details)
        if changes_details == 'full':
            _cs_json['refs'] = {
                'branches': [commit.branch],
                'bookmarks': getattr(commit, 'bookmarks', []),
                'tags': commit.tags
            }
        ret.append(_cs_json)
    return ret


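# Illustrative sketch for `get_repo_changesets` above: fetches a page of
# commits starting at a given revision; per the loop above, a `limit` of -1
# means no limit. Repository name and start revision are placeholders.
#
#   {"method": "get_repo_changesets",
#    "args": {"repoid": "repo-name", "start_rev": "abcdef12",
#             "limit": 20, "details": "basic"}}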
@jsonrpc_method()
def get_repo_nodes(request, apiuser, repoid, revision, root_path,
                   ret_type=Optional('all'), details=Optional('basic'),
                   max_file_bytes=Optional(None)):
    """
    Returns a list of nodes and children in a flat list for a given
    path at a given revision.

    It's possible to specify ret_type to show only `files` or `dirs`.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param revision: The revision for which listing should be done.
    :type revision: str
    :param root_path: The path from which to start displaying.
    :type root_path: str
    :param ret_type: Set the return type. Valid options are
        ``all`` (default), ``files`` and ``dirs``.
    :type ret_type: Optional(str)
    :param details: Returns extended information about nodes, such as
        md5, binary, and/or content.
        The valid options are ``basic`` and ``full``.
    :type details: Optional(str)
    :param max_file_bytes: Only return content for files under this size, in bytes.
    :type max_file_bytes: Optional(int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: [
          {
            "binary": false,
            "content": "File line",
            "extension": "md",
            "lines": 2,
            "md5": "059fa5d29b19c0657e384749480f6422",
            "mimetype": "text/x-minidsrc",
            "name": "file.md",
            "size": 580,
            "type": "file"
          },
          ...
        ]
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    ret_type = Optional.extract(ret_type)
    details = Optional.extract(details)
    _extended_types = ['basic', 'full']
    if details not in _extended_types:
        raise JSONRPCError('ret_type must be one of %s' % (','.join(_extended_types)))
    extended_info = False
    content = False
    if details == 'basic':
        extended_info = True

    if details == 'full':
        extended_info = content = True

    _map = {}
    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        _scm = repo.scm_instance()
        if _scm.is_empty():
            return []

        _d, _f = ScmModel().get_nodes(
            repo, revision, root_path, flat=False,
            extended_info=extended_info, content=content,
            max_file_bytes=max_file_bytes)
        _map = {
            'all': _d + _f,
            'files': _f,
            'dirs': _d,
        }
        return _map[ret_type]
    except KeyError:
        raise JSONRPCError(
            'ret_type must be one of %s' % (','.join(sorted(_map.keys()))))
    except Exception:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError(
            'failed to get repo: `%s` nodes' % repo.repo_name
        )


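# Illustrative sketch for `get_repo_nodes` above: lists the tree under a path;
# `ret_type` filters the flat listing, and `details="full"` adds file content,
# bounded by `max_file_bytes`. All values are placeholders.
#
#   {"method": "get_repo_nodes",
#    "args": {"repoid": "repo-name", "revision": "tip", "root_path": "docs/",
#             "ret_type": "files", "details": "basic",
#             "max_file_bytes": 1024 * 1024}}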
@jsonrpc_method()
def get_repo_file(request, apiuser, repoid, commit_id, file_path,
                  max_file_bytes=Optional(None), details=Optional('basic'),
                  cache=Optional(True)):
    """
    Returns a single file from the repository at a given revision.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int
    :param commit_id: The revision for which listing should be done.
    :type commit_id: str
    :param file_path: The path from which to start displaying.
    :type file_path: str
    :param details: Returns a different set of information about nodes.
        The valid options are ``minimal``, ``basic`` and ``full``.
    :type details: Optional(str)
    :param max_file_bytes: Only return content for files under this size, in bytes.
    :type max_file_bytes: Optional(int)
    :param cache: Use internal caches for fetching files. If disabled, fetching
        files is slower but more memory efficient.
    :type cache: Optional(bool)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "binary": false,
            "extension": "py",
            "lines": 35,
            "content": "....",
            "md5": "76318336366b0f17ee249e11b0c99c41",
            "mimetype": "text/x-python",
            "name": "python.py",
            "size": 817,
            "type": "file",
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    cache = Optional.extract(cache, binary=True)
    details = Optional.extract(details)
    _extended_types = ['minimal', 'minimal+search', 'basic', 'full']
    if details not in _extended_types:
        raise JSONRPCError(
            'ret_type must be one of %s, got %s' % (','.join(_extended_types), details))
    extended_info = False
    content = False

    if details == 'minimal':
        extended_info = False

    elif details == 'basic':
        extended_info = True

    elif details == 'full':
        extended_info = content = True

    file_path = safe_unicode(file_path)
    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        _scm = repo.scm_instance()
        if _scm.is_empty():
            return None

        node = ScmModel().get_node(
            repo, commit_id, file_path, extended_info=extended_info,
            content=content, max_file_bytes=max_file_bytes, cache=cache)
    except NodeDoesNotExistError:
        raise JSONRPCError(u'There is no file in repo: `{}` at path `{}` for commit: `{}`'.format(
            repo.repo_name, file_path, commit_id))
    except Exception:
        log.exception(u"Exception occurred while trying to get repo %s file",
                      repo.repo_name)
        raise JSONRPCError(u'failed to get repo: `{}` file at path {}'.format(
            repo.repo_name, file_path))

    return node


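# Illustrative sketch for `get_repo_file` above: fetches a single file at a
# commit; per the code above, `details="full"` is required for the actual
# `content` to be returned, and `cache=False` bypasses the internal file cache
# at the cost of speed. Paths and ids are placeholders.
#
#   {"method": "get_repo_file",
#    "args": {"repoid": "repo-name", "commit_id": "tip",
#             "file_path": "setup.py", "details": "full", "cache": True}}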
@jsonrpc_method()
def get_repo_fts_tree(request, apiuser, repoid, commit_id, root_path):
    """
    Returns a list of tree nodes for a path at a given revision. This API is
    built strictly for usage in full text search building, and shouldn't be
    consumed for other purposes.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    repo_id = repo.repo_id
    cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
    cache_on = cache_seconds > 0

    cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
    region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

    def compute_fts_tree(cache_ver, repo_id, commit_id, root_path):
        return ScmModel().get_fts_data(repo_id, commit_id, root_path)

    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        _scm = repo.scm_instance()
        if _scm.is_empty():
            return []
    except RepositoryError:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)

    try:
        # we need to resolve commit_id to a FULL sha for cache to work correctly.
        # sending 'master' is a pointer that needs to be translated to current commit.
        commit_id = _scm.get_commit(commit_id=commit_id).raw_id
        log.debug(
            'Computing FTS REPO TREE for repo_id %s commit_id `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, commit_id, cache_on, cache_seconds or 0))

        tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path)
        return tree_files

    except Exception:
        log.exception("Exception occurred while trying to get repo nodes")
        raise JSONRPCError('failed to get repo: `%s` nodes' % repo.repo_name)


@jsonrpc_method()
def get_repo_refs(request, apiuser, repoid):
    """
    Returns a dictionary of current references. It returns
    bookmarks, branches, closed_branches, and tags for the given repository.

    This command can only be run using an |authtoken| with admin rights,
    or users with at least read rights to |repos|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: The repository name or repository ID.
    :type repoid: str or int

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        "result": {
          "bookmarks": {
            "dev": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "master": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
          },
          "branches": {
            "default": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "stable": "367f590445081d8ec8c2ea0456e73ae1f1c3d6cf"
          },
          "branches_closed": {},
          "tags": {
            "tip": "5611d30200f4040ba2ab4f3d64e5b06408a02188",
            "v4.4.0": "1232313f9e6adac5ce5399c2a891dc1e72b79022",
            "v4.4.1": "cbb9f1d329ae5768379cdec55a62ebdd546c4e27",
            "v4.4.2": "24ffe44a27fcd1c5b6936144e176b9f6dd2f3a17",
          }
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write', 'repository.read',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        # check if repo is not empty by any chance, skip quicker if it is.
        vcs_instance = repo.scm_instance()
        refs = vcs_instance.refs()
        return refs
    except Exception:
        log.exception("Exception occurred while trying to get repo refs")
        raise JSONRPCError(
            'failed to get repo: `%s` references' % repo.repo_name
        )


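# Illustrative sketch for `get_repo_refs` above: takes only the repository;
# the result maps ref names to full commit ids, e.g. to resolve a branch head
# before calling `get_repo_changeset`. Repository name is a placeholder.
#
#   {"method": "get_repo_refs", "args": {"repoid": "repo-name"}}
#   # result["branches"]["default"]
#   #   -> "5611d30200f4040ba2ab4f3d64e5b06408a02188"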
@jsonrpc_method()
def create_repo(
        request, apiuser, repo_name, repo_type,
        owner=Optional(OAttr('apiuser')),
        description=Optional(''),
        private=Optional(False),
        clone_uri=Optional(None),
        push_uri=Optional(None),
        landing_rev=Optional(None),
        enable_statistics=Optional(False),
        enable_locking=Optional(False),
        enable_downloads=Optional(False),
        copy_permissions=Optional(False)):
    """
    Creates a repository.

    * If the repository name contains "/", the repository will be created inside
      a repository group or nested repository groups.

      For example "foo/bar/repo1" will create a |repo| called "repo1" inside
      group "foo/bar". You have to have permissions to access and write to
      the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least
    permissions to create repositories, or write permissions to
    parent repository groups.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repo_name: Set the repository name.
    :type repo_name: str
    :param repo_type: Set the repository type; 'hg', 'git', or 'svn'.
    :type repo_type: str
    :param owner: user_id or username
    :type owner: Optional(str)
    :param description: Set the repository description.
    :type description: Optional(str)
    :param private: set repository as private
    :type private: bool
    :param clone_uri: set clone_uri
    :type clone_uri: str
    :param push_uri: set push_uri
    :type push_uri: str
    :param landing_rev: <rev_type>:<rev>, e.g. branch:default, book:dev, rev:abcd
    :type landing_rev: str
    :param enable_locking:
    :type enable_locking: bool
    :param enable_downloads:
    :type enable_downloads: bool
    :param enable_statistics:
    :type enable_statistics: bool
    :param copy_permissions: Copy permissions from the group in which the
        repository is being created.
    :type copy_permissions: bool

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
          "msg": "Created new repository `<reponame>`",
          "success": true,
          "task": "<celery task id or None if done sync>"
        }
        error: null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
           'failed to create repository `<repo_name>`'
        }

    """

    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)
    push_uri = Optional.extract(push_uri)

    defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
    if isinstance(private, Optional):
        private = defs.get('repo_private') or Optional.extract(private)
    if isinstance(repo_type, Optional):
        repo_type = defs.get('repo_type')
    if isinstance(enable_statistics, Optional):
        enable_statistics = defs.get('repo_enable_statistics')
    if isinstance(enable_locking, Optional):
        enable_locking = defs.get('repo_enable_locking')
    if isinstance(enable_downloads, Optional):
        enable_downloads = defs.get('repo_enable_downloads')

    landing_ref, _label = ScmModel.backend_landing_ref(repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))

    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=repo_name,
            repo_type=repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_push_uri=push_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions,
            repo_enable_statistics=enable_statistics,
            repo_enable_downloads=enable_downloads,
            repo_enable_locking=enable_locking))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'owner': owner,
            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'repo_description': schema_data['repo_description'],
            'repo_private': schema_data['repo_private'],
            'clone_uri': schema_data['repo_clone_uri'],
            'push_uri': schema_data['repo_push_uri'],
            'repo_landing_rev': schema_data['repo_landing_commit_ref'],
            'enable_statistics': schema_data['repo_enable_statistics'],
            'enable_locking': schema_data['repo_enable_locking'],
            'enable_downloads': schema_data['repo_enable_downloads'],
            'repo_copy_permissions': schema_data['repo_copy_permissions'],
        }

        task = RepoModel().create(form_data=data, cur_user=owner.user_id)
        task_id = get_task_id(task)
        # no commit, it's done in RepoModel, or async via celery
        return {
            'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
            'success': True,  # cannot return the repo data here since fork
                              # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create the repository %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to create repository `%s`' % (schema_data['repo_name'],))


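# Illustrative sketch for `create_repo` above: creates a Mercurial repository
# inside an existing repository group; the caller needs write access to the
# last group in the path ("bar" here). Names and the landing ref are
# placeholders taken from the docstring example.
#
#   {"method": "create_repo",
#    "args": {"repo_name": "foo/bar/repo1", "repo_type": "hg",
#             "description": "example repository",
#             "private": True, "landing_rev": "branch:default",
#             "copy_permissions": True}}
#   # result["task"] holds a celery task id when the creation runs async.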
@jsonrpc_method()
def add_field_to_repo(request, apiuser, repoid, key, label=Optional(''),
                      description=Optional('')):
    """
    Adds an extra field to a repository.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository id.
    :type repoid: str or int
    :param key: Create a unique field key for this repository.
    :type key: str
    :param label:
    :type label: Optional(str)
    :param description:
    :type description: Optional(str)
    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    label = Optional.extract(label) or key
    description = Optional.extract(description)

    field = RepositoryField.get_by_key_name(key, repo)
    if field:
        raise JSONRPCError('Field with key '
                           '`%s` exists for repo `%s`' % (key, repoid))

    try:
        RepoModel().add_repo_field(repo, key, field_label=label,
                                   field_desc=description)
        Session().commit()
        return {
            'msg': "Added new repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception("Exception occurred while trying to add field to repo")
        raise JSONRPCError(
            'failed to create new field for repository `%s`' % (repoid,))


@jsonrpc_method()
def remove_field_from_repo(request, apiuser, repoid, key):
    """
    Removes an extra field from a repository.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param key: Set the unique field key for this repository.
    :type key: str
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    field = RepositoryField.get_by_key_name(key, repo)
    if not field:
        raise JSONRPCError('Field with key `%s` does not '
                           'exist for repo `%s`' % (key, repoid))

    try:
        RepoModel().delete_repo_field(repo, field_key=key)
        Session().commit()
        return {
            'msg': "Deleted repository field `%s`" % (key,),
            'success': True,
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to delete field from repo")
        raise JSONRPCError(
            'failed to delete field for repository `%s`' % (repoid,))


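# Illustrative sketch for the two field calls above: they are symmetric; a key
# can only be added once per repository and must exist before it can be
# removed. Key, label, description and repository name are placeholders.
#
#   {"method": "add_field_to_repo",
#    "args": {"repoid": "repo-name", "key": "release_manager",
#             "label": "Release manager", "description": "Who cuts releases"}}
#
#   {"method": "remove_field_from_repo",
#    "args": {"repoid": "repo-name", "key": "release_manager"}}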
@jsonrpc_method()
def update_repo(
        request, apiuser, repoid, repo_name=Optional(None),
        owner=Optional(OAttr('apiuser')), description=Optional(''),
        private=Optional(False),
        clone_uri=Optional(None), push_uri=Optional(None),
        landing_rev=Optional(None), fork_of=Optional(None),
        enable_statistics=Optional(False),
        enable_locking=Optional(False),
        enable_downloads=Optional(False), fields=Optional('')):
    """
    Updates a repository with the given information.

    This command can only be run using an |authtoken| with at least
    admin permissions to the |repo|.

    * If the repository name contains "/", the repository will be updated
      accordingly within a repository group or nested repository groups.

      For example repoid=repo-test name="foo/bar/repo-test" will update the
      |repo| called "repo-test" and place it inside group "foo/bar".
      You have to have permissions to access and write to the last repository
      group ("bar" in this example).

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: repository name or repository ID.
    :type repoid: str or int
    :param repo_name: Update the |repo| name, including the
        repository group it's in.
    :type repo_name: str
    :param owner: Set the |repo| owner.
    :type owner: str
    :param fork_of: Set the |repo| as fork of another |repo|.
    :type fork_of: str
    :param description: Update the |repo| description.
    :type description: str
    :param private: Set the |repo| as private. (True | False)
    :type private: bool
    :param clone_uri: Update the |repo| clone URI.
    :type clone_uri: str
    :param push_uri: Update the |repo| push URI.
    :type push_uri: str
    :param landing_rev: Set the |repo| landing revision, e.g. branch:default,
        book:dev, rev:abcd
    :type landing_rev: str
    :param enable_statistics: Enable statistics on the |repo|, (True | False).
    :type enable_statistics: bool
    :param enable_locking: Enable |repo| locking.
    :type enable_locking: bool
    :param enable_downloads: Enable downloads from the |repo|, (True | False).
    :type enable_downloads: bool
    :param fields: Add extra fields to the |repo|. Use the following
        example format: ``field_key=field_val,field_key2=fieldval2``.
        Escape ', ' with \,
    :type fields: str
    """

    repo = get_repo_or_error(repoid)

    include_secrets = False
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    else:
        include_secrets = True

    updates = dict(
        repo_name=repo_name
        if not isinstance(repo_name, Optional) else repo.repo_name,

        fork_id=fork_of
        if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,

        user=owner
        if not isinstance(owner, Optional) else repo.user.username,

        repo_description=description
        if not isinstance(description, Optional) else repo.description,

        repo_private=private
        if not isinstance(private, Optional) else repo.private,

        clone_uri=clone_uri
        if not isinstance(clone_uri, Optional) else repo.clone_uri,

        push_uri=push_uri
        if not isinstance(push_uri, Optional) else repo.push_uri,

        repo_landing_rev=landing_rev
        if not isinstance(landing_rev, Optional) else repo._landing_revision,

        repo_enable_statistics=enable_statistics
        if not isinstance(enable_statistics, Optional) else repo.enable_statistics,

        repo_enable_locking=enable_locking
        if not isinstance(enable_locking, Optional) else repo.enable_locking,

        repo_enable_downloads=enable_downloads
        if not isinstance(enable_downloads, Optional) else repo.enable_downloads)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(
        request.translate, repo=repo)
    ref_choices = list(set(ref_choices + [landing_ref]))

    old_values = repo.get_api_data()
    repo_type = repo.repo_type
    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo_type,
        # user caller
        user=apiuser,
        old_values=old_values)
    try:
        schema_data = schema.deserialize(dict(
            # we save old value, users cannot change type
            repo_type=repo_type,

            repo_name=updates['repo_name'],
            repo_owner=updates['user'],
            repo_description=updates['repo_description'],
            repo_clone_uri=updates['clone_uri'],
            repo_push_uri=updates['push_uri'],
            repo_fork_of=updates['fork_id'],
            repo_private=updates['repo_private'],
            repo_landing_commit_ref=updates['repo_landing_rev'],
            repo_enable_statistics=updates['repo_enable_statistics'],
            repo_enable_downloads=updates['repo_enable_downloads'],
            repo_enable_locking=updates['repo_enable_locking']))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    # save validated data back into the updates dict
    validated_updates = dict(
        repo_name=schema_data['repo_group']['repo_name_without_group'],
        repo_group=schema_data['repo_group']['repo_group_id'],

        user=schema_data['repo_owner'],
        repo_description=schema_data['repo_description'],
        repo_private=schema_data['repo_private'],
        clone_uri=schema_data['repo_clone_uri'],
        push_uri=schema_data['repo_push_uri'],
        repo_landing_rev=schema_data['repo_landing_commit_ref'],
        repo_enable_statistics=schema_data['repo_enable_statistics'],
        repo_enable_locking=schema_data['repo_enable_locking'],
        repo_enable_downloads=schema_data['repo_enable_downloads'],
    )

    if schema_data['repo_fork_of']:
        fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
        validated_updates['fork_id'] = fork_repo.repo_id

    # extra fields
    fields = parse_args(Optional.extract(fields), key_prefix='ex_')
    if fields:
        validated_updates.update(fields)

    try:
        RepoModel().update(repo, **validated_updates)
        audit_logger.store_api(
            'repo.edit', action_data={'old_data': old_values},
            user=apiuser, repo=repo)
        Session().commit()
        return {
            'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
            'repository': repo.get_api_data(include_secrets=include_secrets)
        }
    except Exception:
        log.exception(
            u"Exception while trying to update the repository %s",
            repoid)
        raise JSONRPCError('failed to update repo `%s`' % repoid)


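# A minimal usage sketch for `update_repo`, assuming the standard RhodeCode
# JSON-RPC transport (a JSON POST to the /_admin/api endpoint); the URL,
# auth token and repository name below are hypothetical placeholders. Only
# the keys present in `args` are changed: every parameter left out keeps its
# current value, because the `updates` dict above falls back to the existing
# repository attribute whenever the argument is still an `Optional` instance.
EXAMPLE_UPDATE_REPO_PAYLOAD = {
    'id': 1,
    'auth_token': '<auth_token>',
    'method': 'update_repo',
    'args': {
        'repoid': 'foo/bar/repo-test',
        'description': 'updated description',
        'private': True,
    },
}


def _example_update_repo_call(api_url='https://code.example.com/_admin/api'):
    # Posts the payload above and returns the decoded JSON-RPC response.
    # Assumes the `requests` package is available; any HTTP client works.
    import requests
    return requests.post(api_url, json=EXAMPLE_UPDATE_REPO_PAYLOAD).json()

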
@jsonrpc_method()
def fork_repo(request, apiuser, repoid, fork_name,
              owner=Optional(OAttr('apiuser')),
              description=Optional(''),
              private=Optional(False),
              clone_uri=Optional(None),
              landing_rev=Optional(None),
              copy_permissions=Optional(False)):
    """
    Creates a fork of the specified |repo|.

    * If the fork_name contains "/", the fork will be created inside
      a repository group or nested repository groups.

      For example "foo/bar/fork-repo" will create a fork called "fork-repo"
      inside group "foo/bar". You have to have permissions to access and
      write to the last repository group ("bar" in this example).

    This command can only be run using an |authtoken| with at least
    read permissions on the forked repo, plus fork-creation permissions
    for the user.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set repository name or repository ID.
    :type repoid: str or int
    :param fork_name: Set the fork name, including its repository group membership.
    :type fork_name: str
    :param owner: Set the fork owner.
    :type owner: str
    :param description: Set the fork description.
    :type description: str
    :param copy_permissions: Copy permissions from parent |repo|. The
        default is False.
    :type copy_permissions: bool
    :param private: Make the fork private. The default is False.
    :type private: bool
    :param landing_rev: Set the landing revision, e.g. branch:default,
        book:dev, rev:abcd

    Example input:

    .. code-block:: bash

        id : <id_for_response>
        api_key : "<api_key>"
        args: {
            "repoid" : "<reponame or repo_id>",
            "fork_name": "<forkname>",
            "owner": "<username or user_id = Optional(=apiuser)>",
            "description": "<description>",
            "copy_permissions": "<bool>",
            "private": "<bool>",
            "landing_rev": "<landing_rev>"
        }

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Created fork of `<reponame>` as `<forkname>`",
            "success": true,
            "task": "<celery task id or None if done sync>"
        }
        error: null

    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name

    if not has_superadmin_permission(apiuser):
        # check if we have at least read permission for
        # this repo that we fork !
        _perms = ('repository.admin', 'repository.write', 'repository.read')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    # check if the regular user has at least fork permissions as well
    if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser):
        raise JSONRPCForbidden()

    # check if user can set owner parameter
    owner = validate_set_owner_permissions(apiuser, owner)

    description = Optional.extract(description)
    copy_permissions = Optional.extract(copy_permissions)
    clone_uri = Optional.extract(clone_uri)

    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
    ref_choices = list(set(ref_choices + [landing_ref]))
    landing_commit_ref = Optional.extract(landing_rev) or landing_ref

    private = Optional.extract(private)

    schema = repo_schema.RepoSchema().bind(
        repo_type_options=rhodecode.BACKENDS.keys(),
        repo_ref_options=ref_choices,
        repo_type=repo.repo_type,
        # user caller
        user=apiuser)

    try:
        schema_data = schema.deserialize(dict(
            repo_name=fork_name,
            repo_type=repo.repo_type,
            repo_owner=owner.username,
            repo_description=description,
            repo_landing_commit_ref=landing_commit_ref,
            repo_clone_uri=clone_uri,
            repo_private=private,
            repo_copy_permissions=copy_permissions))
    except validation_schema.Invalid as err:
        raise JSONRPCValidationError(colander_exc=err)

    try:
        data = {
            'fork_parent_id': repo.repo_id,

            'repo_name': schema_data['repo_group']['repo_name_without_group'],
            'repo_name_full': schema_data['repo_name'],
            'repo_group': schema_data['repo_group']['repo_group_id'],
            'repo_type': schema_data['repo_type'],
            'description': schema_data['repo_description'],
            'private': schema_data['repo_private'],
            'copy_permissions': schema_data['repo_copy_permissions'],
            'landing_rev': schema_data['repo_landing_commit_ref'],
        }

        task = RepoModel().create_fork(data, cur_user=owner.user_id)
        # no commit, it's done in RepoModel, or async via celery
        task_id = get_task_id(task)

        return {
            'msg': 'Created fork of `%s` as `%s`' % (
                repo.repo_name, schema_data['repo_name']),
            'success': True,  # cannot return the repo data here since fork
                              # can be done async
            'task': task_id
        }
    except Exception:
        log.exception(
            u"Exception while trying to create fork %s",
            schema_data['repo_name'])
        raise JSONRPCError(
            'failed to fork repository `%s` as `%s`' % (
                repo_name, schema_data['repo_name']))


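# A minimal sketch of a `fork_repo` call, assuming the same JSON-RPC transport
# as the earlier example; the token and repository names are placeholders.
# Because forking may be delegated to Celery, the result carries a `task` id
# rather than the new repository data, matching the return value built above.
EXAMPLE_FORK_REPO_PAYLOAD = {
    'id': 2,
    'auth_token': '<auth_token>',
    'method': 'fork_repo',
    'args': {
        'repoid': 'foo/bar/repo-test',
        'fork_name': 'foo/bar/fork-repo',
        'copy_permissions': True,
    },
}

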
@jsonrpc_method()
def delete_repo(request, apiuser, repoid, forks=Optional('')):
    """
    Deletes a repository.

    * When the `forks` parameter is set it's possible to detach or delete
      forks of the deleted repository.

    This command can only be run using an |authtoken| with admin
    permissions on the |repo|.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param forks: Set to `detach` or `delete` forks from the |repo|.
    :type forks: Optional(str)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result: {
            "msg": "Deleted repository `<reponame>`",
            "success": true
        }
        error: null
    """

    repo = get_repo_or_error(repoid)
    repo_name = repo.repo_name
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    try:
        handle_forks = Optional.extract(forks)
        _forks_msg = ''
        _forks = [f for f in repo.forks]
        if handle_forks == 'detach':
            _forks_msg = ' ' + 'Detached %s forks' % len(_forks)
        elif handle_forks == 'delete':
            _forks_msg = ' ' + 'Deleted %s forks' % len(_forks)
        elif _forks:
            raise JSONRPCError(
                'Cannot delete `%s`, it still contains attached forks' %
                (repo.repo_name,)
            )
        old_data = repo.get_api_data()
        RepoModel().delete(repo, forks=forks)

        repo = audit_logger.RepoWrap(repo_id=None,
                                     repo_name=repo.repo_name)

        audit_logger.store_api(
            'repo.delete', action_data={'old_data': old_data},
            user=apiuser, repo=repo)

        ScmModel().mark_for_invalidation(repo_name, delete=True)
        Session().commit()
        return {
            'msg': 'Deleted repository `%s`%s' % (repo_name, _forks_msg),
            'success': True
        }
    except Exception:
        log.exception("Exception occurred while trying to delete repo")
        raise JSONRPCError(
            'failed to delete repository `%s`' % (repo_name,)
        )


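# A sketch of a `delete_repo` call that detaches forks instead of failing,
# mirroring the `handle_forks == 'detach'` branch above. Transport assumptions
# and placeholder values as in the earlier examples.
EXAMPLE_DELETE_REPO_PAYLOAD = {
    'id': 3,
    'auth_token': '<auth_token>',
    'method': 'delete_repo',
    'args': {'repoid': 'foo/bar/repo-test', 'forks': 'detach'},
}

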
#TODO: marcink, change name ?
@jsonrpc_method()
def invalidate_cache(request, apiuser, repoid, delete_keys=Optional(False)):
    """
    Invalidates the cache for the specified repository.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param delete_keys: This deletes the invalidated keys instead of
        just flagging them.
    :type delete_keys: Optional(``True`` | ``False``)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'msg': Cache for repository `<repository name>` was invalidated,
            'repository': <repository name>
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred during cache invalidation action'
        }

    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    delete = Optional.extract(delete_keys)
    try:
        ScmModel().mark_for_invalidation(repo.repo_name, delete=delete)
        return {
            'msg': 'Cache for repository `%s` was invalidated' % (repoid,),
            'repository': repo.repo_name
        }
    except Exception:
        log.exception(
            "Exception occurred while trying to invalidate repo cache")
        raise JSONRPCError(
            'Error occurred during cache invalidation action'
        )


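# A sketch of an `invalidate_cache` call that also removes the invalidated
# keys (the `delete_keys` flag above). Placeholder values, same transport
# assumptions as the earlier examples.
EXAMPLE_INVALIDATE_CACHE_PAYLOAD = {
    'id': 4,
    'auth_token': '<auth_token>',
    'method': 'invalidate_cache',
    'args': {'repoid': 'foo/bar/repo-test', 'delete_keys': True},
}

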
#TODO: marcink, change name ?
@jsonrpc_method()
def lock(request, apiuser, repoid, locked=Optional(None),
         userid=Optional(OAttr('apiuser'))):
    """
    Sets the lock state of the specified |repo| by the given user.
    For more information, see :ref:`repo-locking`.

    * If the ``userid`` option is not set, the repository is locked to the
      user who called the method.
    * If the ``locked`` parameter is not set, the current lock state of the
      repository is displayed.

    This command can only be run using an |authtoken| with admin rights to
    the specified repository.

    This command takes the following options:

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Sets the repository name or repository ID.
    :type repoid: str or int
    :param locked: Sets the lock state.
    :type locked: Optional(``True`` | ``False``)
    :param userid: Set the repository lock to this user.
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            'repo': '<reponame>',
            'locked': <bool: lock state>,
            'locked_since': <int: lock timestamp>,
            'locked_by': <username of person who made the lock>,
            'lock_reason': <str: reason for locking>,
            'lock_state_changed': <bool: True if lock state has been changed in this request>,
            'msg': 'Repo `<reponame>` locked by `<username>` on <timestamp>.'
            or
            'msg': 'Repo `<repository name>` not locked.'
            or
            'msg': 'User `<user name>` set lock state for repo `<repository name>` to `<new lock state>`'
        }
        error : null

    Example error output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : null
        error : {
            'Error occurred locking repository `<reponame>`'
        }
    """

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        # check if we have at least write permission for this repo !
        _perms = ('repository.admin', 'repository.write',)
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    # make sure a normal user does not pass someone else's userid,
    # they are not allowed to do that
    if not isinstance(userid, Optional) and userid != apiuser.user_id:
        raise JSONRPCError('userid is not the same as your user')

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)

    if isinstance(locked, Optional):
        lockobj = repo.locked

        if lockobj[0] is None:
            _d = {
                'repo': repo.repo_name,
                'locked': False,
                'locked_since': None,
                'locked_by': None,
                'lock_reason': None,
                'lock_state_changed': False,
                'msg': 'Repo `%s` not locked.' % repo.repo_name
            }
            return _d
        else:
            _user_id, _time, _reason = lockobj
            lock_user = get_user_or_error(userid)
            _d = {
                'repo': repo.repo_name,
                'locked': True,
                'locked_since': _time,
                'locked_by': lock_user.username,
                'lock_reason': _reason,
                'lock_state_changed': False,
                'msg': ('Repo `%s` locked by `%s` on `%s`.'
                        % (repo.repo_name, lock_user.username,
                           json.dumps(time_to_datetime(_time))))
            }
            return _d

    # force locked state through a flag
    else:
        locked = str2bool(locked)
        lock_reason = Repository.LOCK_API
        try:
            if locked:
                lock_time = time.time()
                Repository.lock(repo, user.user_id, lock_time, lock_reason)
            else:
                lock_time = None
                Repository.unlock(repo)
            _d = {
                'repo': repo.repo_name,
                'locked': locked,
                'locked_since': lock_time,
                'locked_by': user.username,
                'lock_reason': lock_reason,
                'lock_state_changed': True,
                'msg': ('User `%s` set lock state for repo `%s` to `%s`'
                        % (user.username, repo.repo_name, locked))
            }
            return _d
        except Exception:
            log.exception(
                "Exception occurred while trying to lock repository")
            raise JSONRPCError(
                'Error occurred locking repository `%s`' % repo.repo_name
            )


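# Two sketches for `lock`: omitting `locked` only reports the current lock
# state (the `isinstance(locked, Optional)` branch above), while passing
# `locked` forces the new state. Placeholder values, same transport
# assumptions as the earlier examples.
EXAMPLE_LOCK_QUERY_PAYLOAD = {
    'id': 5,
    'auth_token': '<auth_token>',
    'method': 'lock',
    'args': {'repoid': 'foo/bar/repo-test'},
}
EXAMPLE_LOCK_SET_PAYLOAD = {
    'id': 6,
    'auth_token': '<auth_token>',
    'method': 'lock',
    'args': {'repoid': 'foo/bar/repo-test', 'locked': True},
}

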
@jsonrpc_method()
def comment_commit(
        request, apiuser, repoid, commit_id, message, status=Optional(None),
        comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
        userid=Optional(OAttr('apiuser')), send_email=Optional(True)):
    """
    Set a commit comment, and optionally change the status of the commit.

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Specify the commit_id for which to set a comment.
    :type commit_id: str
    :param message: The comment text.
    :type message: str
    :param status: (**Optional**) status of commit, one of: 'not_reviewed',
        'approved', 'rejected', 'under_review'
    :type status: str
    :param comment_type: Comment type, one of: 'note', 'todo'
    :type comment_type: Optional(str), default: 'note'
    :param resolves_comment_id: id of comment which this one will resolve
    :type resolves_comment_id: Optional(int)
    :param extra_recipients: list of user ids or usernames to add
        notifications for this comment. Acts like a CC for notifications
    :type extra_recipients: Optional(list)
    :param userid: Set the user name of the comment creator.
    :type userid: Optional(str or int)
    :param send_email: Define if this comment should also send email notification
    :type send_email: Optional(bool)

    Example output:

    .. code-block:: bash

        {
            "id" : <id_given_in_input>,
            "result" : {
                "msg": "Commented on commit `<commit_id>` for repository `<repoid>`",
                "status_change": null or <status>,
                "success": true
            },
            "error" : null
        }

    """
    _ = request.translate

    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.read', 'repository.write', 'repository.admin')
        validate_repo_permissions(apiuser, repoid, repo, _perms)
    db_repo_name = repo.repo_name

    try:
        commit = repo.scm_instance().get_commit(commit_id=commit_id)
        commit_id = commit.raw_id
    except Exception as e:
        log.exception('Failed to fetch commit')
        raise JSONRPCError(safe_str(e))

    if isinstance(userid, Optional):
        userid = apiuser.user_id

    user = get_user_or_error(userid)
    status = Optional.extract(status)
    comment_type = Optional.extract(comment_type)
    resolves_comment_id = Optional.extract(resolves_comment_id)
    extra_recipients = Optional.extract(extra_recipients)
    send_email = Optional.extract(send_email, binary=True)

    allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
    if status and status not in allowed_statuses:
        raise JSONRPCError('Bad status, must be one '
                           'of %s, got %s' % (allowed_statuses, status,))

    if resolves_comment_id:
        comment = ChangesetComment.get(resolves_comment_id)
        if not comment:
            raise JSONRPCError(
                'Invalid resolves_comment_id `%s` for this commit.'
                % resolves_comment_id)
        if comment.comment_type != ChangesetComment.COMMENT_TYPE_TODO:
            raise JSONRPCError(
                'Comment `%s` is wrong type for setting status to resolved.'
                % resolves_comment_id)

    try:
        rc_config = SettingsModel().get_all_settings()
        renderer = rc_config.get('rhodecode_markup_renderer', 'rst')
        status_change_label = ChangesetStatus.get_status_lbl(status)
        comment = CommentsModel().create(
            message, repo, user, commit_id=commit_id,
            status_change=status_change_label,
            status_change_type=status,
            renderer=renderer,
            comment_type=comment_type,
            resolves_comment_id=resolves_comment_id,
            auth_user=apiuser,
            extra_recipients=extra_recipients,
            send_email=send_email
        )
        is_inline = comment.is_inline

        if status:
            # also do a status change
            try:
                ChangesetStatusModel().set_status(
                    repo, status, user, comment, revision=commit_id,
                    dont_allow_on_closed_pull_request=True
                )
            except StatusChangeOnClosedPullRequestError:
                log.exception(
                    "Exception occurred while trying to change repo commit status")
                msg = ('Changing status on a commit associated with '
                       'a closed pull request is not allowed')
                raise JSONRPCError(msg)

        CommentsModel().trigger_commit_comment_hook(
            repo, apiuser, 'create',
            data={'comment': comment, 'commit': commit})

        Session().commit()

        comment_broadcast_channel = channelstream.comment_channel(
            db_repo_name, commit_obj=commit)

        comment_data = {'comment': comment, 'comment_id': comment.comment_id}
        comment_type = 'inline' if is_inline else 'general'
        channelstream.comment_channelstream_push(
            request, comment_broadcast_channel, apiuser,
            _('posted a new {} comment').format(comment_type),
            comment_data=comment_data)

        return {
            'msg': (
                'Commented on commit `%s` for repository `%s`' % (
                    comment.revision, repo.repo_name)),
            'status_change': status,
            'success': True,
        }
    except JSONRPCError:
        # catch any inside errors, and re-raise them to prevent the
        # global catch below from silencing them
        raise
    except Exception:
        log.exception("Exception occurred while trying to comment on commit")
        raise JSONRPCError(
            'failed to set comment on repository `%s`' % (repo.repo_name,)
        )


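# A sketch of a `comment_commit` call that both leaves a note and sets the
# commit status to `approved`; the status value must be one of
# `ChangesetStatus.STATUSES`, as validated above. Placeholder values, same
# transport assumptions as the earlier examples.
EXAMPLE_COMMENT_COMMIT_PAYLOAD = {
    'id': 7,
    'auth_token': '<auth_token>',
    'method': 'comment_commit',
    'args': {
        'repoid': 'foo/bar/repo-test',
        'commit_id': '<commit sha>',
        'message': 'Looks good to me.',
        'status': 'approved',
    },
}

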
@jsonrpc_method()
def get_repo_comments(request, apiuser, repoid,
                      commit_id=Optional(None), comment_type=Optional(None),
                      userid=Optional(None)):
    """
    Get all comments for a repository

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param repoid: Set the repository name or repository ID.
    :type repoid: str or int
    :param commit_id: Optionally filter the comments by the commit_id
    :type commit_id: Optional(str), default: None
    :param comment_type: Optionally filter the comments by the comment_type,
        one of: 'note', 'todo'
    :type comment_type: Optional(str), default: None
    :param userid: Optionally filter the comments by the comment author
    :type userid: Optional(str or int), default: None

    Example output:

    .. code-block:: bash

        {
            "id" : <id_given_in_input>,
            "result" : [
                {
                    "comment_author": <USER_DETAILS>,
                    "comment_created_on": "2017-02-01T14:38:16.309",
                    "comment_f_path": "file.txt",
                    "comment_id": 282,
                    "comment_lineno": "n1",
                    "comment_resolved_by": null,
                    "comment_status": [],
                    "comment_text": "This file needs a header",
                    "comment_type": "todo",
                    "comment_last_version": 0
                }
            ],
            "error" : null
        }

    """
    repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
        _perms = ('repository.read', 'repository.write', 'repository.admin')
        validate_repo_permissions(apiuser, repoid, repo, _perms)

    commit_id = Optional.extract(commit_id)

    userid = Optional.extract(userid)
    if userid:
        user = get_user_or_error(userid)
    else:
        user = None

    comment_type = Optional.extract(comment_type)
    if comment_type and comment_type not in ChangesetComment.COMMENT_TYPES:
        raise JSONRPCError(
            'comment_type must be one of `{}`, got {}'.format(
                ChangesetComment.COMMENT_TYPES, comment_type)
        )

    comments = CommentsModel().get_repository_comments(
        repo=repo, comment_type=comment_type, user=user, commit_id=commit_id)
    return comments


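# A sketch of a `get_repo_comments` call restricted to TODO comments on a
# single commit, using the `comment_type` and `commit_id` filters handled
# above. Placeholder values, same transport assumptions as the earlier
# examples.
EXAMPLE_GET_REPO_COMMENTS_PAYLOAD = {
    'id': 8,
    'auth_token': '<auth_token>',
    'method': 'get_repo_comments',
    'args': {
        'repoid': 'foo/bar/repo-test',
        'commit_id': '<commit sha>',
        'comment_type': 'todo',
    },
}

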
@jsonrpc_method()
def get_comment(request, apiuser, comment_id):
    """
    Get a single comment from a repository or pull request

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param comment_id: comment id, as found in the comment's URL
    :type comment_id: str or int

    Example output:

    .. code-block:: bash

        {
            "id" : <id_given_in_input>,
            "result" : {
                "comment_author": <USER_DETAILS>,
                "comment_created_on": "2017-02-01T14:38:16.309",
                "comment_f_path": "file.txt",
                "comment_id": 282,
                "comment_lineno": "n1",
                "comment_resolved_by": null,
                "comment_status": [],
                "comment_text": "This file needs a header",
                "comment_type": "todo",
                "comment_last_version": 0
            },
            "error" : null
        }

    """

    comment = ChangesetComment.get(comment_id)
    if not comment:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    perms = ('repository.read', 'repository.write', 'repository.admin')
    has_comment_perm = HasRepoPermissionAnyApi(*perms)\
        (user=apiuser, repo_name=comment.repo.repo_name)

    if not has_comment_perm:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    return comment


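# A sketch of a `get_comment` call; the comment id is the one shown in the
# comment's permalink URL, and the caller needs at least read access to the
# comment's repository, as checked above. Placeholder values, same transport
# assumptions as the earlier examples.
EXAMPLE_GET_COMMENT_PAYLOAD = {
    'id': 9,
    'auth_token': '<auth_token>',
    'method': 'get_comment',
    'args': {'comment_id': 282},
}

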
@jsonrpc_method()
def edit_comment(request, apiuser, message, comment_id, version,
                 userid=Optional(OAttr('apiuser'))):
    """
    Edit comment on the pull request or commit,
    specified by the `comment_id` and version. Initially version should be 0

    :param apiuser: This is filled automatically from the |authtoken|.
    :type apiuser: AuthUser
    :param comment_id: Specify the comment_id for editing
    :type comment_id: int
    :param version: version of the comment that will be created, starts from 0
    :type version: int
    :param message: The text content of the comment.
    :type message: str
    :param userid: Comment on the pull request as this user
    :type userid: Optional(str or int)

    Example output:

    .. code-block:: bash

        id : <id_given_in_input>
        result : {
            "comment": "<comment data>",
            "version": "<Integer>",
        },
        error : null
    """

    auth_user = apiuser
    comment = ChangesetComment.get(comment_id)
    if not comment:
        raise JSONRPCError('comment `%s` does not exist' % (comment_id,))

    is_super_admin = has_superadmin_permission(apiuser)
    is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
        (user=apiuser, repo_name=comment.repo.repo_name)

    if not isinstance(userid, Optional):
        if is_super_admin or is_repo_admin:
            apiuser = get_user_or_error(userid)
            auth_user = apiuser.AuthUser()
        else:
            raise JSONRPCError('userid is not the same as your user')

    comment_author = comment.author.user_id == auth_user.user_id
    if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
        raise JSONRPCError("you don't have access to edit this comment")

    try:
        comment_history = CommentsModel().edit(
            comment_id=comment_id,
            text=message,
            auth_user=auth_user,
            version=version,
        )
        Session().commit()
    except CommentVersionMismatch:
        raise JSONRPCError(
            'comment ({}) version ({}) mismatch'.format(comment_id, version)
        )
    if not comment_history and not message:
1884 raise JSONRPCError(
1885 raise JSONRPCError(
1885 "comment ({}) can't be changed with empty string".format(comment_id)
1886 "comment ({}) can't be changed with empty string".format(comment_id)
1886 )
1887 )
1887
1888
1888 if comment.pull_request:
1889 if comment.pull_request:
1889 pull_request = comment.pull_request
1890 pull_request = comment.pull_request
1890 PullRequestModel().trigger_pull_request_hook(
1891 PullRequestModel().trigger_pull_request_hook(
1891 pull_request, apiuser, 'comment_edit',
1892 pull_request, apiuser, 'comment_edit',
1892 data={'comment': comment})
1893 data={'comment': comment})
1893 else:
1894 else:
1894 db_repo = comment.repo
1895 db_repo = comment.repo
1895 commit_id = comment.revision
1896 commit_id = comment.revision
1896 commit = db_repo.get_commit(commit_id)
1897 commit = db_repo.get_commit(commit_id)
1897 CommentsModel().trigger_commit_comment_hook(
1898 CommentsModel().trigger_commit_comment_hook(
1898 db_repo, apiuser, 'edit',
1899 db_repo, apiuser, 'edit',
1899 data={'comment': comment, 'commit': commit})
1900 data={'comment': comment, 'commit': commit})
1900
1901
1901 data = {
1902 data = {
1902 'comment': comment,
1903 'comment': comment,
1903 'version': comment_history.version if comment_history else None,
1904 'version': comment_history.version if comment_history else None,
1904 }
1905 }
1905 return data
1906 return data
1906
1907
1907
1908
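The two comment methods above are easiest to see end to end from a client. Below is a minimal, hypothetical sketch of calling `get_comment` and then `edit_comment` over the JSON-RPC style API; the endpoint path, server name and token are placeholder assumptions based on the standard API docs, not part of this changeset, while the method and argument names come from the docstrings above.

.. code-block:: python

    import requests

    API_URL = 'https://code.example.com/_admin/api'  # assumed endpoint location
    AUTH_TOKEN = '<auth_token>'                      # token with at least repository.read

    def api_call(method, **args):
        # One JSON-RPC style POST per call; raises if the server reports an error.
        payload = {'id': 1, 'auth_token': AUTH_TOKEN, 'method': method, 'args': args}
        reply = requests.post(API_URL, json=payload).json()
        if reply.get('error'):
            raise RuntimeError(reply['error'])
        return reply['result']

    # Fetch the comment, then edit it. `version` must match the current
    # server-side version, otherwise a version-mismatch error is returned.
    comment = api_call('get_comment', comment_id=282)
    edited = api_call('edit_comment', comment_id=282,
                      version=comment['comment_last_version'],
                      message='This file needs a license header')
    print(edited['version'])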
1909 # TODO(marcink): write this with all required logic for deleting comments in PRs or commits
1910 # @jsonrpc_method()
1911 # def delete_comment(request, apiuser, comment_id):
1912 # auth_user = apiuser
1913 #
1914 # comment = ChangesetComment.get(comment_id)
1915 # if not comment:
1916 # raise JSONRPCError('comment `%s` does not exist' % (comment_id,))
1917 #
1918 # is_super_admin = has_superadmin_permission(apiuser)
1919 # is_repo_admin = HasRepoPermissionAnyApi('repository.admin')\
1920 # (user=apiuser, repo_name=comment.repo.repo_name)
1921 #
1922 # comment_author = comment.author.user_id == auth_user.user_id
1923 # if not (comment.immutable is False and (is_super_admin or is_repo_admin) or comment_author):
1924 # raise JSONRPCError("you don't have access to edit this comment")
1925
1926 @jsonrpc_method()
1927 def grant_user_permission(request, apiuser, repoid, userid, perm):
1928 """
1929 Grant permissions for the specified user on the given repository,
1930 or update existing permissions if found.
1931
1932 This command can only be run using an |authtoken| with admin
1933 permissions on the |repo|.
1934
1935 :param apiuser: This is filled automatically from the |authtoken|.
1936 :type apiuser: AuthUser
1937 :param repoid: Set the repository name or repository ID.
1938 :type repoid: str or int
1939 :param userid: Set the user name.
1940 :type userid: str
1941 :param perm: Set the user permissions, using the following format
1942 ``(repository.(none|read|write|admin))``
1943 :type perm: str
1944
1945 Example output:
1946
1947 .. code-block:: bash
1948
1949 id : <id_given_in_input>
1950 result: {
1951 "msg" : "Granted perm: `<perm>` for user: `<username>` in repo: `<reponame>`",
1952 "success": true
1953 }
1954 error: null
1955 """
1956
1957 repo = get_repo_or_error(repoid)
1958 user = get_user_or_error(userid)
1959 perm = get_perm_or_error(perm)
1960 if not has_superadmin_permission(apiuser):
1961 _perms = ('repository.admin',)
1962 validate_repo_permissions(apiuser, repoid, repo, _perms)
1963
1964 perm_additions = [[user.user_id, perm.permission_name, "user"]]
1965 try:
1966 changes = RepoModel().update_permissions(
1967 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
1968
1969 action_data = {
1970 'added': changes['added'],
1971 'updated': changes['updated'],
1972 'deleted': changes['deleted'],
1973 }
1974 audit_logger.store_api(
1975 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
1976 Session().commit()
1977 PermissionModel().flush_user_permission_caches(changes)
1978
1979 return {
1980 'msg': 'Granted perm: `%s` for user: `%s` in repo: `%s`' % (
1981 perm.permission_name, user.username, repo.repo_name
1982 ),
1983 'success': True
1984 }
1985 except Exception:
1986 log.exception("Exception occurred while trying to edit permissions for repo")
1987 raise JSONRPCError(
1988 'failed to edit permission for user: `%s` in repo: `%s`' % (
1989 userid, repoid
1990 )
1991 )
1992
1993
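As a quick usage note, granting and later revoking a user's repository permission can be scripted with the same call pattern; again the server URL and token are placeholders, and only the method and argument names are taken from the docstrings above and below.

.. code-block:: python

    import requests

    API_URL = 'https://code.example.com/_admin/api'  # assumed endpoint location
    AUTH_TOKEN = '<auth_token>'                      # needs repository.admin on the repo

    def api_call(method, **args):
        # Minimal JSON-RPC style POST helper, same shape as in the earlier sketch.
        payload = {'id': 1, 'auth_token': AUTH_TOKEN, 'method': method, 'args': args}
        return requests.post(API_URL, json=payload).json()

    # Grant write access, then take it away again.
    print(api_call('grant_user_permission',
                   repoid='my-repo', userid='dev1', perm='repository.write'))
    print(api_call('revoke_user_permission', repoid='my-repo', userid='dev1'))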
1994 @jsonrpc_method()
1995 def revoke_user_permission(request, apiuser, repoid, userid):
1996 """
1997 Revoke permission for a user on the specified repository.
1998
1999 This command can only be run using an |authtoken| with admin
2000 permissions on the |repo|.
2001
2002 :param apiuser: This is filled automatically from the |authtoken|.
2003 :type apiuser: AuthUser
2004 :param repoid: Set the repository name or repository ID.
2005 :type repoid: str or int
2006 :param userid: Set the user name of the user whose permissions are revoked.
2007 :type userid: str or int
2008
2009 Example output:
2010
2011 .. code-block:: bash
2012
2013 id : <id_given_in_input>
2014 result: {
2015 "msg" : "Revoked perm for user: `<username>` in repo: `<reponame>`",
2016 "success": true
2017 }
2018 error: null
2019 """
2020
2021 repo = get_repo_or_error(repoid)
2022 user = get_user_or_error(userid)
2023 if not has_superadmin_permission(apiuser):
2024 _perms = ('repository.admin',)
2025 validate_repo_permissions(apiuser, repoid, repo, _perms)
2026
2027 perm_deletions = [[user.user_id, None, "user"]]
2028 try:
2029 changes = RepoModel().update_permissions(
2030 repo=repo, perm_deletions=perm_deletions, cur_user=user)
2031
2032 action_data = {
2033 'added': changes['added'],
2034 'updated': changes['updated'],
2035 'deleted': changes['deleted'],
2036 }
2037 audit_logger.store_api(
2038 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2039 Session().commit()
2040 PermissionModel().flush_user_permission_caches(changes)
2041
2042 return {
2043 'msg': 'Revoked perm for user: `%s` in repo: `%s`' % (
2044 user.username, repo.repo_name
2045 ),
2046 'success': True
2047 }
2048 except Exception:
2049 log.exception("Exception occurred while trying to revoke permissions on repo")
2050 raise JSONRPCError(
2051 'failed to edit permission for user: `%s` in repo: `%s`' % (
2052 userid, repoid
2053 )
2054 )
2055
2056
2057 @jsonrpc_method()
2058 def grant_user_group_permission(request, apiuser, repoid, usergroupid, perm):
2059 """
2060 Grant permission for a user group on the specified repository,
2061 or update existing permissions.
2062
2063 This command can only be run using an |authtoken| with admin
2064 permissions on the |repo|.
2065
2066 :param apiuser: This is filled automatically from the |authtoken|.
2067 :type apiuser: AuthUser
2068 :param repoid: Set the repository name or repository ID.
2069 :type repoid: str or int
2070 :param usergroupid: Specify the ID of the user group.
2071 :type usergroupid: str or int
2072 :param perm: Set the user group permissions using the following
2073 format: (repository.(none|read|write|admin))
2074 :type perm: str
2075
2076 Example output:
2077
2078 .. code-block:: bash
2079
2080 id : <id_given_in_input>
2081 result : {
2082 "msg" : "Granted perm: `<perm>` for group: `<usersgroupname>` in repo: `<reponame>`",
2083 "success": true
2084
2085 }
2086 error : null
2087
2088 Example error output:
2089
2090 .. code-block:: bash
2091
2092 id : <id_given_in_input>
2093 result : null
2094 error : {
2095 "failed to edit permission for user group: `<usergroup>` in repo `<repo>`"
2096 }
2097
2098 """
2099
2100 repo = get_repo_or_error(repoid)
2101 perm = get_perm_or_error(perm)
2102 if not has_superadmin_permission(apiuser):
2103 _perms = ('repository.admin',)
2104 validate_repo_permissions(apiuser, repoid, repo, _perms)
2105
2106 user_group = get_user_group_or_error(usergroupid)
2107 if not has_superadmin_permission(apiuser):
2108 # check if we have at least read permission for this user group !
2109 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2110 if not HasUserGroupPermissionAnyApi(*_perms)(
2111 user=apiuser, user_group_name=user_group.users_group_name):
2112 raise JSONRPCError(
2113 'user group `%s` does not exist' % (usergroupid,))
2114
2115 perm_additions = [[user_group.users_group_id, perm.permission_name, "user_group"]]
2116 try:
2117 changes = RepoModel().update_permissions(
2118 repo=repo, perm_additions=perm_additions, cur_user=apiuser)
2119 action_data = {
2120 'added': changes['added'],
2121 'updated': changes['updated'],
2122 'deleted': changes['deleted'],
2123 }
2124 audit_logger.store_api(
2125 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2126 Session().commit()
2127 PermissionModel().flush_user_permission_caches(changes)
2128
2129 return {
2130 'msg': 'Granted perm: `%s` for user group: `%s` in '
2131 'repo: `%s`' % (
2132 perm.permission_name, user_group.users_group_name,
2133 repo.repo_name
2134 ),
2135 'success': True
2136 }
2137 except Exception:
2138 log.exception(
2139 "Exception occurred while trying to change permissions on repo")
2140 raise JSONRPCError(
2141 'failed to edit permission for user group: `%s` in '
2142 'repo: `%s`' % (
2143 usergroupid, repo.repo_name
2144 )
2145 )
2146
2147
2148 @jsonrpc_method()
2149 def revoke_user_group_permission(request, apiuser, repoid, usergroupid):
2150 """
2151 Revoke the permissions of a user group on a given repository.
2152
2153 This command can only be run using an |authtoken| with admin
2154 permissions on the |repo|.
2155
2156 :param apiuser: This is filled automatically from the |authtoken|.
2157 :type apiuser: AuthUser
2158 :param repoid: Set the repository name or repository ID.
2159 :type repoid: str or int
2160 :param usergroupid: Specify the user group ID.
2161 :type usergroupid: str or int
2162
2163 Example output:
2164
2165 .. code-block:: bash
2166
2167 id : <id_given_in_input>
2168 result: {
2169 "msg" : "Revoked perm for group: `<usersgroupname>` in repo: `<reponame>`",
2170 "success": true
2171 }
2172 error: null
2173 """
2174
2175 repo = get_repo_or_error(repoid)
2176 if not has_superadmin_permission(apiuser):
2177 _perms = ('repository.admin',)
2178 validate_repo_permissions(apiuser, repoid, repo, _perms)
2179
2180 user_group = get_user_group_or_error(usergroupid)
2181 if not has_superadmin_permission(apiuser):
2182 # check if we have at least read permission for this user group !
2183 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
2184 if not HasUserGroupPermissionAnyApi(*_perms)(
2185 user=apiuser, user_group_name=user_group.users_group_name):
2186 raise JSONRPCError(
2187 'user group `%s` does not exist' % (usergroupid,))
2188
2189 perm_deletions = [[user_group.users_group_id, None, "user_group"]]
2190 try:
2191 changes = RepoModel().update_permissions(
2192 repo=repo, perm_deletions=perm_deletions, cur_user=apiuser)
2193 action_data = {
2194 'added': changes['added'],
2195 'updated': changes['updated'],
2196 'deleted': changes['deleted'],
2197 }
2198 audit_logger.store_api(
2199 'repo.edit.permissions', action_data=action_data, user=apiuser, repo=repo)
2200 Session().commit()
2201 PermissionModel().flush_user_permission_caches(changes)
2202
2203 return {
2204 'msg': 'Revoked perm for user group: `%s` in repo: `%s`' % (
2205 user_group.users_group_name, repo.repo_name
2206 ),
2207 'success': True
2208 }
2209 except Exception:
2210 log.exception("Exception occurred while trying to revoke "
2211 "user group permission on repo")
2212 raise JSONRPCError(
2213 'failed to edit permission for user group: `%s` in '
2214 'repo: `%s`' % (
2215 user_group.users_group_name, repo.repo_name
2216 )
2217 )
2218
2219
2220 @jsonrpc_method()
2221 def pull(request, apiuser, repoid, remote_uri=Optional(None)):
2222 """
2223 Triggers a pull on the given repository from a remote location. You
2224 can use this to keep remote repositories up-to-date.
2225
2226 This command can only be run using an |authtoken| with admin
2227 rights to the specified repository. For more information,
2228 see :ref:`config-token-ref`.
2229
2230 This command takes the following options:
2231
2232 :param apiuser: This is filled automatically from the |authtoken|.
2233 :type apiuser: AuthUser
2234 :param repoid: The repository name or repository ID.
2235 :type repoid: str or int
2236 :param remote_uri: Optional remote URI to pass in for pull
2237 :type remote_uri: str
2238
2239 Example output:
2240
2241 .. code-block:: bash
2242
2243 id : <id_given_in_input>
2244 result : {
2245 "msg": "Pulled from url `<remote_url>` on repo `<repository name>`",
2246 "repository": "<repository name>"
2247 }
2248 error : null
2249
2250 Example error output:
2251
2252 .. code-block:: bash
2253
2254 id : <id_given_in_input>
2255 result : null
2256 error : {
2257 "Unable to pull changes from `<remote_url>`"
2258 }
2259
2260 """
2261
2262 repo = get_repo_or_error(repoid)
2263 remote_uri = Optional.extract(remote_uri)
2264 remote_uri_display = remote_uri or repo.clone_uri_hidden
2265 if not has_superadmin_permission(apiuser):
2266 _perms = ('repository.admin',)
2267 validate_repo_permissions(apiuser, repoid, repo, _perms)
2268
2269 try:
2270 ScmModel().pull_changes(
2271 repo.repo_name, apiuser.username, remote_uri=remote_uri)
2272 return {
2273 'msg': 'Pulled from url `%s` on repo `%s`' % (
2274 remote_uri_display, repo.repo_name),
2275 'repository': repo.repo_name
2276 }
2277 except Exception:
2278 log.exception("Exception occurred while trying to "
2279 "pull changes from remote location")
2280 raise JSONRPCError(
2281 'Unable to pull changes from `%s`' % remote_uri_display
2282 )
2283
2284
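For automation, `pull` is typically called from a cron job or CI step to keep a mirror current. A hedged sketch follows; the endpoint, server names, repository name and token are placeholders, only the method and argument names come from the docstring above.

.. code-block:: python

    import requests

    API_URL = 'https://code.example.com/_admin/api'  # assumed endpoint location
    AUTH_TOKEN = '<auth_token>'                      # needs repository.admin on the repo

    # Pull from the configured clone URI, or override it with an explicit remote_uri.
    payload = {'id': 1, 'auth_token': AUTH_TOKEN, 'method': 'pull',
               'args': {'repoid': 'mirrors/upstream-repo',
                        'remote_uri': 'https://upstream.example.com/repo'}}
    print(requests.post(API_URL, json=payload).json()['result'])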
2285 @jsonrpc_method()
2286 def strip(request, apiuser, repoid, revision, branch):
2287 """
2288 Strips the given revision from the specified repository.
2289
2290 * This will remove the revision and all of its descendants.
2291
2292 This command can only be run using an |authtoken| with admin rights to
2293 the specified repository.
2294
2295 This command takes the following options:
2296
2297 :param apiuser: This is filled automatically from the |authtoken|.
2298 :type apiuser: AuthUser
2299 :param repoid: The repository name or repository ID.
2300 :type repoid: str or int
2301 :param revision: The revision you wish to strip.
2302 :type revision: str
2303 :param branch: The branch from which to strip the revision.
2304 :type branch: str
2305
2306 Example output:
2307
2308 .. code-block:: bash
2309
2310 id : <id_given_in_input>
2311 result : {
2312 "msg": "Stripped commit <commit_hash> from repo `<repository name>`",
2313 "repository": "<repository name>"
2314 }
2315 error : null
2316
2317 Example error output:
2318
2319 .. code-block:: bash
2320
2321 id : <id_given_in_input>
2322 result : null
2323 error : {
2324 "Unable to strip commit <commit_hash> from repo `<repository name>`"
2325 }
2326
2327 """
2328
2329 repo = get_repo_or_error(repoid)
2330 if not has_superadmin_permission(apiuser):
2331 _perms = ('repository.admin',)
2332 validate_repo_permissions(apiuser, repoid, repo, _perms)
2333
2334 try:
2335 ScmModel().strip(repo, revision, branch)
2336 audit_logger.store_api(
2337 'repo.commit.strip', action_data={'commit_id': revision},
2338 repo=repo,
2339 user=apiuser, commit=True)
2340
2341 return {
2342 'msg': 'Stripped commit %s from repo `%s`' % (
2343 revision, repo.repo_name),
2344 'repository': repo.repo_name
2345 }
2346 except Exception:
2347 log.exception("Exception while trying to strip")
2348 raise JSONRPCError(
2349 'Unable to strip commit %s from repo `%s`' % (
2350 revision, repo.repo_name)
2351 )
2352
2353
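Because `strip` permanently removes the commit and everything that descends from it, it is worth double-checking the revision before calling it. A hypothetical invocation, with placeholders for the server, token, repository, hash and branch:

.. code-block:: python

    import requests

    API_URL = 'https://code.example.com/_admin/api'  # assumed endpoint location
    AUTH_TOKEN = '<auth_token>'                      # needs repository.admin on the repo

    # Destructive: removes the revision and all of its descendants from the branch.
    payload = {'id': 1, 'auth_token': AUTH_TOKEN, 'method': 'strip',
               'args': {'repoid': 'my-repo',
                        'revision': '<commit_hash>',  # placeholder commit hash
                        'branch': 'default'}}
    print(requests.post(API_URL, json=payload).json())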
2354 @jsonrpc_method()
2355 def get_repo_settings(request, apiuser, repoid, key=Optional(None)):
2356 """
2357 Returns all settings for a repository. If a key is given, it only returns the
2358 setting identified by the key, or null.
2359
2360 :param apiuser: This is filled automatically from the |authtoken|.
2361 :type apiuser: AuthUser
2362 :param repoid: The repository name or repository id.
2363 :type repoid: str or int
2364 :param key: Key of the setting to return.
2365 :type key: Optional(str)
2366
2367 Example output:
2368
2369 .. code-block:: bash
2370
2371 {
2372 "error": null,
2373 "id": 237,
2374 "result": {
2375 "extensions_largefiles": true,
2376 "extensions_evolve": true,
2377 "hooks_changegroup_push_logger": true,
2378 "hooks_changegroup_repo_size": false,
2379 "hooks_outgoing_pull_logger": true,
2380 "phases_publish": "True",
2381 "rhodecode_hg_use_rebase_for_merging": true,
2382 "rhodecode_pr_merge_enabled": true,
2383 "rhodecode_use_outdated_comments": true
2384 }
2385 }
2386 """
2387
2388 # Restrict access to this api method to super-admins, and repo admins only.
2389 repo = get_repo_or_error(repoid)
2390 if not has_superadmin_permission(apiuser):
2391 _perms = ('repository.admin',)
2392 validate_repo_permissions(apiuser, repoid, repo, _perms)
2393
2394 try:
2395 settings_model = VcsSettingsModel(repo=repo)
2396 settings = settings_model.get_global_settings()
2397 settings.update(settings_model.get_repo_settings())
2398
2399 # If only a single setting is requested fetch it from all settings.
2400 key = Optional.extract(key)
2401 if key is not None:
2402 settings = settings.get(key, None)
2403 except Exception:
2404 msg = 'Failed to fetch settings for repository `{}`'.format(repoid)
2405 log.exception(msg)
2406 raise JSONRPCError(msg)
2407
2408 return settings
2409
2410
2411 @jsonrpc_method()
2412 def set_repo_settings(request, apiuser, repoid, settings):
2413 """
2414 Update repository settings. Returns true on success.
2415
2416 :param apiuser: This is filled automatically from the |authtoken|.
2417 :type apiuser: AuthUser
2418 :param repoid: The repository name or repository id.
2419 :type repoid: str or int
2420 :param settings: The new settings for the repository.
2421 :type settings: dict
2422
2423 Example output:
2424
2425 .. code-block:: bash
2426
2427 {
2428 "error": null,
2429 "id": 237,
2430 "result": true
2431 }
2432 """
2433 # Restrict access to this api method to super-admins, and repo admins only.
2434 repo = get_repo_or_error(repoid)
2435 if not has_superadmin_permission(apiuser):
2436 _perms = ('repository.admin',)
2437 validate_repo_permissions(apiuser, repoid, repo, _perms)
2438
2439 if type(settings) is not dict:
2440 raise JSONRPCError('Settings have to be a JSON Object.')
2441
2442 try:
2443 settings_model = VcsSettingsModel(repo=repoid)
2444
2445 # Merge global, repo and incoming settings.
2446 new_settings = settings_model.get_global_settings()
2447 new_settings.update(settings_model.get_repo_settings())
2448 new_settings.update(settings)
2449
2450 # Update the settings.
2451 inherit_global_settings = new_settings.get(
2452 'inherit_global_settings', False)
2453 settings_model.create_or_update_repo_settings(
2454 new_settings, inherit_global_settings=inherit_global_settings)
2455 Session().commit()
2456 except Exception:
2457 msg = 'Failed to update settings for repository `{}`'.format(repoid)
2458 log.exception(msg)
2459 raise JSONRPCError(msg)
2460
2461 # Indicate success.
2462 return True
2463
2464
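Since `set_repo_settings` merges the incoming dict over the global and per-repository settings before saving, a safe pattern is to read the current settings, change only the keys you care about, and write the result back. A sketch under the same placeholder endpoint and token assumptions as the earlier examples:

.. code-block:: python

    import requests

    API_URL = 'https://code.example.com/_admin/api'  # assumed endpoint location
    AUTH_TOKEN = '<auth_token>'                      # needs repository.admin on the repo

    def api_call(method, **args):
        # JSON-RPC style POST; returns the `result` member of the reply.
        payload = {'id': 1, 'auth_token': AUTH_TOKEN, 'method': method, 'args': args}
        return requests.post(API_URL, json=payload).json()['result']

    # Read-modify-write: flip one flag and push the full settings dict back.
    current = api_call('get_repo_settings', repoid='my-repo')
    current['rhodecode_pr_merge_enabled'] = False
    print(api_call('set_repo_settings', repoid='my-repo', settings=current))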
2465 @jsonrpc_method()
2466 def maintenance(request, apiuser, repoid):
2467 """
2468 Triggers maintenance on the given repository.
2469
2470 This command can only be run using an |authtoken| with admin
2471 rights to the specified repository. For more information,
2472 see :ref:`config-token-ref`.
2473
2474 This command takes the following options:
2475
2476 :param apiuser: This is filled automatically from the |authtoken|.
2477 :type apiuser: AuthUser
2478 :param repoid: The repository name or repository ID.
2479 :type repoid: str or int
2480
2481 Example output:
2482
2483 .. code-block:: bash
2484
2485 id : <id_given_in_input>
2486 result : {
2487 "msg": "executed maintenance command",
2488 "executed_actions": [
2489 <action_message>, <action_message2>...
2490 ],
2491 "repository": "<repository name>"
2492 }
2493 error : null
2494
2495 Example error output:
2496
2497 .. code-block:: bash
2498
2499 id : <id_given_in_input>
2500 result : null
2501 error : {
2502 "Unable to execute maintenance on `<reponame>`"
2503 }
2504
2505 """
2506
2507 repo = get_repo_or_error(repoid)
2508 if not has_superadmin_permission(apiuser):
2509 _perms = ('repository.admin',)
2510 validate_repo_permissions(apiuser, repoid, repo, _perms)
2511
2512 try:
2513 maintenance = repo_maintenance.RepoMaintenance()
2514 executed_actions = maintenance.execute(repo)
2515
2516 return {
2517 'msg': 'executed maintenance command',
2518 'executed_actions': executed_actions,
2519 'repository': repo.repo_name
2520 }
2521 except Exception:
2522 log.exception("Exception occurred while trying to run maintenance")
2523 raise JSONRPCError(
2524 'Unable to execute maintenance on `%s`' % repo.repo_name)
@@ -1,1414 +1,1418 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import logging
22 import datetime
23 import formencode
24 import formencode.htmlfill
25
26 from pyramid.httpexceptions import HTTPFound
27 from pyramid.view import view_config
28 from pyramid.renderers import render
29 from pyramid.response import Response
30
31 from rhodecode import events
32 from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView
33 from rhodecode.apps.ssh_support import SshKeyFileChangeEvent
34 from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin
35 from rhodecode.authentication.plugins import auth_rhodecode
36 from rhodecode.events import trigger
37 from rhodecode.model.db import true, UserNotice
38
39 from rhodecode.lib import audit_logger, rc_cache
39 from rhodecode.lib import audit_logger, rc_cache, auth
40 from rhodecode.lib.exceptions import (
41 UserCreationError, UserOwnsReposException, UserOwnsRepoGroupsException,
42 UserOwnsUserGroupsException, UserOwnsPullRequestsException,
43 UserOwnsArtifactsException, DefaultUserException)
44 from rhodecode.lib.ext_json import json
45 from rhodecode.lib.auth import (
46 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
47 from rhodecode.lib import helpers as h
48 from rhodecode.lib.helpers import SqlPage
49 from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict
50 from rhodecode.model.auth_token import AuthTokenModel
51 from rhodecode.model.forms import (
52 UserForm, UserIndividualPermissionsForm, UserPermissionsForm,
53 UserExtraEmailForm, UserExtraIpForm)
54 from rhodecode.model.permission import PermissionModel
55 from rhodecode.model.repo_group import RepoGroupModel
56 from rhodecode.model.ssh_key import SshKeyModel
57 from rhodecode.model.user import UserModel
58 from rhodecode.model.user_group import UserGroupModel
59 from rhodecode.model.db import (
60 or_, coalesce,IntegrityError, User, UserGroup, UserIpMap, UserEmailMap,
61 UserApiKeys, UserSshKeys, RepoGroup)
62 from rhodecode.model.meta import Session
63
64 log = logging.getLogger(__name__)
65
66
67 class AdminUsersView(BaseAppView, DataGridAppView):
68
69 def load_default_context(self):
70 c = self._get_local_tmpl_context()
71 return c
72
73 @LoginRequired()
73 @LoginRequired()
74 @HasPermissionAllDecorator('hg.admin')
74 @HasPermissionAllDecorator('hg.admin')
75 @view_config(
75 @view_config(
76 route_name='users', request_method='GET',
76 route_name='users', request_method='GET',
77 renderer='rhodecode:templates/admin/users/users.mako')
77 renderer='rhodecode:templates/admin/users/users.mako')
78 def users_list(self):
78 def users_list(self):
79 c = self.load_default_context()
79 c = self.load_default_context()
80 return self._get_template_context(c)
80 return self._get_template_context(c)
81
81
82 @LoginRequired()
82 @LoginRequired()
83 @HasPermissionAllDecorator('hg.admin')
83 @HasPermissionAllDecorator('hg.admin')
84 @view_config(
84 @view_config(
85 # renderer defined below
85 # renderer defined below
86 route_name='users_data', request_method='GET',
86 route_name='users_data', request_method='GET',
87 renderer='json_ext', xhr=True)
87 renderer='json_ext', xhr=True)
88 def users_list_data(self):
88 def users_list_data(self):
89 self.load_default_context()
89 self.load_default_context()
90 column_map = {
90 column_map = {
91 'first_name': 'name',
91 'first_name': 'name',
92 'last_name': 'lastname',
92 'last_name': 'lastname',
93 }
93 }
94 draw, start, limit = self._extract_chunk(self.request)
94 draw, start, limit = self._extract_chunk(self.request)
95 search_q, order_by, order_dir = self._extract_ordering(
95 search_q, order_by, order_dir = self._extract_ordering(
96 self.request, column_map=column_map)
96 self.request, column_map=column_map)
97 _render = self.request.get_partial_renderer(
97 _render = self.request.get_partial_renderer(
98 'rhodecode:templates/data_table/_dt_elements.mako')
98 'rhodecode:templates/data_table/_dt_elements.mako')
99
99
100 def user_actions(user_id, username):
100 def user_actions(user_id, username):
101 return _render("user_actions", user_id, username)
101 return _render("user_actions", user_id, username)
102
102
103 users_data_total_count = User.query()\
103 users_data_total_count = User.query()\
104 .filter(User.username != User.DEFAULT_USER) \
104 .filter(User.username != User.DEFAULT_USER) \
105 .count()
105 .count()
106
106
107 users_data_total_inactive_count = User.query()\
107 users_data_total_inactive_count = User.query()\
108 .filter(User.username != User.DEFAULT_USER) \
108 .filter(User.username != User.DEFAULT_USER) \
109 .filter(User.active != true())\
109 .filter(User.active != true())\
110 .count()
110 .count()
111
111
112 # json generate
112 # json generate
113 base_q = User.query().filter(User.username != User.DEFAULT_USER)
113 base_q = User.query().filter(User.username != User.DEFAULT_USER)
114 base_inactive_q = base_q.filter(User.active != true())
114 base_inactive_q = base_q.filter(User.active != true())
115
115
116 if search_q:
116 if search_q:
117 like_expression = u'%{}%'.format(safe_unicode(search_q))
117 like_expression = u'%{}%'.format(safe_unicode(search_q))
118 base_q = base_q.filter(or_(
118 base_q = base_q.filter(or_(
119 User.username.ilike(like_expression),
119 User.username.ilike(like_expression),
120 User._email.ilike(like_expression),
120 User._email.ilike(like_expression),
121 User.name.ilike(like_expression),
121 User.name.ilike(like_expression),
122 User.lastname.ilike(like_expression),
122 User.lastname.ilike(like_expression),
123 ))
123 ))
124 base_inactive_q = base_q.filter(User.active != true())
124 base_inactive_q = base_q.filter(User.active != true())
125
125
126 users_data_total_filtered_count = base_q.count()
126 users_data_total_filtered_count = base_q.count()
127 users_data_total_filtered_inactive_count = base_inactive_q.count()
127 users_data_total_filtered_inactive_count = base_inactive_q.count()
128
128
129 sort_col = getattr(User, order_by, None)
129 sort_col = getattr(User, order_by, None)
130 if sort_col:
130 if sort_col:
131 if order_dir == 'asc':
131 if order_dir == 'asc':
132 # handle null values properly to order by NULL last
132 # handle null values properly to order by NULL last
133 if order_by in ['last_activity']:
133 if order_by in ['last_activity']:
134 sort_col = coalesce(sort_col, datetime.date.max)
134 sort_col = coalesce(sort_col, datetime.date.max)
135 sort_col = sort_col.asc()
135 sort_col = sort_col.asc()
136 else:
136 else:
137 # handle null values properly to order by NULL last
137 # handle null values properly to order by NULL last
138 if order_by in ['last_activity']:
138 if order_by in ['last_activity']:
139 sort_col = coalesce(sort_col, datetime.date.min)
139 sort_col = coalesce(sort_col, datetime.date.min)
140 sort_col = sort_col.desc()
140 sort_col = sort_col.desc()
141
141
142 base_q = base_q.order_by(sort_col)
142 base_q = base_q.order_by(sort_col)
143 base_q = base_q.offset(start).limit(limit)
143 base_q = base_q.offset(start).limit(limit)
144
144
145 users_list = base_q.all()
145 users_list = base_q.all()
146
146
147 users_data = []
147 users_data = []
148 for user in users_list:
148 for user in users_list:
149 users_data.append({
149 users_data.append({
150 "username": h.gravatar_with_user(self.request, user.username),
150 "username": h.gravatar_with_user(self.request, user.username),
151 "email": user.email,
151 "email": user.email,
152 "first_name": user.first_name,
152 "first_name": user.first_name,
153 "last_name": user.last_name,
153 "last_name": user.last_name,
154 "last_login": h.format_date(user.last_login),
154 "last_login": h.format_date(user.last_login),
155 "last_activity": h.format_date(user.last_activity),
155 "last_activity": h.format_date(user.last_activity),
156 "active": h.bool2icon(user.active),
156 "active": h.bool2icon(user.active),
157 "active_raw": user.active,
157 "active_raw": user.active,
158 "admin": h.bool2icon(user.admin),
158 "admin": h.bool2icon(user.admin),
159 "extern_type": user.extern_type,
159 "extern_type": user.extern_type,
160 "extern_name": user.extern_name,
160 "extern_name": user.extern_name,
161 "action": user_actions(user.user_id, user.username),
161 "action": user_actions(user.user_id, user.username),
162 })
162 })
163 data = ({
163 data = ({
164 'draw': draw,
164 'draw': draw,
165 'data': users_data,
165 'data': users_data,
166 'recordsTotal': users_data_total_count,
166 'recordsTotal': users_data_total_count,
167 'recordsFiltered': users_data_total_filtered_count,
167 'recordsFiltered': users_data_total_filtered_count,
168 'recordsTotalInactive': users_data_total_inactive_count,
168 'recordsTotalInactive': users_data_total_inactive_count,
169 'recordsFilteredInactive': users_data_total_filtered_inactive_count
169 'recordsFilteredInactive': users_data_total_filtered_inactive_count
170 })
170 })
171
171
172 return data
172 return data
173
173
    def _set_personal_repo_group_template_vars(self, c_obj):
        DummyUser = AttributeDict({
            'username': '${username}',
            'user_id': '${user_id}',
        })
        c_obj.default_create_repo_group = RepoGroupModel() \
            .get_default_create_personal_repo_group()
        c_obj.personal_repo_group_name = RepoGroupModel() \
            .get_personal_group_name(DummyUser)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='users_new', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_add.mako')
    def users_new(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
        self._set_personal_repo_group_template_vars(c)
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='users_create', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_add.mako')
    def users_create(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
        user_model = UserModel()
        user_form = UserForm(self.request.translate)()
        try:
            form_result = user_form.to_python(dict(self.request.POST))
            user = user_model.create(form_result)
            Session().flush()
            creation_data = user.get_api_data()
            username = form_result['username']

            audit_logger.store_web(
                'user.create', action_data={'data': creation_data},
                user=c.rhodecode_user)

            user_link = h.link_to(
                h.escape(username),
                h.route_path('user_edit', user_id=user.user_id))
            h.flash(h.literal(_('Created user %(user_link)s')
                              % {'user_link': user_link}), category='success')
            Session().commit()
        except formencode.Invalid as errors:
            self._set_personal_repo_group_template_vars(c)
            data = render(
                'rhodecode:templates/admin/users/user_add.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception creation of user")
            h.flash(_('Error occurred during creation of user %s')
                    % self.request.POST.get('username'), category='error')
        raise HTTPFound(h.route_path('users'))

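# Illustrative sketch (not part of the view code above): how formencode.htmlfill
# re-renders a form with sticky values and inline error messages - the pattern
# the `except formencode.Invalid` branch above relies on. The tiny form markup
# and the field name are invented for this example.
from formencode import htmlfill

form_html = '''
<form method="post">
  <form:error name="username">
  <input type="text" name="username">
  <input type="submit" value="Create">
</form>
'''

print(htmlfill.render(
    form_html,
    defaults={'username': 'new-user'},               # what the user typed
    errors={'username': 'Username already exists'},  # validator messages
    prefix_error=False,
    encoding="UTF-8",
    force_defaults=False,
))
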
class UsersView(UserAppView):
    ALLOW_SCOPED_TOKENS = False
    """
    This view has alternative version inside EE, if modified please take a look
    in there as well.
    """

    def get_auth_plugins(self):
        valid_plugins = []
        authn_registry = get_authn_registry(self.request.registry)
        for plugin in authn_registry.get_plugins_for_authentication():
            if isinstance(plugin, RhodeCodeExternalAuthPlugin):
                valid_plugins.append(plugin)
            elif plugin.name == 'rhodecode':
                valid_plugins.append(plugin)

        # extend our choices if user has set a bound plugin which isn't enabled at the
        # moment
        extern_type = self.db_user.extern_type
        if extern_type not in [x.uid for x in valid_plugins]:
            try:
                plugin = authn_registry.get_plugin_by_uid(extern_type)
                if plugin:
                    valid_plugins.append(plugin)

            except Exception:
                log.exception(
                    'Could not extend user plugins with `{}`'.format(extern_type))
        return valid_plugins

    def load_default_context(self):
        req = self.request

        c = self._get_local_tmpl_context()
        c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS
        c.allowed_languages = [
            ('en', 'English (en)'),
            ('de', 'German (de)'),
            ('fr', 'French (fr)'),
            ('it', 'Italian (it)'),
            ('ja', 'Japanese (ja)'),
            ('pl', 'Polish (pl)'),
            ('pt', 'Portuguese (pt)'),
            ('ru', 'Russian (ru)'),
            ('zh', 'Chinese (zh)'),
        ]

        c.allowed_extern_types = [
            (x.uid, x.get_display_name()) for x in self.get_auth_plugins()
        ]
        perms = req.registry.settings.get('available_permissions')
        if not perms:
            # inject info about available permissions
            auth.set_available_permissions(req.registry.settings)

        c.available_permissions = req.registry.settings['available_permissions']
        PermissionModel().set_global_permission_choices(
            c, gettext_translator=req.translate)

        return c

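# Illustrative sketch (not part of the view code above): the guard just added in
# load_default_context() follows a "compute once, cache in registry.settings"
# pattern. A generic, self-contained version of that idea; `settings` stands in
# for Pyramid's registry.settings and expensive_permission_scan() is made up.
def expensive_permission_scan():
    # pretend this walks the database/plugins to collect permission definitions
    return {'repository.read': 'Repository read access'}


def ensure_available_permissions(settings):
    # populate the key only on first use; later calls are cheap no-ops
    if not settings.get('available_permissions'):
        settings['available_permissions'] = expensive_permission_scan()
    return settings['available_permissions']


settings = {}
ensure_available_permissions(settings)  # computes and caches
ensure_available_permissions(settings)  # served straight from the cached entry
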
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_update', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
        available_languages = [x[0] for x in c.allowed_languages]
        _form = UserForm(self.request.translate, edit=True,
                         available_languages=available_languages,
                         old_data={'user_id': user_id,
                                   'email': c.user.email})()
        form_result = {}
        old_values = c.user.get_api_data()
        try:
            form_result = _form.to_python(dict(self.request.POST))
            skip_attrs = ['extern_name']
            # TODO: plugin should define if username can be updated
            if c.extern_type != "rhodecode":
                # forbid updating username for external accounts
                skip_attrs.append('username')

            UserModel().update_user(
                user_id, skip_attrs=skip_attrs, **form_result)

            audit_logger.store_web(
                'user.edit', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            h.flash(_('User updated successfully'), category='success')
        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception updating user")
            h.flash(_('Error occurred during update of user %s')
                    % form_result.get('username'), category='error')
        raise HTTPFound(h.route_path('user_edit', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_delete', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_delete(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        _repos = c.user.repositories
        _repo_groups = c.user.repository_groups
        _user_groups = c.user.user_groups
        _pull_requests = c.user.user_pull_requests
        _artifacts = c.user.artifacts

        handle_repos = None
        handle_repo_groups = None
        handle_user_groups = None
        handle_pull_requests = None
        handle_artifacts = None

        # flash helpers for each handled object type, based on the detach or delete case
        def set_handle_flash_repos():
            handle = handle_repos
            if handle == 'detach':
                h.flash(_('Detached %s repositories') % len(_repos),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repositories') % len(_repos),
                        category='success')

        def set_handle_flash_repo_groups():
            handle = handle_repo_groups
            if handle == 'detach':
                h.flash(_('Detached %s repository groups') % len(_repo_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repository groups') % len(_repo_groups),
                        category='success')

        def set_handle_flash_user_groups():
            handle = handle_user_groups
            if handle == 'detach':
                h.flash(_('Detached %s user groups') % len(_user_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s user groups') % len(_user_groups),
                        category='success')

        def set_handle_flash_pull_requests():
            handle = handle_pull_requests
            if handle == 'detach':
                h.flash(_('Detached %s pull requests') % len(_pull_requests),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s pull requests') % len(_pull_requests),
                        category='success')

        def set_handle_flash_artifacts():
            handle = handle_artifacts
            if handle == 'detach':
                h.flash(_('Detached %s artifacts') % len(_artifacts),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s artifacts') % len(_artifacts),
                        category='success')

        handle_user = User.get_first_super_admin()
        handle_user_id = safe_int(self.request.POST.get('detach_user_id'))
        if handle_user_id:
            # NOTE(marcink): we get new owner for objects...
            handle_user = User.get_or_404(handle_user_id)

        if _repos and self.request.POST.get('user_repos'):
            handle_repos = self.request.POST['user_repos']

        if _repo_groups and self.request.POST.get('user_repo_groups'):
            handle_repo_groups = self.request.POST['user_repo_groups']

        if _user_groups and self.request.POST.get('user_user_groups'):
            handle_user_groups = self.request.POST['user_user_groups']

        if _pull_requests and self.request.POST.get('user_pull_requests'):
            handle_pull_requests = self.request.POST['user_pull_requests']

        if _artifacts and self.request.POST.get('user_artifacts'):
            handle_artifacts = self.request.POST['user_artifacts']

        old_values = c.user.get_api_data()

        try:

            UserModel().delete(
                c.user,
                handle_repos=handle_repos,
                handle_repo_groups=handle_repo_groups,
                handle_user_groups=handle_user_groups,
                handle_pull_requests=handle_pull_requests,
                handle_artifacts=handle_artifacts,
                handle_new_owner=handle_user
            )

            audit_logger.store_web(
                'user.delete', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            set_handle_flash_repos()
            set_handle_flash_repo_groups()
            set_handle_flash_user_groups()
            set_handle_flash_pull_requests()
            set_handle_flash_artifacts()
            username = h.escape(old_values['username'])
            h.flash(_('Successfully deleted user `{}`').format(username), category='success')
        except (UserOwnsReposException, UserOwnsRepoGroupsException,
                UserOwnsUserGroupsException, UserOwnsPullRequestsException,
                UserOwnsArtifactsException, DefaultUserException) as e:
            h.flash(e, category='warning')
        except Exception:
            log.exception("Exception during deletion of user")
            h.flash(_('An error occurred during deletion of user'),
                    category='error')
        raise HTTPFound(h.route_path('users'))

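# Illustrative sketch (not part of the view code above): the shape of the POST
# body that user_delete() interprets. The field names and the 'detach'/'delete'
# values come from the handler above; the owner id, the CSRF field name and the
# idea of writing the payload as a dict are assumptions for illustration only.
delete_user_payload = {
    'csrf_token': '<token>',       # assumed field name; @CSRFRequired() enforces it
    'detach_user_id': 2,           # new owner for detached objects; when omitted,
                                   # the first super-admin is used as fallback
    'user_repos': 'detach',        # keep the repositories, re-own them
    'user_repo_groups': 'delete',  # drop the repository groups entirely
    'user_user_groups': 'detach',
    'user_pull_requests': 'detach',
    'user_artifacts': 'delete',
}
# Each key is only honoured when the user actually owns objects of that type;
# otherwise the matching handle stays None and UserModel().delete() skips it.
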
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)

        defaults = c.user.get_dict()
        defaults.update({'language': c.user.user_data.get('language')})

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit_advanced', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_advanced(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.detach_user = User.get_first_super_admin()
        detach_user_id = safe_int(self.request.GET.get('detach_user_id'))
        if detach_user_id:
            c.detach_user = User.get_or_404(detach_user_id)

        c.active = 'advanced'
        c.personal_repo_group = RepoGroup.get_user_personal_repo_group(user_id)
        c.personal_repo_group_name = RepoGroupModel()\
            .get_personal_group_name(c.user)

        c.user_to_review_rules = sorted(
            (x.user for x in c.user.user_review_rules),
            key=lambda u: u.username.lower())

        defaults = c.user.get_dict()

        # Interim workaround if the user participated in any pull requests as a
        # reviewer.
        has_review = len(c.user.reviewer_pull_requests)
        c.can_delete_user = not has_review
        c.can_delete_user_message = ''
        inactive_link = h.link_to(
            'inactive', h.route_path('user_edit', user_id=user_id, _anchor='active'))
        if has_review == 1:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull request and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))
        elif has_review:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull requests and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='user_edit_global_perms', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_global_perms(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'global_perms'

        c.default_user = User.get_default_user()
        defaults = c.user.get_dict()
        defaults.update(c.default_user.get_default_perms(suffix='_inherited'))
        defaults.update(c.default_user.get_default_perms())
        defaults.update(c.user.get_default_perms())

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_edit_global_perms_update', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_edit_global_perms_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'global_perms'
        try:
            # first stage that verifies the checkbox
            _form = UserIndividualPermissionsForm(self.request.translate)
            form_result = _form.to_python(dict(self.request.POST))
            inherit_perms = form_result['inherit_default_permissions']
            c.user.inherit_default_permissions = inherit_perms
            Session().add(c.user)

            if not inherit_perms:
                # only update the individual ones if we uncheck the flag
                _form = UserPermissionsForm(
                    self.request.translate,
                    [x[0] for x in c.repo_create_choices],
                    [x[0] for x in c.repo_create_on_write_choices],
                    [x[0] for x in c.repo_group_create_choices],
                    [x[0] for x in c.user_group_create_choices],
                    [x[0] for x in c.fork_choices],
                    [x[0] for x in c.inherit_default_permission_choices])()

                form_result = _form.to_python(dict(self.request.POST))
                form_result.update({'perm_user_id': c.user.user_id})

                PermissionModel().update_user_permissions(form_result)

            # TODO(marcink): implement global permissions
            # audit_log.store_web('user.edit.permissions')

            Session().commit()

            h.flash(_('User global permissions updated successfully'),
                    category='success')

        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except Exception:
            log.exception("Exception during permissions saving")
            h.flash(_('An error occurred during permissions saving'),
                    category='error')

        affected_user_ids = [user_id]
        PermissionModel().trigger_permission_flush(affected_user_ids)
        raise HTTPFound(h.route_path('user_edit_global_perms', user_id=user_id))

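# Illustrative sketch (not part of the view code above): the two-stage validation
# used in user_edit_global_perms_update() - a minimal schema reads only the
# "inherit" checkbox, and the detailed permissions schema runs only when that
# checkbox is off. The schemas and field names below are simplified stand-ins.
import formencode
from formencode import validators


class InheritFlagSchema(formencode.Schema):
    allow_extra_fields = True
    filter_extra_fields = True
    inherit_default_permissions = validators.StringBool(not_empty=True)


class IndividualPermsSchema(formencode.Schema):
    allow_extra_fields = True
    filter_extra_fields = True
    default_repo_create = validators.OneOf(
        ['hg.create.repository', 'hg.create.none'])


def validate_global_perms(post):
    stage_one = InheritFlagSchema().to_python(dict(post))
    if stage_one['inherit_default_permissions']:
        # inheriting from the default user: nothing individual to validate
        return stage_one
    # checkbox is off: the detailed permission fields must validate as well
    stage_two = IndividualPermsSchema().to_python(dict(post))
    stage_two.update(stage_one)
    return stage_two


print(validate_global_perms({'inherit_default_permissions': 'true'}))
print(validate_global_perms({'inherit_default_permissions': 'false',
                             'default_repo_create': 'hg.create.none'}))
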
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_enable_force_password_reset', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_enable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=True)

            msg = _('Force password change enabled for user')
            audit_logger.store_web('user.edit.password_reset.enabled',
                                   user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_disable_force_password_reset', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_disable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=False)

            msg = _('Force password change disabled for user')
            audit_logger.store_web(
                'user.edit.password_reset.disabled',
                user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_notice_dismiss', request_method='POST',
        renderer='json_ext', xhr=True)
    def user_notice_dismiss(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user
        user_notice_id = safe_int(self.request.POST.get('notice_id'))
        notice = UserNotice().query()\
            .filter(UserNotice.user_id == user_id)\
            .filter(UserNotice.user_notice_id == user_notice_id)\
            .scalar()
        read = False
        if notice:
            notice.notice_read = True
            Session().add(notice)
            Session().commit()
            read = True

        return {'notice': user_notice_id, 'read': read}

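# Illustrative sketch (not part of the view code above): user_notice_dismiss() is
# an XHR endpoint that reads a 'notice_id' form field and answers with JSON like
# {"notice": 5, "read": true}. The URL, the CSRF field name and the session
# handling below are assumptions for the example, not values taken from
# RhodeCode's route table.
import requests

session = requests.Session()
# ... authenticate the session as an admin user first ...
resp = session.post(
    'https://code.example.com/_admin/users/2/notice_dismiss',  # assumed path
    data={'notice_id': 5, 'csrf_token': '<token>'},             # assumed field name
    headers={'X-Requested-With': 'XMLHttpRequest'},             # view is xhr=True
)
print(resp.json())  # -> {'notice': 5, 'read': True} when the notice exists
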
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='user_create_personal_repo_group', request_method='POST',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def user_create_personal_repo_group(self):
        """
        Create personal repository group for this user
        """
        from rhodecode.model.repo_group import RepoGroupModel

        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        personal_repo_group = RepoGroup.get_user_personal_repo_group(
            c.user.user_id)
        if personal_repo_group:
            raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

        personal_repo_group_name = RepoGroupModel().get_personal_group_name(c.user)
        named_personal_group = RepoGroup.get_by_group_name(
            personal_repo_group_name)
        try:

            if named_personal_group and named_personal_group.user_id == c.user.user_id:
                # migrate the same named group, and mark it as personal
                named_personal_group.personal = True
                Session().add(named_personal_group)
                Session().commit()
                msg = _('Linked repository group `%s` as personal' % (
                    personal_repo_group_name,))
                h.flash(msg, category='success')
            elif not named_personal_group:
                RepoGroupModel().create_personal_repo_group(c.user)

                msg = _('Created repository group `%s`' % (
                    personal_repo_group_name,))
                h.flash(msg, category='success')
            else:
                msg = _('Repository group `%s` is already taken' % (
                    personal_repo_group_name,))
                h.flash(msg, category='warning')
        except Exception:
            log.exception("Exception during repository group creation")
            msg = _(
                'An error occurred during repository group creation for user')
            h.flash(msg, category='error')
            Session().rollback()

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_auth_tokens', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def auth_tokens(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'auth_tokens'

        c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
        c.role_values = [
            (x, AuthTokenModel.cls._get_role_name(x))
            for x in AuthTokenModel.cls.ROLES]
        c.role_options = [(c.role_values, _("Role"))]
        c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
            c.user.user_id, show_expired=True)
        c.role_vcs = AuthTokenModel.cls.ROLE_VCS
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_auth_tokens_view', request_method='POST',
        renderer='json_ext', xhr=True)
    def auth_tokens_view(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        auth_token_id = self.request.POST.get('auth_token_id')

        if auth_token_id:
            token = UserApiKeys.get_or_404(auth_token_id)

            return {
                'auth_token': token.api_key
            }

    def maybe_attach_token_scope(self, token):
        # implemented in EE edition
        pass

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_auth_tokens_add', request_method='POST')
    def auth_tokens_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()
        lifetime = safe_int(self.request.POST.get('lifetime'), -1)
        description = self.request.POST.get('description')
        role = self.request.POST.get('role')

        token = UserModel().add_auth_token(
            user=c.user.user_id,
            lifetime_minutes=lifetime, role=role, description=description,
            scope_callback=self.maybe_attach_token_scope)
        token_data = token.get_api_data()

        audit_logger.store_web(
            'user.edit.token.add', action_data={
                'data': {'token': token_data, 'user': user_data}},
            user=self._rhodecode_user, )
        Session().commit()

        h.flash(_("Auth token successfully created"), category='success')
        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

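# Illustrative sketch (not part of the view code above): the form fields consumed
# by auth_tokens_add(). 'lifetime' is read as minutes and falls back to -1 when
# missing or not a number; 'role' must be one of the roles offered on the
# auth-tokens page. The concrete values, the role spelling and the CSRF field
# name are assumptions for illustration.
new_token_form = {
    'lifetime': 60 * 24 * 30,          # minutes; missing/invalid -> -1 fallback
    'description': 'CI pipeline token',
    'role': 'token_role_vcs',          # assumed spelling of the VCS role constant
    'csrf_token': '<token>',           # assumed field name; @CSRFRequired() checks it
}
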
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_auth_tokens_delete', request_method='POST')
    def auth_tokens_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()

        del_auth_token = self.request.POST.get('del_auth_token')

        if del_auth_token:
            token = UserApiKeys.get_or_404(del_auth_token)
            token_data = token.get_api_data()

            AuthTokenModel().delete(del_auth_token, c.user.user_id)
            audit_logger.store_web(
                'user.edit.token.delete', action_data={
                    'data': {'token': token_data, 'user': user_data}},
                user=self._rhodecode_user,)
            Session().commit()
            h.flash(_("Auth token successfully deleted"), category='success')

        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_ssh_keys', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def ssh_keys(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'ssh_keys'
        c.default_key = self.request.GET.get('default_key')
        c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id)
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @view_config(
        route_name='edit_user_ssh_keys_generate_keypair', request_method='GET',
        renderer='rhodecode:templates/admin/users/user_edit.mako')
    def ssh_keys_generate_keypair(self):
        _ = self.request.translate
        c = self.load_default_context()

        c.user = self.db_user

        c.active = 'ssh_keys_generate'
        comment = 'RhodeCode-SSH {}'.format(c.user.email or '')
        private_format = self.request.GET.get('private_format') \
            or SshKeyModel.DEFAULT_PRIVATE_KEY_FORMAT
        c.private, c.public = SshKeyModel().generate_keypair(
            comment=comment, private_format=private_format)

        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_ssh_keys_add', request_method='POST')
    def ssh_keys_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()
        key_data = self.request.POST.get('key_data')
        description = self.request.POST.get('description')

        fingerprint = 'unknown'
        try:
            if not key_data:
                raise ValueError('Please add a valid public key')

            key = SshKeyModel().parse_key(key_data.strip())
            fingerprint = key.hash_md5()

            ssh_key = SshKeyModel().create(
                c.user.user_id, fingerprint, key.keydata, description)
            ssh_key_data = ssh_key.get_api_data()

            audit_logger.store_web(
                'user.edit.ssh_key.add', action_data={
                    'data': {'ssh_key': ssh_key_data, 'user': user_data}},
                user=self._rhodecode_user, )
            Session().commit()

            # Trigger an event on change of keys.
            trigger(SshKeyFileChangeEvent(), self.request.registry)

            h.flash(_("Ssh Key successfully created"), category='success')

        except IntegrityError:
            log.exception("Exception during ssh key saving")
            err = 'Such key with fingerprint `{}` already exists, ' \
                  'please use a different one'.format(fingerprint)
            h.flash(_('An error occurred during ssh key saving: {}').format(err),
                    category='error')
        except Exception as e:
            log.exception("Exception during ssh key saving")
            h.flash(_('An error occurred during ssh key saving: {}').format(e),
                    category='error')

        return HTTPFound(
            h.route_path('edit_user_ssh_keys', user_id=user_id))

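# Illustrative sketch (not part of the view code above): what an MD5 fingerprint
# of an OpenSSH public key generally looks like, computed with the standard
# library only. It mirrors the usual "aa:bb:cc:..." format; whether it matches
# SshKeyModel().parse_key(...).hash_md5() byte for byte is an assumption, and
# the example key below is invented.
import base64
import hashlib


def md5_fingerprint(public_key_line):
    # 'ssh-ed25519 AAAA... comment' -> hash the base64-decoded key blob
    key_blob = base64.b64decode(public_key_line.strip().split()[1])
    digest = hashlib.md5(key_blob).hexdigest()
    return ':'.join(digest[i:i + 2] for i in range(0, len(digest), 2))


example_key = (
    'ssh-ed25519 '
    'AAAAC3NzaC1lZDI1NTE5AAAAIHZCC3Zyn5o3Q9mJb2cHgB5E5n1vYh8xw1N8d6V0aF1k '
    'demo@example.com')
print(md5_fingerprint(example_key))
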
    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    @view_config(
        route_name='edit_user_ssh_keys_delete', request_method='POST')
    def ssh_keys_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()

        del_ssh_key = self.request.POST.get('del_ssh_key')

        if del_ssh_key:
            ssh_key = UserSshKeys.get_or_404(del_ssh_key)
            ssh_key_data = ssh_key.get_api_data()

            SshKeyModel().delete(del_ssh_key, c.user.user_id)
            audit_logger.store_web(
                'user.edit.ssh_key.delete', action_data={
                    'data': {'ssh_key': ssh_key_data, 'user': user_data}},
                user=self._rhodecode_user,)
            Session().commit()
            # Trigger an event on change of keys.
            trigger(SshKeyFileChangeEvent(), self.request.registry)
            h.flash(_("Ssh key successfully deleted"), category='success')

        return HTTPFound(h.route_path('edit_user_ssh_keys', user_id=user_id))

1045 @LoginRequired()
1049 @LoginRequired()
1046 @HasPermissionAllDecorator('hg.admin')
1050 @HasPermissionAllDecorator('hg.admin')
1047 @view_config(
1051 @view_config(
1048 route_name='edit_user_emails', request_method='GET',
1052 route_name='edit_user_emails', request_method='GET',
1049 renderer='rhodecode:templates/admin/users/user_edit.mako')
1053 renderer='rhodecode:templates/admin/users/user_edit.mako')
1050 def emails(self):
1054 def emails(self):
1051 _ = self.request.translate
1055 _ = self.request.translate
1052 c = self.load_default_context()
1056 c = self.load_default_context()
1053 c.user = self.db_user
1057 c.user = self.db_user
1054
1058
1055 c.active = 'emails'
1059 c.active = 'emails'
1056 c.user_email_map = UserEmailMap.query() \
1060 c.user_email_map = UserEmailMap.query() \
1057 .filter(UserEmailMap.user == c.user).all()
1061 .filter(UserEmailMap.user == c.user).all()
1058
1062
1059 return self._get_template_context(c)
1063 return self._get_template_context(c)
1060
1064
1061 @LoginRequired()
1065 @LoginRequired()
1062 @HasPermissionAllDecorator('hg.admin')
1066 @HasPermissionAllDecorator('hg.admin')
1063 @CSRFRequired()
1067 @CSRFRequired()
1064 @view_config(
1068 @view_config(
1065 route_name='edit_user_emails_add', request_method='POST')
1069 route_name='edit_user_emails_add', request_method='POST')
1066 def emails_add(self):
1070 def emails_add(self):
1067 _ = self.request.translate
1071 _ = self.request.translate
1068 c = self.load_default_context()
1072 c = self.load_default_context()
1069
1073
1070 user_id = self.db_user_id
1074 user_id = self.db_user_id
1071 c.user = self.db_user
1075 c.user = self.db_user
1072
1076
1073 email = self.request.POST.get('new_email')
1077 email = self.request.POST.get('new_email')
1074 user_data = c.user.get_api_data()
1078 user_data = c.user.get_api_data()
1075 try:
1079 try:
1076
1080
1077 form = UserExtraEmailForm(self.request.translate)()
1081 form = UserExtraEmailForm(self.request.translate)()
1078 data = form.to_python({'email': email})
1082 data = form.to_python({'email': email})
1079 email = data['email']
1083 email = data['email']
1080
1084
1081 UserModel().add_extra_email(c.user.user_id, email)
1085 UserModel().add_extra_email(c.user.user_id, email)
1082 audit_logger.store_web(
1086 audit_logger.store_web(
1083 'user.edit.email.add',
1087 'user.edit.email.add',
1084 action_data={'email': email, 'user': user_data},
1088 action_data={'email': email, 'user': user_data},
1085 user=self._rhodecode_user)
1089 user=self._rhodecode_user)
1086 Session().commit()
1090 Session().commit()
1087 h.flash(_("Added new email address `%s` for user account") % email,
1091 h.flash(_("Added new email address `%s` for user account") % email,
1088 category='success')
1092 category='success')
1089 except formencode.Invalid as error:
1093 except formencode.Invalid as error:
1090 h.flash(h.escape(error.error_dict['email']), category='error')
1094 h.flash(h.escape(error.error_dict['email']), category='error')
1091 except IntegrityError:
1095 except IntegrityError:
1092 log.warning("Email %s already exists", email)
1096 log.warning("Email %s already exists", email)
1093 h.flash(_('Email `{}` is already registered for another user.').format(email),
1097 h.flash(_('Email `{}` is already registered for another user.').format(email),
1094 category='error')
1098 category='error')
1095 except Exception:
1099 except Exception:
1096 log.exception("Exception during email saving")
1100 log.exception("Exception during email saving")
1097 h.flash(_('An error occurred during email saving'),
1101 h.flash(_('An error occurred during email saving'),
1098 category='error')
1102 category='error')
1099 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1103 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1100
1104
1101 @LoginRequired()
1105 @LoginRequired()
1102 @HasPermissionAllDecorator('hg.admin')
1106 @HasPermissionAllDecorator('hg.admin')
1103 @CSRFRequired()
1107 @CSRFRequired()
1104 @view_config(
1108 @view_config(
1105 route_name='edit_user_emails_delete', request_method='POST')
1109 route_name='edit_user_emails_delete', request_method='POST')
1106 def emails_delete(self):
1110 def emails_delete(self):
1107 _ = self.request.translate
1111 _ = self.request.translate
1108 c = self.load_default_context()
1112 c = self.load_default_context()
1109
1113
1110 user_id = self.db_user_id
1114 user_id = self.db_user_id
1111 c.user = self.db_user
1115 c.user = self.db_user
1112
1116
1113 email_id = self.request.POST.get('del_email_id')
1117 email_id = self.request.POST.get('del_email_id')
1114 user_model = UserModel()
1118 user_model = UserModel()
1115
1119
1116 email = UserEmailMap.query().get(email_id).email
1120 email = UserEmailMap.query().get(email_id).email
1117 user_data = c.user.get_api_data()
1121 user_data = c.user.get_api_data()
1118 user_model.delete_extra_email(c.user.user_id, email_id)
1122 user_model.delete_extra_email(c.user.user_id, email_id)
1119 audit_logger.store_web(
1123 audit_logger.store_web(
1120 'user.edit.email.delete',
1124 'user.edit.email.delete',
1121 action_data={'email': email, 'user': user_data},
1125 action_data={'email': email, 'user': user_data},
1122 user=self._rhodecode_user)
1126 user=self._rhodecode_user)
1123 Session().commit()
1127 Session().commit()
1124 h.flash(_("Removed email address from user account"),
1128 h.flash(_("Removed email address from user account"),
1125 category='success')
1129 category='success')
1126 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1130 raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))
1127
1131
1128 @LoginRequired()
1132 @LoginRequired()
1129 @HasPermissionAllDecorator('hg.admin')
1133 @HasPermissionAllDecorator('hg.admin')
1130 @view_config(
1134 @view_config(
1131 route_name='edit_user_ips', request_method='GET',
1135 route_name='edit_user_ips', request_method='GET',
1132 renderer='rhodecode:templates/admin/users/user_edit.mako')
1136 renderer='rhodecode:templates/admin/users/user_edit.mako')
1133 def ips(self):
1137 def ips(self):
1134 _ = self.request.translate
1138 _ = self.request.translate
1135 c = self.load_default_context()
1139 c = self.load_default_context()
1136 c.user = self.db_user
1140 c.user = self.db_user
1137
1141
1138 c.active = 'ips'
1142 c.active = 'ips'
1139 c.user_ip_map = UserIpMap.query() \
1143 c.user_ip_map = UserIpMap.query() \
1140 .filter(UserIpMap.user == c.user).all()
1144 .filter(UserIpMap.user == c.user).all()
1141
1145
1142 c.inherit_default_ips = c.user.inherit_default_permissions
1146 c.inherit_default_ips = c.user.inherit_default_permissions
1143 c.default_user_ip_map = UserIpMap.query() \
1147 c.default_user_ip_map = UserIpMap.query() \
1144 .filter(UserIpMap.user == User.get_default_user()).all()
1148 .filter(UserIpMap.user == User.get_default_user()).all()
1145
1149
1146 return self._get_template_context(c)
1150 return self._get_template_context(c)
1147
1151
1148 @LoginRequired()
1152 @LoginRequired()
1149 @HasPermissionAllDecorator('hg.admin')
1153 @HasPermissionAllDecorator('hg.admin')
1150 @CSRFRequired()
1154 @CSRFRequired()
1151 @view_config(
1155 @view_config(
1152 route_name='edit_user_ips_add', request_method='POST')
1156 route_name='edit_user_ips_add', request_method='POST')
1153 # NOTE(marcink): this view is allowed for default users, as we can
1157 # NOTE(marcink): this view is allowed for default users, as we can
1154 # edit their IP white list
1158 # edit their IP white list
1155 def ips_add(self):
1159 def ips_add(self):
1156 _ = self.request.translate
1160 _ = self.request.translate
1157 c = self.load_default_context()
1161 c = self.load_default_context()
1158
1162
1159 user_id = self.db_user_id
1163 user_id = self.db_user_id
1160 c.user = self.db_user
1164 c.user = self.db_user
1161
1165
1162 user_model = UserModel()
1166 user_model = UserModel()
1163 desc = self.request.POST.get('description')
1167 desc = self.request.POST.get('description')
1164 try:
1168 try:
1165 ip_list = user_model.parse_ip_range(
1169 ip_list = user_model.parse_ip_range(
1166 self.request.POST.get('new_ip'))
1170 self.request.POST.get('new_ip'))
1167 except Exception as e:
1171 except Exception as e:
1168 ip_list = []
1172 ip_list = []
1169 log.exception("Exception during ip saving")
1173 log.exception("Exception during ip saving")
1170 h.flash(_('An error occurred during ip saving:%s' % (e,)),
1174 h.flash(_('An error occurred during ip saving:%s' % (e,)),
1171 category='error')
1175 category='error')
1172 added = []
1176 added = []
1173 user_data = c.user.get_api_data()
1177 user_data = c.user.get_api_data()
1174 for ip in ip_list:
1178 for ip in ip_list:
1175 try:
1179 try:
1176 form = UserExtraIpForm(self.request.translate)()
1180 form = UserExtraIpForm(self.request.translate)()
1177 data = form.to_python({'ip': ip})
1181 data = form.to_python({'ip': ip})
1178 ip = data['ip']
1182 ip = data['ip']
1179
1183
1180 user_model.add_extra_ip(c.user.user_id, ip, desc)
1184 user_model.add_extra_ip(c.user.user_id, ip, desc)
1181 audit_logger.store_web(
1185 audit_logger.store_web(
1182 'user.edit.ip.add',
1186 'user.edit.ip.add',
1183 action_data={'ip': ip, 'user': user_data},
1187 action_data={'ip': ip, 'user': user_data},
1184 user=self._rhodecode_user)
1188 user=self._rhodecode_user)
1185 Session().commit()
1189 Session().commit()
1186 added.append(ip)
1190 added.append(ip)
1187 except formencode.Invalid as error:
1191 except formencode.Invalid as error:
1188 msg = error.error_dict['ip']
1192 msg = error.error_dict['ip']
1189 h.flash(msg, category='error')
1193 h.flash(msg, category='error')
1190 except Exception:
1194 except Exception:
1191 log.exception("Exception during ip saving")
1195 log.exception("Exception during ip saving")
1192 h.flash(_('An error occurred during ip saving'),
1196 h.flash(_('An error occurred during ip saving'),
1193 category='error')
1197 category='error')
1194 if added:
1198 if added:
1195 h.flash(
1199 h.flash(
1196 _("Added ips %s to user whitelist") % (', '.join(ip_list), ),
1200 _("Added ips %s to user whitelist") % (', '.join(ip_list), ),
1197 category='success')
1201 category='success')
1198 if 'default_user' in self.request.POST:
1202 if 'default_user' in self.request.POST:
1199 # case for editing global IP list we do it for 'DEFAULT' user
1203 # case for editing global IP list we do it for 'DEFAULT' user
1200 raise HTTPFound(h.route_path('admin_permissions_ips'))
1204 raise HTTPFound(h.route_path('admin_permissions_ips'))
1201 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1205 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1202
1206
1203 @LoginRequired()
1207 @LoginRequired()
1204 @HasPermissionAllDecorator('hg.admin')
1208 @HasPermissionAllDecorator('hg.admin')
1205 @CSRFRequired()
1209 @CSRFRequired()
1206 @view_config(
1210 @view_config(
1207 route_name='edit_user_ips_delete', request_method='POST')
1211 route_name='edit_user_ips_delete', request_method='POST')
1208 # NOTE(marcink): this view is allowed for default users, as we can
1212 # NOTE(marcink): this view is allowed for default users, as we can
1209 # edit their IP white list
1213 # edit their IP white list
1210 def ips_delete(self):
1214 def ips_delete(self):
1211 _ = self.request.translate
1215 _ = self.request.translate
1212 c = self.load_default_context()
1216 c = self.load_default_context()
1213
1217
1214 user_id = self.db_user_id
1218 user_id = self.db_user_id
1215 c.user = self.db_user
1219 c.user = self.db_user
1216
1220
1217 ip_id = self.request.POST.get('del_ip_id')
1221 ip_id = self.request.POST.get('del_ip_id')
1218 user_model = UserModel()
1222 user_model = UserModel()
1219 user_data = c.user.get_api_data()
1223 user_data = c.user.get_api_data()
1220 ip = UserIpMap.query().get(ip_id).ip_addr
1224 ip = UserIpMap.query().get(ip_id).ip_addr
1221 user_model.delete_extra_ip(c.user.user_id, ip_id)
1225 user_model.delete_extra_ip(c.user.user_id, ip_id)
1222 audit_logger.store_web(
1226 audit_logger.store_web(
1223 'user.edit.ip.delete', action_data={'ip': ip, 'user': user_data},
1227 'user.edit.ip.delete', action_data={'ip': ip, 'user': user_data},
1224 user=self._rhodecode_user)
1228 user=self._rhodecode_user)
1225 Session().commit()
1229 Session().commit()
1226 h.flash(_("Removed ip address from user whitelist"), category='success')
1230 h.flash(_("Removed ip address from user whitelist"), category='success')
1227
1231
1228 if 'default_user' in self.request.POST:
1232 if 'default_user' in self.request.POST:
1229 # case for editing global IP list we do it for 'DEFAULT' user
1233 # case for editing global IP list we do it for 'DEFAULT' user
1230 raise HTTPFound(h.route_path('admin_permissions_ips'))
1234 raise HTTPFound(h.route_path('admin_permissions_ips'))
1231 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1235 raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))
1232
1236
1233 @LoginRequired()
1237 @LoginRequired()
1234 @HasPermissionAllDecorator('hg.admin')
1238 @HasPermissionAllDecorator('hg.admin')
1235 @view_config(
1239 @view_config(
1236 route_name='edit_user_groups_management', request_method='GET',
1240 route_name='edit_user_groups_management', request_method='GET',
1237 renderer='rhodecode:templates/admin/users/user_edit.mako')
1241 renderer='rhodecode:templates/admin/users/user_edit.mako')
1238 def groups_management(self):
1242 def groups_management(self):
1239 c = self.load_default_context()
1243 c = self.load_default_context()
1240 c.user = self.db_user
1244 c.user = self.db_user
1241 c.data = c.user.group_member
1245 c.data = c.user.group_member
1242
1246
1243 groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
1247 groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
1244 for group in c.user.group_member]
1248 for group in c.user.group_member]
1245 c.groups = json.dumps(groups)
1249 c.groups = json.dumps(groups)
1246 c.active = 'groups'
1250 c.active = 'groups'
1247
1251
1248 return self._get_template_context(c)
1252 return self._get_template_context(c)
1249
1253
1250 @LoginRequired()
1254 @LoginRequired()
1251 @HasPermissionAllDecorator('hg.admin')
1255 @HasPermissionAllDecorator('hg.admin')
1252 @CSRFRequired()
1256 @CSRFRequired()
1253 @view_config(
1257 @view_config(
1254 route_name='edit_user_groups_management_updates', request_method='POST')
1258 route_name='edit_user_groups_management_updates', request_method='POST')
1255 def groups_management_updates(self):
1259 def groups_management_updates(self):
1256 _ = self.request.translate
1260 _ = self.request.translate
1257 c = self.load_default_context()
1261 c = self.load_default_context()
1258
1262
1259 user_id = self.db_user_id
1263 user_id = self.db_user_id
1260 c.user = self.db_user
1264 c.user = self.db_user
1261
1265
1262 user_groups = set(self.request.POST.getall('users_group_id'))
1266 user_groups = set(self.request.POST.getall('users_group_id'))
1263 user_groups_objects = []
1267 user_groups_objects = []
1264
1268
1265 for ugid in user_groups:
1269 for ugid in user_groups:
1266 user_groups_objects.append(
1270 user_groups_objects.append(
1267 UserGroupModel().get_group(safe_int(ugid)))
1271 UserGroupModel().get_group(safe_int(ugid)))
1268 user_group_model = UserGroupModel()
1272 user_group_model = UserGroupModel()
1269 added_to_groups, removed_from_groups = \
1273 added_to_groups, removed_from_groups = \
1270 user_group_model.change_groups(c.user, user_groups_objects)
1274 user_group_model.change_groups(c.user, user_groups_objects)
1271
1275
1272 user_data = c.user.get_api_data()
1276 user_data = c.user.get_api_data()
1273 for user_group_id in added_to_groups:
1277 for user_group_id in added_to_groups:
1274 user_group = UserGroup.get(user_group_id)
1278 user_group = UserGroup.get(user_group_id)
1275 old_values = user_group.get_api_data()
1279 old_values = user_group.get_api_data()
1276 audit_logger.store_web(
1280 audit_logger.store_web(
1277 'user_group.edit.member.add',
1281 'user_group.edit.member.add',
1278 action_data={'user': user_data, 'old_data': old_values},
1282 action_data={'user': user_data, 'old_data': old_values},
1279 user=self._rhodecode_user)
1283 user=self._rhodecode_user)
1280
1284
1281 for user_group_id in removed_from_groups:
1285 for user_group_id in removed_from_groups:
1282 user_group = UserGroup.get(user_group_id)
1286 user_group = UserGroup.get(user_group_id)
1283 old_values = user_group.get_api_data()
1287 old_values = user_group.get_api_data()
1284 audit_logger.store_web(
1288 audit_logger.store_web(
1285 'user_group.edit.member.delete',
1289 'user_group.edit.member.delete',
1286 action_data={'user': user_data, 'old_data': old_values},
1290 action_data={'user': user_data, 'old_data': old_values},
1287 user=self._rhodecode_user)
1291 user=self._rhodecode_user)
1288
1292
1289 Session().commit()
1293 Session().commit()
1290 c.active = 'user_groups_management'
1294 c.active = 'user_groups_management'
1291 h.flash(_("Groups successfully changed"), category='success')
1295 h.flash(_("Groups successfully changed"), category='success')
1292
1296
1293 return HTTPFound(h.route_path(
1297 return HTTPFound(h.route_path(
1294 'edit_user_groups_management', user_id=user_id))
1298 'edit_user_groups_management', user_id=user_id))
1295
1299
1296 @LoginRequired()
1300 @LoginRequired()
1297 @HasPermissionAllDecorator('hg.admin')
1301 @HasPermissionAllDecorator('hg.admin')
1298 @view_config(
1302 @view_config(
1299 route_name='edit_user_audit_logs', request_method='GET',
1303 route_name='edit_user_audit_logs', request_method='GET',
1300 renderer='rhodecode:templates/admin/users/user_edit.mako')
1304 renderer='rhodecode:templates/admin/users/user_edit.mako')
1301 def user_audit_logs(self):
1305 def user_audit_logs(self):
1302 _ = self.request.translate
1306 _ = self.request.translate
1303 c = self.load_default_context()
1307 c = self.load_default_context()
1304 c.user = self.db_user
1308 c.user = self.db_user
1305
1309
1306 c.active = 'audit'
1310 c.active = 'audit'
1307
1311
1308 p = safe_int(self.request.GET.get('page', 1), 1)
1312 p = safe_int(self.request.GET.get('page', 1), 1)
1309
1313
1310 filter_term = self.request.GET.get('filter')
1314 filter_term = self.request.GET.get('filter')
1311 user_log = UserModel().get_user_log(c.user, filter_term)
1315 user_log = UserModel().get_user_log(c.user, filter_term)
1312
1316
1313 def url_generator(page_num):
1317 def url_generator(page_num):
1314 query_params = {
1318 query_params = {
1315 'page': page_num
1319 'page': page_num
1316 }
1320 }
1317 if filter_term:
1321 if filter_term:
1318 query_params['filter'] = filter_term
1322 query_params['filter'] = filter_term
1319 return self.request.current_route_path(_query=query_params)
1323 return self.request.current_route_path(_query=query_params)
1320
1324
1321 c.audit_logs = SqlPage(
1325 c.audit_logs = SqlPage(
1322 user_log, page=p, items_per_page=10, url_maker=url_generator)
1326 user_log, page=p, items_per_page=10, url_maker=url_generator)
1323 c.filter_term = filter_term
1327 c.filter_term = filter_term
1324 return self._get_template_context(c)
1328 return self._get_template_context(c)
1325
1329
1326 @LoginRequired()
1330 @LoginRequired()
1327 @HasPermissionAllDecorator('hg.admin')
1331 @HasPermissionAllDecorator('hg.admin')
1328 @view_config(
1332 @view_config(
1329 route_name='edit_user_audit_logs_download', request_method='GET',
1333 route_name='edit_user_audit_logs_download', request_method='GET',
1330 renderer='string')
1334 renderer='string')
1331 def user_audit_logs_download(self):
1335 def user_audit_logs_download(self):
1332 _ = self.request.translate
1336 _ = self.request.translate
1333 c = self.load_default_context()
1337 c = self.load_default_context()
1334 c.user = self.db_user
1338 c.user = self.db_user
1335
1339
1336 user_log = UserModel().get_user_log(c.user, filter_term=None)
1340 user_log = UserModel().get_user_log(c.user, filter_term=None)
1337
1341
1338 audit_log_data = {}
1342 audit_log_data = {}
1339 for entry in user_log:
1343 for entry in user_log:
1340 audit_log_data[entry.user_log_id] = entry.get_dict()
1344 audit_log_data[entry.user_log_id] = entry.get_dict()
1341
1345
1342 response = Response(json.dumps(audit_log_data, indent=4))
1346 response = Response(json.dumps(audit_log_data, indent=4))
1343 response.content_disposition = str(
1347 response.content_disposition = str(
1344 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id))
1348 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id))
1345 response.content_type = 'application/json'
1349 response.content_type = 'application/json'
1346
1350
1347 return response
1351 return response
1348
1352
1349 @LoginRequired()
1353 @LoginRequired()
1350 @HasPermissionAllDecorator('hg.admin')
1354 @HasPermissionAllDecorator('hg.admin')
1351 @view_config(
1355 @view_config(
1352 route_name='edit_user_perms_summary', request_method='GET',
1356 route_name='edit_user_perms_summary', request_method='GET',
1353 renderer='rhodecode:templates/admin/users/user_edit.mako')
1357 renderer='rhodecode:templates/admin/users/user_edit.mako')
1354 def user_perms_summary(self):
1358 def user_perms_summary(self):
1355 _ = self.request.translate
1359 _ = self.request.translate
1356 c = self.load_default_context()
1360 c = self.load_default_context()
1357 c.user = self.db_user
1361 c.user = self.db_user
1358
1362
1359 c.active = 'perms_summary'
1363 c.active = 'perms_summary'
1360 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1364 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1361
1365
1362 return self._get_template_context(c)
1366 return self._get_template_context(c)
1363
1367
1364 @LoginRequired()
1368 @LoginRequired()
1365 @HasPermissionAllDecorator('hg.admin')
1369 @HasPermissionAllDecorator('hg.admin')
1366 @view_config(
1370 @view_config(
1367 route_name='edit_user_perms_summary_json', request_method='GET',
1371 route_name='edit_user_perms_summary_json', request_method='GET',
1368 renderer='json_ext')
1372 renderer='json_ext')
1369 def user_perms_summary_json(self):
1373 def user_perms_summary_json(self):
1370 self.load_default_context()
1374 self.load_default_context()
1371 perm_user = self.db_user.AuthUser(ip_addr=self.request.remote_addr)
1375 perm_user = self.db_user.AuthUser(ip_addr=self.request.remote_addr)
1372
1376
1373 return perm_user.permissions
1377 return perm_user.permissions
1374
1378
1375 @LoginRequired()
1379 @LoginRequired()
1376 @HasPermissionAllDecorator('hg.admin')
1380 @HasPermissionAllDecorator('hg.admin')
1377 @view_config(
1381 @view_config(
1378 route_name='edit_user_caches', request_method='GET',
1382 route_name='edit_user_caches', request_method='GET',
1379 renderer='rhodecode:templates/admin/users/user_edit.mako')
1383 renderer='rhodecode:templates/admin/users/user_edit.mako')
1380 def user_caches(self):
1384 def user_caches(self):
1381 _ = self.request.translate
1385 _ = self.request.translate
1382 c = self.load_default_context()
1386 c = self.load_default_context()
1383 c.user = self.db_user
1387 c.user = self.db_user
1384
1388
1385 c.active = 'caches'
1389 c.active = 'caches'
1386 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1390 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1387
1391
1388 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1392 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1389 c.region = rc_cache.get_or_create_region('cache_perms', cache_namespace_uid)
1393 c.region = rc_cache.get_or_create_region('cache_perms', cache_namespace_uid)
1390 c.backend = c.region.backend
1394 c.backend = c.region.backend
1391 c.user_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
1395 c.user_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
1392
1396
1393 return self._get_template_context(c)
1397 return self._get_template_context(c)
1394
1398
1395 @LoginRequired()
1399 @LoginRequired()
1396 @HasPermissionAllDecorator('hg.admin')
1400 @HasPermissionAllDecorator('hg.admin')
1397 @CSRFRequired()
1401 @CSRFRequired()
1398 @view_config(
1402 @view_config(
1399 route_name='edit_user_caches_update', request_method='POST')
1403 route_name='edit_user_caches_update', request_method='POST')
1400 def user_caches_update(self):
1404 def user_caches_update(self):
1401 _ = self.request.translate
1405 _ = self.request.translate
1402 c = self.load_default_context()
1406 c = self.load_default_context()
1403 c.user = self.db_user
1407 c.user = self.db_user
1404
1408
1405 c.active = 'caches'
1409 c.active = 'caches'
1406 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1410 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1407
1411
1408 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1412 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1409 del_keys = rc_cache.clear_cache_namespace('cache_perms', cache_namespace_uid)
1413 del_keys = rc_cache.clear_cache_namespace('cache_perms', cache_namespace_uid)
1410
1414
1411 h.flash(_("Deleted {} cache keys").format(del_keys), category='success')
1415 h.flash(_("Deleted {} cache keys").format(del_keys), category='success')
1412
1416
1413 return HTTPFound(h.route_path(
1417 return HTTPFound(h.route_path(
1414 'edit_user_caches', user_id=c.user.user_id))
1418 'edit_user_caches', user_id=c.user.user_id))
@@ -1,486 +1,486 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2016-2020 RhodeCode GmbH
3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import os
21 import os
22 import logging
22 import logging
23 import datetime
23 import datetime
24
24
25 from pyramid.view import view_config
25 from pyramid.view import view_config
26 from pyramid.renderers import render_to_response
26 from pyramid.renderers import render_to_response
27 from rhodecode.apps._base import BaseAppView
27 from rhodecode.apps._base import BaseAppView
28 from rhodecode.lib.celerylib import run_task, tasks
28 from rhodecode.lib.celerylib import run_task, tasks
29 from rhodecode.lib.utils2 import AttributeDict
29 from rhodecode.lib.utils2 import AttributeDict
30 from rhodecode.model.db import User
30 from rhodecode.model.db import User
31 from rhodecode.model.notification import EmailNotificationModel
31 from rhodecode.model.notification import EmailNotificationModel
32
32
33 log = logging.getLogger(__name__)
33 log = logging.getLogger(__name__)
34
34
35
35
36 class DebugStyleView(BaseAppView):
36 class DebugStyleView(BaseAppView):
37
37
38 def load_default_context(self):
38 def load_default_context(self):
39 c = self._get_local_tmpl_context()
39 c = self._get_local_tmpl_context()
40
40
41 return c
41 return c
42
42
43 @view_config(
43 @view_config(
44 route_name='debug_style_home', request_method='GET',
44 route_name='debug_style_home', request_method='GET',
45 renderer=None)
45 renderer=None)
46 def index(self):
46 def index(self):
47 c = self.load_default_context()
47 c = self.load_default_context()
48 c.active = 'index'
48 c.active = 'index'
49
49
50 return render_to_response(
50 return render_to_response(
51 'debug_style/index.html', self._get_template_context(c),
51 'debug_style/index.html', self._get_template_context(c),
52 request=self.request)
52 request=self.request)
53
53
54 @view_config(
54 @view_config(
55 route_name='debug_style_email', request_method='GET',
55 route_name='debug_style_email', request_method='GET',
56 renderer=None)
56 renderer=None)
57 @view_config(
57 @view_config(
58 route_name='debug_style_email_plain_rendered', request_method='GET',
58 route_name='debug_style_email_plain_rendered', request_method='GET',
59 renderer=None)
59 renderer=None)
60 def render_email(self):
60 def render_email(self):
61 c = self.load_default_context()
61 c = self.load_default_context()
62 email_id = self.request.matchdict['email_id']
62 email_id = self.request.matchdict['email_id']
63 c.active = 'emails'
63 c.active = 'emails'
64
64
65 pr = AttributeDict(
65 pr = AttributeDict(
66 pull_request_id=123,
66 pull_request_id=123,
67 title='digital_ocean: fix redis, elastic search start on boot, '
67 title='digital_ocean: fix redis, elastic search start on boot, '
68 'fix fd limits on supervisor, set postgres 11 version',
68 'fix fd limits on supervisor, set postgres 11 version',
69 description='''
69 description='''
70 Check if we should use full-topic or mini-topic.
70 Check if we should use full-topic or mini-topic.
71
71
72 - full topic produces some problems with merge states etc
72 - full topic produces some problems with merge states etc
73 - server-mini-topic needs probably tweeks.
73 - server-mini-topic needs probably tweeks.
74 ''',
74 ''',
75 repo_name='foobar',
75 repo_name='foobar',
76 source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'),
76 source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'),
77 target_ref_parts=AttributeDict(type='branch', name='master'),
77 target_ref_parts=AttributeDict(type='branch', name='master'),
78 )
78 )
79
79
80 target_repo = AttributeDict(repo_name='repo_group/target_repo')
80 target_repo = AttributeDict(repo_name='repo_group/target_repo')
81 source_repo = AttributeDict(repo_name='repo_group/source_repo')
81 source_repo = AttributeDict(repo_name='repo_group/source_repo')
82 user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user
82 user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user
83 # file/commit changes for PR update
83 # file/commit changes for PR update
84 commit_changes = AttributeDict({
84 commit_changes = AttributeDict({
85 'added': ['aaaaaaabbbbb', 'cccccccddddddd'],
85 'added': ['aaaaaaabbbbb', 'cccccccddddddd'],
86 'removed': ['eeeeeeeeeee'],
86 'removed': ['eeeeeeeeeee'],
87 })
87 })
88
88
89 file_changes = AttributeDict({
89 file_changes = AttributeDict({
90 'added': ['a/file1.md', 'file2.py'],
90 'added': ['a/file1.md', 'file2.py'],
91 'modified': ['b/modified_file.rst'],
91 'modified': ['b/modified_file.rst'],
92 'removed': ['.idea'],
92 'removed': ['.idea'],
93 })
93 })
94
94
95 exc_traceback = {
95 exc_traceback = {
96 'exc_utc_date': '2020-03-26T12:54:50.683281',
96 'exc_utc_date': '2020-03-26T12:54:50.683281',
97 'exc_id': 139638856342656,
97 'exc_id': 139638856342656,
98 'exc_timestamp': '1585227290.683288',
98 'exc_timestamp': '1585227290.683288',
99 'version': 'v1',
99 'version': 'v1',
100 'exc_message': 'Traceback (most recent call last):\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/tweens.py", line 41, in excview_tween\n response = handler(request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/router.py", line 148, in handle_request\n registry, request, context, context_iface, view_name\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/view.py", line 667, in _call_view\n response = view_callable(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 188, in attr_view\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 214, in predicate_wrapper\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 401, in viewresult_to_response\n result = view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 132, in _class_view\n response = getattr(inst, attr)()\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/apps/debug_style/views.py", line 355, in render_email\n template_type, **email_kwargs.get(email_id, {}))\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/model/notification.py", line 402, in render_email\n body = email_template.render(None, **_kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 95, in render\n return self._render_with_exc(tmpl, args, kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 79, in _render_with_exc\n return render_func.render(*args, **kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/template.py", line 476, in render\n return runtime._render(self, self.callable_, args, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 883, in _render\n **_kwargs_for_callable(callable_, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 920, in _render_context\n _exec_template(inherit, lclcontext, args=args, kwargs=kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 947, in _exec_template\n callable_(context, *args, **kwargs)\n File "rhodecode_templates_email_templates_base_mako", line 63, in render_body\n File "rhodecode_templates_email_templates_exception_tracker_mako", line 43, in render_body\nAttributeError: \'str\' object has no attribute \'get\'\n',
100 'exc_message': 'Traceback (most recent call last):\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/tweens.py", line 41, in excview_tween\n response = handler(request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/router.py", line 148, in handle_request\n registry, request, context, context_iface, view_name\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/view.py", line 667, in _call_view\n response = view_callable(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 188, in attr_view\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/config/views.py", line 214, in predicate_wrapper\n return view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 401, in viewresult_to_response\n result = view(context, request)\n File "/nix/store/s43k2r9rysfbzmsjdqnxgzvvb7zjhkxb-python2.7-pyramid-1.10.4/lib/python2.7/site-packages/pyramid/viewderivers.py", line 132, in _class_view\n response = getattr(inst, attr)()\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/apps/debug_style/views.py", line 355, in render_email\n template_type, **email_kwargs.get(email_id, {}))\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/model/notification.py", line 402, in render_email\n body = email_template.render(None, **_kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 95, in render\n return self._render_with_exc(tmpl, args, kwargs)\n File "/mnt/hgfs/marcink/workspace/rhodecode-enterprise-ce/rhodecode/lib/partial_renderer.py", line 79, in _render_with_exc\n return render_func.render(*args, **kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/template.py", line 476, in render\n return runtime._render(self, self.callable_, args, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 883, in _render\n **_kwargs_for_callable(callable_, data)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 920, in _render_context\n _exec_template(inherit, lclcontext, args=args, kwargs=kwargs)\n File "/nix/store/dakh34sxz4yfr435c0cwjz0sd6hnd5g3-python2.7-mako-1.1.0/lib/python2.7/site-packages/mako/runtime.py", line 947, in _exec_template\n callable_(context, *args, **kwargs)\n File "rhodecode_templates_email_templates_base_mako", line 63, in render_body\n File "rhodecode_templates_email_templates_exception_tracker_mako", line 43, in render_body\nAttributeError: \'str\' object has no attribute \'get\'\n',
101 'exc_type': 'AttributeError'
101 'exc_type': 'AttributeError'
102 }
102 }
103
103
104 email_kwargs = {
104 email_kwargs = {
105 'test': {},
105 'test': {},
106
106
107 'message': {
107 'message': {
108 'body': 'message body !'
108 'body': 'message body !'
109 },
109 },
110
110
111 'email_test': {
111 'email_test': {
112 'user': user,
112 'user': user,
113 'date': datetime.datetime.now(),
113 'date': datetime.datetime.now(),
114 },
114 },
115
115
116 'exception': {
116 'exception': {
117 'email_prefix': '[RHODECODE ERROR]',
117 'email_prefix': '[RHODECODE ERROR]',
118 'exc_id': exc_traceback['exc_id'],
118 'exc_id': exc_traceback['exc_id'],
119 'exc_url': 'http://server-url/{}'.format(exc_traceback['exc_id']),
119 'exc_url': 'http://server-url/{}'.format(exc_traceback['exc_id']),
120 'exc_type_name': 'NameError',
120 'exc_type_name': 'NameError',
121 'exc_traceback': exc_traceback,
121 'exc_traceback': exc_traceback,
122 },
122 },
123
123
124 'password_reset': {
124 'password_reset': {
125 'password_reset_url': 'http://example.com/reset-rhodecode-password/token',
125 'password_reset_url': 'http://example.com/reset-rhodecode-password/token',
126
126
127 'user': user,
127 'user': user,
128 'date': datetime.datetime.now(),
128 'date': datetime.datetime.now(),
129 'email': 'test@rhodecode.com',
129 'email': 'test@rhodecode.com',
130 'first_admin_email': User.get_first_super_admin().email
130 'first_admin_email': User.get_first_super_admin().email
131 },
131 },
132
132
133 'password_reset_confirmation': {
133 'password_reset_confirmation': {
134 'new_password': 'new-password-example',
134 'new_password': 'new-password-example',
135 'user': user,
135 'user': user,
136 'date': datetime.datetime.now(),
136 'date': datetime.datetime.now(),
137 'email': 'test@rhodecode.com',
137 'email': 'test@rhodecode.com',
138 'first_admin_email': User.get_first_super_admin().email
138 'first_admin_email': User.get_first_super_admin().email
139 },
139 },
140
140
141 'registration': {
141 'registration': {
142 'user': user,
142 'user': user,
143 'date': datetime.datetime.now(),
143 'date': datetime.datetime.now(),
144 },
144 },
145
145
146 'pull_request_comment': {
146 'pull_request_comment': {
147 'user': user,
147 'user': user,
148
148
149 'status_change': None,
149 'status_change': None,
150 'status_change_type': None,
150 'status_change_type': None,
151
151
152 'pull_request': pr,
152 'pull_request': pr,
153 'pull_request_commits': [],
153 'pull_request_commits': [],
154
154
155 'pull_request_target_repo': target_repo,
155 'pull_request_target_repo': target_repo,
156 'pull_request_target_repo_url': 'http://target-repo/url',
156 'pull_request_target_repo_url': 'http://target-repo/url',
157
157
158 'pull_request_source_repo': source_repo,
158 'pull_request_source_repo': source_repo,
159 'pull_request_source_repo_url': 'http://source-repo/url',
159 'pull_request_source_repo_url': 'http://source-repo/url',
160
160
161 'pull_request_url': 'http://localhost/pr1',
161 'pull_request_url': 'http://localhost/pr1',
162 'pr_comment_url': 'http://comment-url',
162 'pr_comment_url': 'http://comment-url',
163 'pr_comment_reply_url': 'http://comment-url#reply',
163 'pr_comment_reply_url': 'http://comment-url#reply',
164
164
165 'comment_file': None,
165 'comment_file': None,
166 'comment_line': None,
166 'comment_line': None,
167 'comment_type': 'note',
167 'comment_type': 'note',
168 'comment_body': 'This is my comment body. *I like !*',
168 'comment_body': 'This is my comment body. *I like !*',
169 'comment_id': 2048,
169 'comment_id': 2048,
170 'renderer_type': 'markdown',
170 'renderer_type': 'markdown',
171 'mention': True,
171 'mention': True,
172
172
173 },
173 },
174
174
175 'pull_request_comment+status': {
175 'pull_request_comment+status': {
176 'user': user,
176 'user': user,
177
177
178 'status_change': 'approved',
178 'status_change': 'approved',
179 'status_change_type': 'approved',
179 'status_change_type': 'approved',
180
180
181 'pull_request': pr,
181 'pull_request': pr,
182 'pull_request_commits': [],
182 'pull_request_commits': [],
183
183
184 'pull_request_target_repo': target_repo,
184 'pull_request_target_repo': target_repo,
185 'pull_request_target_repo_url': 'http://target-repo/url',
185 'pull_request_target_repo_url': 'http://target-repo/url',
186
186
187 'pull_request_source_repo': source_repo,
187 'pull_request_source_repo': source_repo,
188 'pull_request_source_repo_url': 'http://source-repo/url',
188 'pull_request_source_repo_url': 'http://source-repo/url',
189
189
190 'pull_request_url': 'http://localhost/pr1',
190 'pull_request_url': 'http://localhost/pr1',
191 'pr_comment_url': 'http://comment-url',
191 'pr_comment_url': 'http://comment-url',
192 'pr_comment_reply_url': 'http://comment-url#reply',
192 'pr_comment_reply_url': 'http://comment-url#reply',
193
193
194 'comment_type': 'todo',
194 'comment_type': 'todo',
195 'comment_file': None,
195 'comment_file': None,
196 'comment_line': None,
196 'comment_line': None,
197 'comment_body': '''
197 'comment_body': '''
198 I think something like this would be better
198 I think something like this would be better
199
199
200 ```py
200 ```py
201 // markdown renderer
201 // markdown renderer
202
202
203 def db():
203 def db():
204 global connection
204 global connection
205 return connection
205 return connection
206
206
207 ```
207 ```
208
208
209 ''',
209 ''',
210 'comment_id': 2048,
210 'comment_id': 2048,
211 'renderer_type': 'markdown',
211 'renderer_type': 'markdown',
212 'mention': True,
212 'mention': True,
213
213
214 },
214 },
215
215
216 'pull_request_comment+file': {
216 'pull_request_comment+file': {
217 'user': user,
217 'user': user,
218
218
219 'status_change': None,
219 'status_change': None,
220 'status_change_type': None,
220 'status_change_type': None,
221
221
222 'pull_request': pr,
222 'pull_request': pr,
223 'pull_request_commits': [],
223 'pull_request_commits': [],
224
224
225 'pull_request_target_repo': target_repo,
225 'pull_request_target_repo': target_repo,
226 'pull_request_target_repo_url': 'http://target-repo/url',
226 'pull_request_target_repo_url': 'http://target-repo/url',
227
227
228 'pull_request_source_repo': source_repo,
228 'pull_request_source_repo': source_repo,
229 'pull_request_source_repo_url': 'http://source-repo/url',
229 'pull_request_source_repo_url': 'http://source-repo/url',
230
230
231 'pull_request_url': 'http://localhost/pr1',
231 'pull_request_url': 'http://localhost/pr1',
232
232
233 'pr_comment_url': 'http://comment-url',
233 'pr_comment_url': 'http://comment-url',
234 'pr_comment_reply_url': 'http://comment-url#reply',
234 'pr_comment_reply_url': 'http://comment-url#reply',
235
235
236 'comment_file': 'rhodecode/model/get_flow_commits',
236 'comment_file': 'rhodecode/model/get_flow_commits',
237 'comment_line': 'o1210',
237 'comment_line': 'o1210',
238 'comment_type': 'todo',
238 'comment_type': 'todo',
239 'comment_body': '''
239 'comment_body': '''
240 I like this !
240 I like this !
241
241
242 But please check this code
242 But please check this code
243
243
244 .. code-block:: javascript
244 .. code-block:: javascript
245
245
246 // THIS IS RST CODE
246 // THIS IS RST CODE
247
247
248 this.createResolutionComment = function(commentId) {
248 this.createResolutionComment = function(commentId) {
249 // hide the trigger text
249 // hide the trigger text
250 $('#resolve-comment-{0}'.format(commentId)).hide();
250 $('#resolve-comment-{0}'.format(commentId)).hide();
251
251
252 var comment = $('#comment-'+commentId);
252 var comment = $('#comment-'+commentId);
253 var commentData = comment.data();
253 var commentData = comment.data();
254 if (commentData.commentInline) {
254 if (commentData.commentInline) {
255 this.createComment(comment, commentId)
255 this.createComment(comment, f_path, line_no, commentId)
256 } else {
256 } else {
257 Rhodecode.comments.createGeneralComment('general', "$placeholder", commentId)
257 Rhodecode.comments.createGeneralComment('general', "$placeholder", commentId)
258 }
258 }
259
259
260 return false;
260 return false;
261 };
261 };
262
262
263 This should work better !
263 This should work better !
264 ''',
264 ''',
265 'comment_id': 2048,
265 'comment_id': 2048,
266 'renderer_type': 'rst',
266 'renderer_type': 'rst',
267 'mention': True,
267 'mention': True,
268
268
269 },
269 },
270
270
271 'pull_request_update': {
271 'pull_request_update': {
272 'updating_user': user,
272 'updating_user': user,
273
273
274 'status_change': None,
274 'status_change': None,
275 'status_change_type': None,
275 'status_change_type': None,
276
276
277 'pull_request': pr,
277 'pull_request': pr,
278 'pull_request_commits': [],
278 'pull_request_commits': [],
279
279
280 'pull_request_target_repo': target_repo,
280 'pull_request_target_repo': target_repo,
281 'pull_request_target_repo_url': 'http://target-repo/url',
281 'pull_request_target_repo_url': 'http://target-repo/url',
282
282
283 'pull_request_source_repo': source_repo,
283 'pull_request_source_repo': source_repo,
284 'pull_request_source_repo_url': 'http://source-repo/url',
284 'pull_request_source_repo_url': 'http://source-repo/url',
285
285
286 'pull_request_url': 'http://localhost/pr1',
286 'pull_request_url': 'http://localhost/pr1',
287
287
288 # update comment links
288 # update comment links
289 'pr_comment_url': 'http://comment-url',
289 'pr_comment_url': 'http://comment-url',
290 'pr_comment_reply_url': 'http://comment-url#reply',
290 'pr_comment_reply_url': 'http://comment-url#reply',
291 'ancestor_commit_id': 'f39bd443',
291 'ancestor_commit_id': 'f39bd443',
292 'added_commits': commit_changes.added,
292 'added_commits': commit_changes.added,
293 'removed_commits': commit_changes.removed,
293 'removed_commits': commit_changes.removed,
294 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed),
294 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed),
295 'added_files': file_changes.added,
295 'added_files': file_changes.added,
296 'modified_files': file_changes.modified,
296 'modified_files': file_changes.modified,
297 'removed_files': file_changes.removed,
297 'removed_files': file_changes.removed,
298 },
298 },
299
299
300 'cs_comment': {
300 'cs_comment': {
301 'user': user,
301 'user': user,
302 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'),
302 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'),
303 'status_change': None,
303 'status_change': None,
304 'status_change_type': None,
304 'status_change_type': None,
305
305
306 'commit_target_repo_url': 'http://foo.example.com/#comment1',
306 'commit_target_repo_url': 'http://foo.example.com/#comment1',
307 'repo_name': 'test-repo',
307 'repo_name': 'test-repo',
308 'comment_type': 'note',
308 'comment_type': 'note',
309 'comment_file': None,
309 'comment_file': None,
310 'comment_line': None,
310 'comment_line': None,
311 'commit_comment_url': 'http://comment-url',
311 'commit_comment_url': 'http://comment-url',
312 'commit_comment_reply_url': 'http://comment-url#reply',
312 'commit_comment_reply_url': 'http://comment-url#reply',
313 'comment_body': 'This is my comment body. *I like !*',
313 'comment_body': 'This is my comment body. *I like !*',
314 'comment_id': 2048,
314 'comment_id': 2048,
315 'renderer_type': 'markdown',
315 'renderer_type': 'markdown',
316 'mention': True,
316 'mention': True,
317 },
317 },
318
318
319 'cs_comment+status': {
319 'cs_comment+status': {
320 'user': user,
320 'user': user,
321 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
321 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
322 'status_change': 'approved',
322 'status_change': 'approved',
323 'status_change_type': 'approved',
323 'status_change_type': 'approved',
324
324
325 'commit_target_repo_url': 'http://foo.example.com/#comment1',
325 'commit_target_repo_url': 'http://foo.example.com/#comment1',
326 'repo_name': 'test-repo',
326 'repo_name': 'test-repo',
327 'comment_type': 'note',
327 'comment_type': 'note',
328 'comment_file': None,
328 'comment_file': None,
329 'comment_line': None,
329 'comment_line': None,
330 'commit_comment_url': 'http://comment-url',
330 'commit_comment_url': 'http://comment-url',
331 'commit_comment_reply_url': 'http://comment-url#reply',
331 'commit_comment_reply_url': 'http://comment-url#reply',
332 'comment_body': '''
332 'comment_body': '''
333 Hello **world**
333 Hello **world**
334
334
335 This is a multiline comment :)
335 This is a multiline comment :)
336
336
337 - list
337 - list
338 - list2
338 - list2
339 ''',
339 ''',
340 'comment_id': 2048,
340 'comment_id': 2048,
341 'renderer_type': 'markdown',
341 'renderer_type': 'markdown',
342 'mention': True,
342 'mention': True,
343 },
343 },
344
344
345 'cs_comment+file': {
345 'cs_comment+file': {
346 'user': user,
346 'user': user,
347 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
347 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'),
348 'status_change': None,
348 'status_change': None,
349 'status_change_type': None,
349 'status_change_type': None,
350
350
351 'commit_target_repo_url': 'http://foo.example.com/#comment1',
351 'commit_target_repo_url': 'http://foo.example.com/#comment1',
352 'repo_name': 'test-repo',
352 'repo_name': 'test-repo',
353
353
354 'comment_type': 'note',
354 'comment_type': 'note',
355 'comment_file': 'test-file.py',
355 'comment_file': 'test-file.py',
356 'comment_line': 'n100',
356 'comment_line': 'n100',
357
357
358 'commit_comment_url': 'http://comment-url',
358 'commit_comment_url': 'http://comment-url',
359 'commit_comment_reply_url': 'http://comment-url#reply',
359 'commit_comment_reply_url': 'http://comment-url#reply',
                'comment_body': 'This is my comment body. *I like !*',
                'comment_id': 2048,
                'renderer_type': 'markdown',
                'mention': True,
            },

            'pull_request': {
                'user': user,
                'pull_request': pr,
                'pull_request_commits': [
                    ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
my-account: moved email closer to profile as it's similar data just moved outside.
                    '''),
                    ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
users: description edit fixes

- tests
- added metatags info
                    '''),
                ],

                'pull_request_target_repo': target_repo,
                'pull_request_target_repo_url': 'http://target-repo/url',

                'pull_request_source_repo': source_repo,
                'pull_request_source_repo_url': 'http://source-repo/url',

                'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
                'user_role': 'reviewer',
            },

            'pull_request+reviewer_role': {
                'user': user,
                'pull_request': pr,
                'pull_request_commits': [
                    ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
my-account: moved email closer to profile as it's similar data just moved outside.
                    '''),
                    ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
users: description edit fixes

- tests
- added metatags info
                    '''),
                ],

                'pull_request_target_repo': target_repo,
                'pull_request_target_repo_url': 'http://target-repo/url',

                'pull_request_source_repo': source_repo,
                'pull_request_source_repo_url': 'http://source-repo/url',

                'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
                'user_role': 'reviewer',
            },

            'pull_request+observer_role': {
                'user': user,
                'pull_request': pr,
                'pull_request_commits': [
                    ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\
my-account: moved email closer to profile as it's similar data just moved outside.
                    '''),
                    ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\
users: description edit fixes

- tests
- added metatags info
                    '''),
                ],

                'pull_request_target_repo': target_repo,
                'pull_request_target_repo_url': 'http://target-repo/url',

                'pull_request_source_repo': source_repo,
                'pull_request_source_repo_url': 'http://source-repo/url',

                'pull_request_url': 'http://code.rhodecode.com/_pull-request/123',
                'user_role': 'observer'
            }
        }

        template_type = email_id.split('+')[0]
        (c.subject, c.email_body, c.email_body_plaintext) = EmailNotificationModel().render_email(
            template_type, **email_kwargs.get(email_id, {}))

        test_email = self.request.GET.get('email')
        if test_email:
            recipients = [test_email]
            run_task(tasks.send_email, recipients, c.subject,
                     c.email_body_plaintext, c.email_body)

        if self.request.matched_route.name == 'debug_style_email_plain_rendered':
            template = 'debug_style/email_plain_rendered.mako'
        else:
            template = 'debug_style/email.mako'
        return render_to_response(
            template, self._get_template_context(c),
            request=self.request)

    @view_config(
        route_name='debug_style_template', request_method='GET',
        renderer=None)
    def template(self):
        t_path = self.request.matchdict['t_path']
        c = self.load_default_context()
        c.active = os.path.splitext(t_path)[0]
        c.came_from = ''
        # NOTE(marcink): extend the email types with variations based on data sets
        c.email_types = {
            'cs_comment+file': {},
            'cs_comment+status': {},

            'pull_request_comment+file': {},
            'pull_request_comment+status': {},

            'pull_request_update': {},

            'pull_request+reviewer_role': {},
            'pull_request+observer_role': {},
        }
        c.email_types.update(EmailNotificationModel.email_types)

        return render_to_response(
            'debug_style/' + t_path, self._get_template_context(c),
            request=self.request)

@@ -1,822 +1,826 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import datetime
import string

import formencode
import formencode.htmlfill
import peppercorn
from pyramid.httpexceptions import HTTPFound, HTTPNotFound
from pyramid.view import view_config

from rhodecode.apps._base import BaseAppView, DataGridAppView
from rhodecode import forms
from rhodecode.lib import helpers as h
from rhodecode.lib import audit_logger
from rhodecode.lib.ext_json import json
from rhodecode.lib.auth import (
    LoginRequired, NotAnonymous, CSRFRequired,
    HasRepoPermissionAny, HasRepoGroupPermissionAny, AuthUser)
from rhodecode.lib.channelstream import (
    channelstream_request, ChannelstreamException)
from rhodecode.lib.utils2 import safe_int, md5, str2bool
from rhodecode.model.auth_token import AuthTokenModel
from rhodecode.model.comment import CommentsModel
from rhodecode.model.db import (
    IntegrityError, or_, in_filter_generator,
    Repository, UserEmailMap, UserApiKeys, UserFollowing,
    PullRequest, UserBookmark, RepoGroup)
from rhodecode.model.meta import Session
from rhodecode.model.pull_request import PullRequestModel
from rhodecode.model.user import UserModel
from rhodecode.model.user_group import UserGroupModel
from rhodecode.model.validation_schema.schemas import user_schema

log = logging.getLogger(__name__)

class MyAccountView(BaseAppView, DataGridAppView):
    ALLOW_SCOPED_TOKENS = False
    """
    This view has alternative version inside EE, if modified please take a look
    in there as well.
    """

    def load_default_context(self):
        c = self._get_local_tmpl_context()
        c.user = c.auth_user.get_instance()
        c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS

        return c

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_profile', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_profile(self):
        c = self.load_default_context()
        c.active = 'profile'
        c.extern_type = c.user.extern_type
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_password', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_password(self):
        c = self.load_default_context()
        c.active = 'password'
        c.extern_type = c.user.extern_type

        schema = user_schema.ChangePasswordSchema().bind(
            username=c.user.username)

        form = forms.Form(
            schema,
            action=h.route_path('my_account_password_update'),
            buttons=(forms.buttons.save, forms.buttons.reset))

        c.form = form
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_password_update', request_method='POST',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_password_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'password'
        c.extern_type = c.user.extern_type

        schema = user_schema.ChangePasswordSchema().bind(
            username=c.user.username)

        form = forms.Form(
            schema, buttons=(forms.buttons.save, forms.buttons.reset))

        if c.extern_type != 'rhodecode':
            raise HTTPFound(self.request.route_path('my_account_password'))

        controls = self.request.POST.items()
        try:
            valid_data = form.validate(controls)
            UserModel().update_user(c.user.user_id, **valid_data)
            c.user.update_userdata(force_password_change=False)
            Session().commit()
        except forms.ValidationFailure as e:
            c.form = e
            return self._get_template_context(c)

        except Exception:
            log.exception("Exception updating password")
            h.flash(_('Error occurred during update of user password'),
                    category='error')
        else:
            instance = c.auth_user.get_instance()
            self.session.setdefault('rhodecode_user', {}).update(
                {'password': md5(instance.password)})
            self.session.save()
            h.flash(_("Successfully updated password"), category='success')

        raise HTTPFound(self.request.route_path('my_account_password'))

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_auth_tokens', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_auth_tokens(self):
        _ = self.request.translate

        c = self.load_default_context()
        c.active = 'auth_tokens'
        c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
        c.role_values = [
            (x, AuthTokenModel.cls._get_role_name(x))
            for x in AuthTokenModel.cls.ROLES]
        c.role_options = [(c.role_values, _("Role"))]
        c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
            c.user.user_id, show_expired=True)
        c.role_vcs = AuthTokenModel.cls.ROLE_VCS
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_auth_tokens_view', request_method='POST', xhr=True,
        renderer='json_ext')
    def my_account_auth_tokens_view(self):
        _ = self.request.translate
        c = self.load_default_context()

        auth_token_id = self.request.POST.get('auth_token_id')

        if auth_token_id:
            token = UserApiKeys.get_or_404(auth_token_id)
            if token.user.user_id != c.user.user_id:
                raise HTTPNotFound()

            return {
                'auth_token': token.api_key
            }

    def maybe_attach_token_scope(self, token):
        # implemented in EE edition
        pass

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_auth_tokens_add', request_method='POST',)
    def my_account_auth_tokens_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        lifetime = safe_int(self.request.POST.get('lifetime'), -1)
        description = self.request.POST.get('description')
        role = self.request.POST.get('role')

        token = UserModel().add_auth_token(
            user=c.user.user_id,
            lifetime_minutes=lifetime, role=role, description=description,
            scope_callback=self.maybe_attach_token_scope)
        token_data = token.get_api_data()

        audit_logger.store_web(
            'user.edit.token.add', action_data={
                'data': {'token': token_data, 'user': 'self'}},
            user=self._rhodecode_user, )
        Session().commit()

        h.flash(_("Auth token successfully created"), category='success')
        return HTTPFound(h.route_path('my_account_auth_tokens'))

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_auth_tokens_delete', request_method='POST')
    def my_account_auth_tokens_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        del_auth_token = self.request.POST.get('del_auth_token')

        if del_auth_token:
            token = UserApiKeys.get_or_404(del_auth_token)
            token_data = token.get_api_data()

            AuthTokenModel().delete(del_auth_token, c.user.user_id)
            audit_logger.store_web(
                'user.edit.token.delete', action_data={
                    'data': {'token': token_data, 'user': 'self'}},
                user=self._rhodecode_user,)
            Session().commit()
            h.flash(_("Auth token successfully deleted"), category='success')

        return HTTPFound(h.route_path('my_account_auth_tokens'))

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_emails', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_emails(self):
        _ = self.request.translate

        c = self.load_default_context()
        c.active = 'emails'

        c.user_email_map = UserEmailMap.query()\
            .filter(UserEmailMap.user == c.user).all()

        schema = user_schema.AddEmailSchema().bind(
            username=c.user.username, user_emails=c.user.emails)

        form = forms.RcForm(schema,
                            action=h.route_path('my_account_emails_add'),
                            buttons=(forms.buttons.save, forms.buttons.reset))

        c.form = form
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_emails_add', request_method='POST',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_emails_add(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'emails'

        schema = user_schema.AddEmailSchema().bind(
            username=c.user.username, user_emails=c.user.emails)

        form = forms.RcForm(
            schema, action=h.route_path('my_account_emails_add'),
            buttons=(forms.buttons.save, forms.buttons.reset))

        controls = self.request.POST.items()
        try:
            valid_data = form.validate(controls)
            UserModel().add_extra_email(c.user.user_id, valid_data['email'])
            audit_logger.store_web(
                'user.edit.email.add', action_data={
                    'data': {'email': valid_data['email'], 'user': 'self'}},
                user=self._rhodecode_user,)
            Session().commit()
        except formencode.Invalid as error:
            h.flash(h.escape(error.error_dict['email']), category='error')
        except forms.ValidationFailure as e:
            c.user_email_map = UserEmailMap.query() \
                .filter(UserEmailMap.user == c.user).all()
            c.form = e
            return self._get_template_context(c)
        except Exception:
            log.exception("Exception adding email")
            h.flash(_('Error occurred during adding email'),
                    category='error')
        else:
            h.flash(_("Successfully added email"), category='success')

        raise HTTPFound(self.request.route_path('my_account_emails'))

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_emails_delete', request_method='POST')
    def my_account_emails_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        del_email_id = self.request.POST.get('del_email_id')
        if del_email_id:
            email = UserEmailMap.get_or_404(del_email_id).email
            UserModel().delete_extra_email(c.user.user_id, del_email_id)
            audit_logger.store_web(
                'user.edit.email.delete', action_data={
                    'data': {'email': email, 'user': 'self'}},
                user=self._rhodecode_user,)
            Session().commit()
            h.flash(_("Email successfully deleted"),
                    category='success')
        return HTTPFound(h.route_path('my_account_emails'))

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_notifications_test_channelstream',
        request_method='POST', renderer='json_ext')
    def my_account_notifications_test_channelstream(self):
        message = 'Test message sent via Channelstream by user: {}, on {}'.format(
            self._rhodecode_user.username, datetime.datetime.now())
        payload = {
            # 'channel': 'broadcast',
            'type': 'message',
            'timestamp': datetime.datetime.utcnow(),
            'user': 'system',
            'pm_users': [self._rhodecode_user.username],
            'message': {
                'message': message,
                'level': 'info',
                'topic': '/notifications'
            }
        }

        registry = self.request.registry
        rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {})
        channelstream_config = rhodecode_plugins.get('channelstream', {})

        try:
            channelstream_request(channelstream_config, [payload], '/message')
        except ChannelstreamException as e:
            log.exception('Failed to send channelstream data')
            return {"response": 'ERROR: {}'.format(e.__class__.__name__)}
        return {"response": 'Channelstream data sent. '
                            'You should see a new live message now.'}

    def _load_my_repos_data(self, watched=False):

        allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(AuthUser.repo_read_perms)

        if watched:
            # repos user watch
            repo_list = Session().query(
                Repository
            ) \
            .join(
                (UserFollowing, UserFollowing.follows_repo_id == Repository.repo_id)
            ) \
            .filter(
                UserFollowing.user_id == self._rhodecode_user.user_id
            ) \
            .filter(or_(
                # generate multiple IN to fix limitation problems
                *in_filter_generator(Repository.repo_id, allowed_ids))
            ) \
            .order_by(Repository.repo_name) \
            .all()

        else:
            # repos user is owner of
            repo_list = Session().query(
                Repository
            ) \
            .filter(
                Repository.user_id == self._rhodecode_user.user_id
            ) \
            .filter(or_(
                # generate multiple IN to fix limitation problems
                *in_filter_generator(Repository.repo_id, allowed_ids))
            ) \
            .order_by(Repository.repo_name) \
            .all()

        _render = self.request.get_partial_renderer(
            'rhodecode:templates/data_table/_dt_elements.mako')

        def repo_lnk(name, rtype, rstate, private, archived, fork_of):
            return _render('repo_name', name, rtype, rstate, private, archived, fork_of,
                           short_name=False, admin=False)

        repos_data = []
        for repo in repo_list:
            row = {
                "name": repo_lnk(repo.repo_name, repo.repo_type, repo.repo_state,
                                 repo.private, repo.archived, repo.fork),
                "name_raw": repo.repo_name.lower(),
            }

            repos_data.append(row)

        # json used to render the grid
        return json.dumps(repos_data)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_repos', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_repos(self):
        c = self.load_default_context()
        c.active = 'repos'

        # json used to render the grid
        c.data = self._load_my_repos_data()
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_watched', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_watched(self):
        c = self.load_default_context()
        c.active = 'watched'

        # json used to render the grid
        c.data = self._load_my_repos_data(watched=True)
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_bookmarks', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_bookmarks(self):
        c = self.load_default_context()
        c.active = 'bookmarks'
        c.bookmark_items = UserBookmark.get_bookmarks_for_user(
            self._rhodecode_db_user.user_id, cache=False)
        return self._get_template_context(c)

    def _process_bookmark_entry(self, entry, user_id):
        position = safe_int(entry.get('position'))
        cur_position = safe_int(entry.get('cur_position'))
        if position is None:
            return

        # check if this is an existing entry
        is_new = False
        db_entry = UserBookmark().get_by_position_for_user(cur_position, user_id)

        if db_entry and str2bool(entry.get('remove')):
            log.debug('Marked bookmark %s for deletion', db_entry)
            Session().delete(db_entry)
            return

        if not db_entry:
            # new
            db_entry = UserBookmark()
            is_new = True

        should_save = False
        default_redirect_url = ''

        # save repo
        if entry.get('bookmark_repo') and safe_int(entry.get('bookmark_repo')):
            repo = Repository.get(entry['bookmark_repo'])
            perm_check = HasRepoPermissionAny(
                'repository.read', 'repository.write', 'repository.admin')
            if repo and perm_check(repo_name=repo.repo_name):
                db_entry.repository = repo
                should_save = True
                default_redirect_url = '${repo_url}'
        # save repo group
        elif entry.get('bookmark_repo_group') and safe_int(entry.get('bookmark_repo_group')):
            repo_group = RepoGroup.get(entry['bookmark_repo_group'])
            perm_check = HasRepoGroupPermissionAny(
                'group.read', 'group.write', 'group.admin')

            if repo_group and perm_check(group_name=repo_group.group_name):
                db_entry.repository_group = repo_group
                should_save = True
                default_redirect_url = '${repo_group_url}'
        # save generic info
        elif entry.get('title') and entry.get('redirect_url'):
            should_save = True

        if should_save:
            # mark user and position
            db_entry.user_id = user_id
            db_entry.position = position
            db_entry.title = entry.get('title')
            db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url
            log.debug('Saving bookmark %s, new:%s', db_entry, is_new)

            Session().add(db_entry)

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_bookmarks_update', request_method='POST')
    def my_account_bookmarks_update(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.active = 'bookmarks'

        controls = peppercorn.parse(self.request.POST.items())
        user_id = c.user.user_id

        # validate positions
        positions = {}
        for entry in controls.get('bookmarks', []):
            position = safe_int(entry['position'])
            if position is None:
                continue

            if position in positions:
                h.flash(_("Position {} is defined twice. "
                          "Please correct this error.").format(position), category='error')
                return HTTPFound(h.route_path('my_account_bookmarks'))

            entry['position'] = position
            entry['cur_position'] = safe_int(entry.get('cur_position'))
            positions[position] = entry

        try:
            for entry in positions.values():
                self._process_bookmark_entry(entry, user_id)

            Session().commit()
            h.flash(_("Update Bookmarks"), category='success')
        except IntegrityError:
            h.flash(_("Failed to update bookmarks. "
                      "Make sure an unique position is used."), category='error')

        return HTTPFound(h.route_path('my_account_bookmarks'))

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_goto_bookmark', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_goto_bookmark(self):

        bookmark_id = self.request.matchdict['bookmark_id']
        user_bookmark = UserBookmark().query()\
            .filter(UserBookmark.user_id == self.request.user.user_id) \
            .filter(UserBookmark.position == bookmark_id).scalar()

        redirect_url = h.route_path('my_account_bookmarks')
        if not user_bookmark:
            raise HTTPFound(redirect_url)

        # repository set
        if user_bookmark.repository:
            repo_name = user_bookmark.repository.repo_name
            base_redirect_url = h.route_path(
                'repo_summary', repo_name=repo_name)
            if user_bookmark.redirect_url and \
                    '${repo_url}' in user_bookmark.redirect_url:
                redirect_url = string.Template(user_bookmark.redirect_url)\
                    .safe_substitute({'repo_url': base_redirect_url})
            else:
                redirect_url = base_redirect_url
        # repository group set
        elif user_bookmark.repository_group:
            repo_group_name = user_bookmark.repository_group.group_name
            base_redirect_url = h.route_path(
                'repo_group_home', repo_group_name=repo_group_name)
            if user_bookmark.redirect_url and \
                    '${repo_group_url}' in user_bookmark.redirect_url:
                redirect_url = string.Template(user_bookmark.redirect_url)\
                    .safe_substitute({'repo_group_url': base_redirect_url})
            else:
                redirect_url = base_redirect_url
        # custom URL set
        elif user_bookmark.redirect_url:
            server_url = h.route_url('home').rstrip('/')
            redirect_url = string.Template(user_bookmark.redirect_url) \
                .safe_substitute({'server_url': server_url})

        log.debug('Redirecting bookmark %s to %s', user_bookmark, redirect_url)
        raise HTTPFound(redirect_url)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_perms', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_perms(self):
        c = self.load_default_context()
        c.active = 'perms'

        c.perm_user = c.auth_user
        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_notifications', request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_notifications(self):
        c = self.load_default_context()
        c.active = 'notifications'

        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @CSRFRequired()
    @view_config(
        route_name='my_account_notifications_toggle_visibility',
        request_method='POST', renderer='json_ext')
    def my_notifications_toggle_visibility(self):
        user = self._rhodecode_db_user
        new_status = not user.user_data.get('notification_status', True)
        user.update_userdata(notification_status=new_status)
        Session().commit()
        return user.user_data['notification_status']

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_edit',
        request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_edit(self):
        c = self.load_default_context()
        c.active = 'profile_edit'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name

        schema = user_schema.UserProfileSchema().bind(
            username=c.user.username, user_emails=c.user.emails)
        appstruct = {
            'username': c.user.username,
            'email': c.user.email,
            'firstname': c.user.firstname,
            'lastname': c.user.lastname,
            'description': c.user.description,
        }
        c.form = forms.RcForm(
            schema, appstruct=appstruct,
            action=h.route_path('my_account_update'),
            buttons=(forms.buttons.save, forms.buttons.reset))

        return self._get_template_context(c)

671 @LoginRequired()
671 @LoginRequired()
672 @NotAnonymous()
672 @NotAnonymous()
673 @CSRFRequired()
673 @CSRFRequired()
674 @view_config(
674 @view_config(
675 route_name='my_account_update',
675 route_name='my_account_update',
676 request_method='POST',
676 request_method='POST',
677 renderer='rhodecode:templates/admin/my_account/my_account.mako')
677 renderer='rhodecode:templates/admin/my_account/my_account.mako')
678 def my_account_update(self):
678 def my_account_update(self):
679 _ = self.request.translate
679 _ = self.request.translate
680 c = self.load_default_context()
680 c = self.load_default_context()
681 c.active = 'profile_edit'
681 c.active = 'profile_edit'
682 c.perm_user = c.auth_user
682 c.perm_user = c.auth_user
683 c.extern_type = c.user.extern_type
683 c.extern_type = c.user.extern_type
684 c.extern_name = c.user.extern_name
684 c.extern_name = c.user.extern_name
685
685
686 schema = user_schema.UserProfileSchema().bind(
686 schema = user_schema.UserProfileSchema().bind(
687 username=c.user.username, user_emails=c.user.emails)
687 username=c.user.username, user_emails=c.user.emails)
688 form = forms.RcForm(
688 form = forms.RcForm(
689 schema, buttons=(forms.buttons.save, forms.buttons.reset))
689 schema, buttons=(forms.buttons.save, forms.buttons.reset))
690
690
691 controls = self.request.POST.items()
691 controls = self.request.POST.items()
692 try:
692 try:
693 valid_data = form.validate(controls)
693 valid_data = form.validate(controls)
694 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
694 skip_attrs = ['admin', 'active', 'extern_type', 'extern_name',
695 'new_password', 'password_confirmation']
695 'new_password', 'password_confirmation']
696 if c.extern_type != "rhodecode":
696 if c.extern_type != "rhodecode":
697 # forbid updating username for external accounts
697 # forbid updating username for external accounts
698 skip_attrs.append('username')
698 skip_attrs.append('username')
699 old_email = c.user.email
699 old_email = c.user.email
700 UserModel().update_user(
700 UserModel().update_user(
701 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
701 self._rhodecode_user.user_id, skip_attrs=skip_attrs,
702 **valid_data)
702 **valid_data)
703 if old_email != valid_data['email']:
703 if old_email != valid_data['email']:
704 old = UserEmailMap.query() \
704 old = UserEmailMap.query() \
705 .filter(UserEmailMap.user == c.user).filter(UserEmailMap.email == valid_data['email']).first()
705 .filter(UserEmailMap.user == c.user)\
706 .filter(UserEmailMap.email == valid_data['email'])\
707 .first()
706 old.email = old_email
708 old.email = old_email
707 h.flash(_('Your account was updated successfully'), category='success')
709 h.flash(_('Your account was updated successfully'), category='success')
708 Session().commit()
710 Session().commit()
709 except forms.ValidationFailure as e:
711 except forms.ValidationFailure as e:
710 c.form = e
712 c.form = e
711 return self._get_template_context(c)
713 return self._get_template_context(c)
712 except Exception:
714 except Exception:
713 log.exception("Exception updating user")
715 log.exception("Exception updating user")
714 h.flash(_('Error occurred during update of user'),
716 h.flash(_('Error occurred during update of user'),
715 category='error')
717 category='error')
716 raise HTTPFound(h.route_path('my_account_profile'))
718 raise HTTPFound(h.route_path('my_account_profile'))
717
719
718 def _get_pull_requests_list(self, statuses):
720 def _get_pull_requests_list(self, statuses):
719 draw, start, limit = self._extract_chunk(self.request)
721 draw, start, limit = self._extract_chunk(self.request)
720 search_q, order_by, order_dir = self._extract_ordering(self.request)
722 search_q, order_by, order_dir = self._extract_ordering(self.request)
723
721 _render = self.request.get_partial_renderer(
724 _render = self.request.get_partial_renderer(
722 'rhodecode:templates/data_table/_dt_elements.mako')
725 'rhodecode:templates/data_table/_dt_elements.mako')
723
726
724 pull_requests = PullRequestModel().get_im_participating_in(
        pull_requests = PullRequestModel().get_im_participating_in(
            user_id=self._rhodecode_user.user_id,
            statuses=statuses, query=search_q,
            offset=start, length=limit, order_by=order_by,
            order_dir=order_dir)

        pull_requests_total_count = PullRequestModel().count_im_participating_in(
            user_id=self._rhodecode_user.user_id, statuses=statuses, query=search_q)

        data = []
        comments_model = CommentsModel()
        for pr in pull_requests:
            repo_id = pr.target_repo_id
            comments_count = comments_model.get_all_comments(
-               repo_id, pull_request=pr, count_only=True)
+               repo_id, pull_request=pr, include_drafts=False, count_only=True)
            owned = pr.user_id == self._rhodecode_user.user_id

            data.append({
                'target_repo': _render('pullrequest_target_repo',
                                       pr.target_repo.repo_name),
                'name': _render('pullrequest_name',
                                pr.pull_request_id, pr.pull_request_state,
                                pr.work_in_progress, pr.target_repo.repo_name,
                                short=True),
                'name_raw': pr.pull_request_id,
                'status': _render('pullrequest_status',
                                  pr.calculated_review_status()),
                'title': _render('pullrequest_title', pr.title, pr.description),
                'description': h.escape(pr.description),
                'updated_on': _render('pullrequest_updated_on',
-                                     h.datetime_to_time(pr.updated_on)),
+                                     h.datetime_to_time(pr.updated_on),
+                                     pr.versions_count),
                'updated_on_raw': h.datetime_to_time(pr.updated_on),
                'created_on': _render('pullrequest_updated_on',
                                      h.datetime_to_time(pr.created_on)),
                'created_on_raw': h.datetime_to_time(pr.created_on),
                'state': pr.pull_request_state,
                'author': _render('pullrequest_author',
                                  pr.author.full_contact, ),
                'author_raw': pr.author.full_name,
                'comments': _render('pullrequest_comments', comments_count),
                'comments_raw': comments_count,
                'closed': pr.is_closed(),
                'owned': owned
            })

        # json used to render the grid
        data = ({
            'draw': draw,
            'data': data,
            'recordsTotal': pull_requests_total_count,
            'recordsFiltered': pull_requests_total_count,
        })
        return data
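    # Illustration (not from the original source): the dict built above follows the
    # DataTables server-side processing protocol, so a typical JSON response is
    #   {'draw': 1, 'data': [...], 'recordsTotal': 42, 'recordsFiltered': 42}
    # where `draw` echoes the client's request counter and the two totals drive paging.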

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_pullrequests',
        request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_pullrequests(self):
        c = self.load_default_context()
        c.active = 'pullrequests'
        req_get = self.request.GET

        c.closed = str2bool(req_get.get('pr_show_closed'))

        return self._get_template_context(c)

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_pullrequests_data',
        request_method='GET', renderer='json_ext')
    def my_account_pullrequests_data(self):
        self.load_default_context()
        req_get = self.request.GET
        closed = str2bool(req_get.get('closed'))

        statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN]
        if closed:
            statuses += [PullRequest.STATUS_CLOSED]

        data = self._get_pull_requests_list(statuses=statuses)
        return data

    @LoginRequired()
    @NotAnonymous()
    @view_config(
        route_name='my_account_user_group_membership',
        request_method='GET',
        renderer='rhodecode:templates/admin/my_account/my_account.mako')
    def my_account_user_group_membership(self):
        c = self.load_default_context()
        c.active = 'user_group_membership'
        groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
                  for group in self._rhodecode_db_user.group_member]
        c.user_groups = json.dumps(groups)
        return self._get_template_context(c)
@@ -1,543 +1,548 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
from rhodecode.apps._base import add_route_with_slash


def includeme(config):

    # repo creating checks, special cases that aren't repo routes
    config.add_route(
        name='repo_creating',
        pattern='/{repo_name:.*?[^/]}/repo_creating')

    config.add_route(
        name='repo_creating_check',
        pattern='/{repo_name:.*?[^/]}/repo_creating_check')

    # Summary
    # NOTE(marcink): one additional route is defined in very bottom, catch
    # all pattern
    config.add_route(
        name='repo_summary_explicit',
        pattern='/{repo_name:.*?[^/]}/summary', repo_route=True)
    config.add_route(
        name='repo_summary_commits',
        pattern='/{repo_name:.*?[^/]}/summary-commits', repo_route=True)

    # Commits
    config.add_route(
        name='repo_commit',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_children',
        pattern='/{repo_name:.*?[^/]}/changeset_children/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_parents',
        pattern='/{repo_name:.*?[^/]}/changeset_parents/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_raw',
        pattern='/{repo_name:.*?[^/]}/changeset-diff/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_patch',
        pattern='/{repo_name:.*?[^/]}/changeset-patch/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_download',
        pattern='/{repo_name:.*?[^/]}/changeset-download/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_data',
        pattern='/{repo_name:.*?[^/]}/changeset-data/{commit_id}', repo_route=True)

    config.add_route(
        name='repo_commit_comment_create',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/create', repo_route=True)

    config.add_route(
        name='repo_commit_comment_preview',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True)

    config.add_route(
        name='repo_commit_comment_history_view',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_history_id}/history_view', repo_route=True)

    config.add_route(
        name='repo_commit_comment_attachment_upload',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True)

    config.add_route(
        name='repo_commit_comment_delete',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True)

    config.add_route(
        name='repo_commit_comment_edit',
        pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/edit', repo_route=True)

    # still working url for backward compat.
    config.add_route(
        name='repo_commit_raw_deprecated',
        pattern='/{repo_name:.*?[^/]}/raw-changeset/{commit_id}', repo_route=True)

    # Files
    config.add_route(
        name='repo_archivefile',
        pattern='/{repo_name:.*?[^/]}/archive/{fname:.*}', repo_route=True)

    config.add_route(
        name='repo_files_diff',
        pattern='/{repo_name:.*?[^/]}/diff/{f_path:.*}', repo_route=True)
    config.add_route(  # legacy route to make old links work
        name='repo_files_diff_2way_redirect',
        pattern='/{repo_name:.*?[^/]}/diff-2way/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_files',
        pattern='/{repo_name:.*?[^/]}/files/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_route(
        name='repo_files:default_path',
        pattern='/{repo_name:.*?[^/]}/files/{commit_id}/', repo_route=True)
    config.add_route(
        name='repo_files:default_commit',
        pattern='/{repo_name:.*?[^/]}/files', repo_route=True)

    config.add_route(
        name='repo_files:rendered',
        pattern='/{repo_name:.*?[^/]}/render/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_files:annotated',
        pattern='/{repo_name:.*?[^/]}/annotate/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_route(
        name='repo_files:annotated_previous',
        pattern='/{repo_name:.*?[^/]}/annotate-previous/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_nodetree_full',
        pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_route(
        name='repo_nodetree_full:default_path',
        pattern='/{repo_name:.*?[^/]}/nodetree_full/{commit_id}/', repo_route=True)

    config.add_route(
        name='repo_files_nodelist',
        pattern='/{repo_name:.*?[^/]}/nodelist/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_file_raw',
        pattern='/{repo_name:.*?[^/]}/raw/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_file_download',
        pattern='/{repo_name:.*?[^/]}/download/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_route(  # backward compat to keep old links working
        name='repo_file_download:legacy',
        pattern='/{repo_name:.*?[^/]}/rawfile/{commit_id}/{f_path:.*}',
        repo_route=True)

    config.add_route(
        name='repo_file_history',
        pattern='/{repo_name:.*?[^/]}/history/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_file_authors',
        pattern='/{repo_name:.*?[^/]}/authors/{commit_id}/{f_path:.*}', repo_route=True)

    config.add_route(
        name='repo_files_check_head',
        pattern='/{repo_name:.*?[^/]}/check_head/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_remove_file',
        pattern='/{repo_name:.*?[^/]}/remove_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_delete_file',
        pattern='/{repo_name:.*?[^/]}/delete_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_edit_file',
        pattern='/{repo_name:.*?[^/]}/edit_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_update_file',
        pattern='/{repo_name:.*?[^/]}/update_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_add_file',
        pattern='/{repo_name:.*?[^/]}/add_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_upload_file',
        pattern='/{repo_name:.*?[^/]}/upload_file/{commit_id}/{f_path:.*}',
        repo_route=True)
    config.add_route(
        name='repo_files_create_file',
        pattern='/{repo_name:.*?[^/]}/create_file/{commit_id}/{f_path:.*}',
        repo_route=True)

    # Refs data
    config.add_route(
        name='repo_refs_data',
        pattern='/{repo_name:.*?[^/]}/refs-data', repo_route=True)

    config.add_route(
        name='repo_refs_changelog_data',
        pattern='/{repo_name:.*?[^/]}/refs-data-changelog', repo_route=True)

    config.add_route(
        name='repo_stats',
        pattern='/{repo_name:.*?[^/]}/repo_stats/{commit_id}', repo_route=True)

    # Commits
    config.add_route(
        name='repo_commits',
        pattern='/{repo_name:.*?[^/]}/commits', repo_route=True)
    config.add_route(
        name='repo_commits_file',
        pattern='/{repo_name:.*?[^/]}/commits/{commit_id}/{f_path:.*}', repo_route=True)
    config.add_route(
        name='repo_commits_elements',
        pattern='/{repo_name:.*?[^/]}/commits_elements', repo_route=True)
    config.add_route(
        name='repo_commits_elements_file',
        pattern='/{repo_name:.*?[^/]}/commits_elements/{commit_id}/{f_path:.*}', repo_route=True)

    # Changelog (old deprecated name for commits page)
    config.add_route(
        name='repo_changelog',
        pattern='/{repo_name:.*?[^/]}/changelog', repo_route=True)
    config.add_route(
        name='repo_changelog_file',
        pattern='/{repo_name:.*?[^/]}/changelog/{commit_id}/{f_path:.*}', repo_route=True)

    # Compare
    config.add_route(
        name='repo_compare_select',
        pattern='/{repo_name:.*?[^/]}/compare', repo_route=True)

    config.add_route(
        name='repo_compare',
        pattern='/{repo_name:.*?[^/]}/compare/{source_ref_type}@{source_ref:.*?}...{target_ref_type}@{target_ref:.*?}', repo_route=True)

    # Tags
    config.add_route(
        name='tags_home',
        pattern='/{repo_name:.*?[^/]}/tags', repo_route=True)

    # Branches
    config.add_route(
        name='branches_home',
        pattern='/{repo_name:.*?[^/]}/branches', repo_route=True)

    # Bookmarks
    config.add_route(
        name='bookmarks_home',
        pattern='/{repo_name:.*?[^/]}/bookmarks', repo_route=True)

    # Forks
    config.add_route(
        name='repo_fork_new',
        pattern='/{repo_name:.*?[^/]}/fork', repo_route=True,
        repo_forbid_when_archived=True,
        repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='repo_fork_create',
        pattern='/{repo_name:.*?[^/]}/fork/create', repo_route=True,
        repo_forbid_when_archived=True,
        repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='repo_forks_show_all',
        pattern='/{repo_name:.*?[^/]}/forks', repo_route=True,
        repo_accepted_types=['hg', 'git'])
    config.add_route(
        name='repo_forks_data',
        pattern='/{repo_name:.*?[^/]}/forks/data', repo_route=True,
        repo_accepted_types=['hg', 'git'])

    # Pull Requests
    config.add_route(
        name='pullrequest_show',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}',
        repo_route=True)

    config.add_route(
        name='pullrequest_show_all',
        pattern='/{repo_name:.*?[^/]}/pull-request',
        repo_route=True, repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='pullrequest_show_all_data',
        pattern='/{repo_name:.*?[^/]}/pull-request-data',
        repo_route=True, repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='pullrequest_repo_refs',
        pattern='/{repo_name:.*?[^/]}/pull-request/refs/{target_repo_name:.*?[^/]}',
        repo_route=True)

    config.add_route(
        name='pullrequest_repo_targets',
        pattern='/{repo_name:.*?[^/]}/pull-request/repo-targets',
        repo_route=True)

    config.add_route(
        name='pullrequest_new',
        pattern='/{repo_name:.*?[^/]}/pull-request/new',
        repo_route=True, repo_accepted_types=['hg', 'git'],
        repo_forbid_when_archived=True)

    config.add_route(
        name='pullrequest_create',
        pattern='/{repo_name:.*?[^/]}/pull-request/create',
        repo_route=True, repo_accepted_types=['hg', 'git'],
        repo_forbid_when_archived=True)

    config.add_route(
        name='pullrequest_update',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/update',
        repo_route=True, repo_forbid_when_archived=True)

    config.add_route(
        name='pullrequest_merge',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/merge',
        repo_route=True, repo_forbid_when_archived=True)

    config.add_route(
        name='pullrequest_delete',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/delete',
        repo_route=True, repo_forbid_when_archived=True)

    config.add_route(
        name='pullrequest_comment_create',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment',
        repo_route=True)

    config.add_route(
        name='pullrequest_comment_edit',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/edit',
        repo_route=True, repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='pullrequest_comment_delete',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete',
        repo_route=True, repo_accepted_types=['hg', 'git'])

    config.add_route(
        name='pullrequest_comments',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comments',
        repo_route=True)

    config.add_route(
        name='pullrequest_todos',
        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/todos',
        repo_route=True)

+    config.add_route(
+        name='pullrequest_drafts',
+        pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/drafts',
+        repo_route=True)
+
362
358 # Artifacts, (EE feature)
363 # Artifacts, (EE feature)
359 config.add_route(
364 config.add_route(
360 name='repo_artifacts_list',
365 name='repo_artifacts_list',
361 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
366 pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True)
362
367
363 # Settings
368 # Settings
364 config.add_route(
369 config.add_route(
365 name='edit_repo',
370 name='edit_repo',
366 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
371 pattern='/{repo_name:.*?[^/]}/settings', repo_route=True)
367 # update is POST on edit_repo
372 # update is POST on edit_repo
368
373
369 # Settings advanced
374 # Settings advanced
370 config.add_route(
375 config.add_route(
371 name='edit_repo_advanced',
376 name='edit_repo_advanced',
372 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
377 pattern='/{repo_name:.*?[^/]}/settings/advanced', repo_route=True)
373 config.add_route(
378 config.add_route(
374 name='edit_repo_advanced_archive',
379 name='edit_repo_advanced_archive',
375 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
380 pattern='/{repo_name:.*?[^/]}/settings/advanced/archive', repo_route=True)
376 config.add_route(
381 config.add_route(
377 name='edit_repo_advanced_delete',
382 name='edit_repo_advanced_delete',
378 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
383 pattern='/{repo_name:.*?[^/]}/settings/advanced/delete', repo_route=True)
379 config.add_route(
384 config.add_route(
380 name='edit_repo_advanced_locking',
385 name='edit_repo_advanced_locking',
381 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
386 pattern='/{repo_name:.*?[^/]}/settings/advanced/locking', repo_route=True)
382 config.add_route(
387 config.add_route(
383 name='edit_repo_advanced_journal',
388 name='edit_repo_advanced_journal',
384 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
389 pattern='/{repo_name:.*?[^/]}/settings/advanced/journal', repo_route=True)
385 config.add_route(
390 config.add_route(
386 name='edit_repo_advanced_fork',
391 name='edit_repo_advanced_fork',
387 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
392 pattern='/{repo_name:.*?[^/]}/settings/advanced/fork', repo_route=True)
388
393
389 config.add_route(
394 config.add_route(
390 name='edit_repo_advanced_hooks',
395 name='edit_repo_advanced_hooks',
391 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
396 pattern='/{repo_name:.*?[^/]}/settings/advanced/hooks', repo_route=True)
392
397
393 # Caches
398 # Caches
394 config.add_route(
399 config.add_route(
395 name='edit_repo_caches',
400 name='edit_repo_caches',
396 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
401 pattern='/{repo_name:.*?[^/]}/settings/caches', repo_route=True)
397
402
398 # Permissions
403 # Permissions
399 config.add_route(
404 config.add_route(
400 name='edit_repo_perms',
405 name='edit_repo_perms',
401 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
406 pattern='/{repo_name:.*?[^/]}/settings/permissions', repo_route=True)
402
407
403 config.add_route(
408 config.add_route(
404 name='edit_repo_perms_set_private',
409 name='edit_repo_perms_set_private',
405 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
410 pattern='/{repo_name:.*?[^/]}/settings/permissions/set_private', repo_route=True)
406
411
407 # Permissions Branch (EE feature)
412 # Permissions Branch (EE feature)
408 config.add_route(
413 config.add_route(
409 name='edit_repo_perms_branch',
414 name='edit_repo_perms_branch',
410 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
415 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions', repo_route=True)
411 config.add_route(
416 config.add_route(
412 name='edit_repo_perms_branch_delete',
417 name='edit_repo_perms_branch_delete',
413 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
418 pattern='/{repo_name:.*?[^/]}/settings/branch_permissions/{rule_id}/delete',
414 repo_route=True)
419 repo_route=True)
415
420
416 # Maintenance
421 # Maintenance
417 config.add_route(
422 config.add_route(
418 name='edit_repo_maintenance',
423 name='edit_repo_maintenance',
419 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
424 pattern='/{repo_name:.*?[^/]}/settings/maintenance', repo_route=True)
420
425
421 config.add_route(
426 config.add_route(
422 name='edit_repo_maintenance_execute',
427 name='edit_repo_maintenance_execute',
423 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
428 pattern='/{repo_name:.*?[^/]}/settings/maintenance/execute', repo_route=True)
424
429
425 # Fields
430 # Fields
426 config.add_route(
431 config.add_route(
427 name='edit_repo_fields',
432 name='edit_repo_fields',
428 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
433 pattern='/{repo_name:.*?[^/]}/settings/fields', repo_route=True)
429 config.add_route(
434 config.add_route(
430 name='edit_repo_fields_create',
435 name='edit_repo_fields_create',
431 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
436 pattern='/{repo_name:.*?[^/]}/settings/fields/create', repo_route=True)
432 config.add_route(
437 config.add_route(
433 name='edit_repo_fields_delete',
438 name='edit_repo_fields_delete',
434 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
439 pattern='/{repo_name:.*?[^/]}/settings/fields/{field_id}/delete', repo_route=True)
435
440
436 # Locking
441 # Locking
437 config.add_route(
442 config.add_route(
438 name='repo_edit_toggle_locking',
443 name='repo_edit_toggle_locking',
439 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
444 pattern='/{repo_name:.*?[^/]}/settings/toggle_locking', repo_route=True)
440
445
441 # Remote
446 # Remote
442 config.add_route(
447 config.add_route(
443 name='edit_repo_remote',
448 name='edit_repo_remote',
444 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
449 pattern='/{repo_name:.*?[^/]}/settings/remote', repo_route=True)
445 config.add_route(
450 config.add_route(
446 name='edit_repo_remote_pull',
451 name='edit_repo_remote_pull',
447 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
452 pattern='/{repo_name:.*?[^/]}/settings/remote/pull', repo_route=True)
448 config.add_route(
453 config.add_route(
449 name='edit_repo_remote_push',
454 name='edit_repo_remote_push',
450 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
455 pattern='/{repo_name:.*?[^/]}/settings/remote/push', repo_route=True)
451
456
452 # Statistics
457 # Statistics
453 config.add_route(
458 config.add_route(
454 name='edit_repo_statistics',
459 name='edit_repo_statistics',
455 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
460 pattern='/{repo_name:.*?[^/]}/settings/statistics', repo_route=True)
456 config.add_route(
461 config.add_route(
457 name='edit_repo_statistics_reset',
462 name='edit_repo_statistics_reset',
458 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
463 pattern='/{repo_name:.*?[^/]}/settings/statistics/update', repo_route=True)
459
464
460 # Issue trackers
465 # Issue trackers
461 config.add_route(
466 config.add_route(
462 name='edit_repo_issuetracker',
467 name='edit_repo_issuetracker',
463 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers', repo_route=True)
468 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers', repo_route=True)
464 config.add_route(
469 config.add_route(
465 name='edit_repo_issuetracker_test',
470 name='edit_repo_issuetracker_test',
466 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/test', repo_route=True)
471 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/test', repo_route=True)
467 config.add_route(
472 config.add_route(
468 name='edit_repo_issuetracker_delete',
473 name='edit_repo_issuetracker_delete',
469 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/delete', repo_route=True)
474 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/delete', repo_route=True)
470 config.add_route(
475 config.add_route(
471 name='edit_repo_issuetracker_update',
476 name='edit_repo_issuetracker_update',
472 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/update', repo_route=True)
477 pattern='/{repo_name:.*?[^/]}/settings/issue_trackers/update', repo_route=True)
473
478
474 # VCS Settings
479 # VCS Settings
475 config.add_route(
480 config.add_route(
476 name='edit_repo_vcs',
481 name='edit_repo_vcs',
477 pattern='/{repo_name:.*?[^/]}/settings/vcs', repo_route=True)
482 pattern='/{repo_name:.*?[^/]}/settings/vcs', repo_route=True)
478 config.add_route(
483 config.add_route(
479 name='edit_repo_vcs_update',
484 name='edit_repo_vcs_update',
480 pattern='/{repo_name:.*?[^/]}/settings/vcs/update', repo_route=True)
485 pattern='/{repo_name:.*?[^/]}/settings/vcs/update', repo_route=True)
481
486
482 # svn pattern
487 # svn pattern
483 config.add_route(
488 config.add_route(
484 name='edit_repo_vcs_svn_pattern_delete',
489 name='edit_repo_vcs_svn_pattern_delete',
485 pattern='/{repo_name:.*?[^/]}/settings/vcs/svn_pattern/delete', repo_route=True)
490 pattern='/{repo_name:.*?[^/]}/settings/vcs/svn_pattern/delete', repo_route=True)
486
491
487 # Repo Review Rules (EE feature)
492 # Repo Review Rules (EE feature)
488 config.add_route(
493 config.add_route(
489 name='repo_reviewers',
494 name='repo_reviewers',
490 pattern='/{repo_name:.*?[^/]}/settings/review/rules', repo_route=True)
495 pattern='/{repo_name:.*?[^/]}/settings/review/rules', repo_route=True)
491
496
492 config.add_route(
497 config.add_route(
493 name='repo_default_reviewers_data',
498 name='repo_default_reviewers_data',
494 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
499 pattern='/{repo_name:.*?[^/]}/settings/review/default-reviewers', repo_route=True)
495
500
496 # Repo Automation (EE feature)
501 # Repo Automation (EE feature)
497 config.add_route(
502 config.add_route(
498 name='repo_automation',
503 name='repo_automation',
499 pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
504 pattern='/{repo_name:.*?[^/]}/settings/automation', repo_route=True)
500
505
501 # Strip
506 # Strip
502 config.add_route(
507 config.add_route(
503 name='edit_repo_strip',
508 name='edit_repo_strip',
504 pattern='/{repo_name:.*?[^/]}/settings/strip', repo_route=True)
509 pattern='/{repo_name:.*?[^/]}/settings/strip', repo_route=True)
505
510
506 config.add_route(
511 config.add_route(
507 name='strip_check',
512 name='strip_check',
508 pattern='/{repo_name:.*?[^/]}/settings/strip_check', repo_route=True)
513 pattern='/{repo_name:.*?[^/]}/settings/strip_check', repo_route=True)
509
514
510 config.add_route(
515 config.add_route(
511 name='strip_execute',
516 name='strip_execute',
512 pattern='/{repo_name:.*?[^/]}/settings/strip_execute', repo_route=True)
517 pattern='/{repo_name:.*?[^/]}/settings/strip_execute', repo_route=True)
513
518
514 # Audit logs
519 # Audit logs
515 config.add_route(
520 config.add_route(
516 name='edit_repo_audit_logs',
521 name='edit_repo_audit_logs',
517 pattern='/{repo_name:.*?[^/]}/settings/audit_logs', repo_route=True)
522 pattern='/{repo_name:.*?[^/]}/settings/audit_logs', repo_route=True)
518
523
519 # ATOM/RSS Feed, shouldn't contain slashes for outlook compatibility
524 # ATOM/RSS Feed, shouldn't contain slashes for outlook compatibility
520 config.add_route(
525 config.add_route(
521 name='rss_feed_home',
526 name='rss_feed_home',
522 pattern='/{repo_name:.*?[^/]}/feed-rss', repo_route=True)
527 pattern='/{repo_name:.*?[^/]}/feed-rss', repo_route=True)
523
528
524 config.add_route(
529 config.add_route(
525 name='atom_feed_home',
530 name='atom_feed_home',
526 pattern='/{repo_name:.*?[^/]}/feed-atom', repo_route=True)
531 pattern='/{repo_name:.*?[^/]}/feed-atom', repo_route=True)
527
532
528 config.add_route(
533 config.add_route(
529 name='rss_feed_home_old',
534 name='rss_feed_home_old',
530 pattern='/{repo_name:.*?[^/]}/feed/rss', repo_route=True)
535 pattern='/{repo_name:.*?[^/]}/feed/rss', repo_route=True)
531
536
532 config.add_route(
537 config.add_route(
533 name='atom_feed_home_old',
538 name='atom_feed_home_old',
534 pattern='/{repo_name:.*?[^/]}/feed/atom', repo_route=True)
539 pattern='/{repo_name:.*?[^/]}/feed/atom', repo_route=True)
535
540
536 # NOTE(marcink): needs to be at the end for catch-all
541 # NOTE(marcink): needs to be at the end for catch-all
537 add_route_with_slash(
542 add_route_with_slash(
538 config,
543 config,
539 name='repo_summary',
544 name='repo_summary',
540 pattern='/{repo_name:.*?[^/]}', repo_route=True)
545 pattern='/{repo_name:.*?[^/]}', repo_route=True)
541
546
542 # Scan module for configuration decorators.
547 # Scan module for configuration decorators.
543 config.scan('.views', ignore='.tests')
548 config.scan('.views', ignore='.tests')
@@ -1,1661 +1,1658 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
import mock
import pytest

import rhodecode
from rhodecode.lib.vcs.backends.base import MergeResponse, MergeFailureReason
from rhodecode.lib.vcs.nodes import FileNode
from rhodecode.lib import helpers as h
from rhodecode.model.changeset_status import ChangesetStatusModel
from rhodecode.model.db import (
    PullRequest, ChangesetStatus, UserLog, Notification, ChangesetComment, Repository)
from rhodecode.model.meta import Session
from rhodecode.model.pull_request import PullRequestModel
from rhodecode.model.user import UserModel
from rhodecode.model.comment import CommentsModel
from rhodecode.tests import (
    assert_session_flash, TEST_USER_ADMIN_LOGIN, TEST_USER_REGULAR_LOGIN)


def route_path(name, params=None, **kwargs):
    import urllib

    base_url = {
        'repo_changelog': '/{repo_name}/changelog',
        'repo_changelog_file': '/{repo_name}/changelog/{commit_id}/{f_path}',
        'repo_commits': '/{repo_name}/commits',
        'repo_commits_file': '/{repo_name}/commits/{commit_id}/{f_path}',
        'pullrequest_show': '/{repo_name}/pull-request/{pull_request_id}',
        'pullrequest_show_all': '/{repo_name}/pull-request',
        'pullrequest_show_all_data': '/{repo_name}/pull-request-data',
        'pullrequest_repo_refs': '/{repo_name}/pull-request/refs/{target_repo_name:.*?[^/]}',
        'pullrequest_repo_targets': '/{repo_name}/pull-request/repo-destinations',
        'pullrequest_new': '/{repo_name}/pull-request/new',
        'pullrequest_create': '/{repo_name}/pull-request/create',
        'pullrequest_update': '/{repo_name}/pull-request/{pull_request_id}/update',
        'pullrequest_merge': '/{repo_name}/pull-request/{pull_request_id}/merge',
        'pullrequest_delete': '/{repo_name}/pull-request/{pull_request_id}/delete',
        'pullrequest_comment_create': '/{repo_name}/pull-request/{pull_request_id}/comment',
        'pullrequest_comment_delete': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/delete',
        'pullrequest_comment_edit': '/{repo_name}/pull-request/{pull_request_id}/comment/{comment_id}/edit',
    }[name].format(**kwargs)

    if params:
        base_url = '{}?{}'.format(base_url, urllib.urlencode(params))
    return base_url
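# Illustration (not from the original test module): route_path fills the mapping
# above with the keyword arguments and urlencodes any extra params, e.g.
#   route_path('pullrequest_show', repo_name='repo1', pull_request_id=1,
#              params={'range-diff': "1"})
# returns '/repo1/pull-request/1?range-diff=1'.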


@pytest.mark.usefixtures('app', 'autologin_user')
@pytest.mark.backends("git", "hg")
class TestPullrequestsView(object):

    def test_index(self, backend):
        self.app.get(route_path(
            'pullrequest_new',
            repo_name=backend.repo_name))

    def test_option_menu_create_pull_request_exists(self, backend):
        repo_name = backend.repo_name
        response = self.app.get(h.route_path('repo_summary', repo_name=repo_name))

        create_pr_link = '<a href="%s">Create Pull Request</a>' % route_path(
            'pullrequest_new', repo_name=repo_name)
        response.mustcontain(create_pr_link)

    def test_create_pr_form_with_raw_commit_id(self, backend):
        repo = backend.repo

        self.app.get(
            route_path('pullrequest_new', repo_name=repo.repo_name,
                       commit=repo.get_commit().raw_id),
            status=200)

    @pytest.mark.parametrize('pr_merge_enabled', [True, False])
    @pytest.mark.parametrize('range_diff', ["0", "1"])
    def test_show(self, pr_util, pr_merge_enabled, range_diff):
        pull_request = pr_util.create_pull_request(
            mergeable=pr_merge_enabled, enable_notifications=False)

        response = self.app.get(route_path(
            'pullrequest_show',
            repo_name=pull_request.target_repo.scm_instance().name,
            pull_request_id=pull_request.pull_request_id,
            params={'range-diff': range_diff}))

        for commit_id in pull_request.revisions:
            response.mustcontain(commit_id)

        response.mustcontain(pull_request.target_ref_parts.type)
        response.mustcontain(pull_request.target_ref_parts.name)

        response.mustcontain('class="pull-request-merge"')

        if pr_merge_enabled:
            response.mustcontain('Pull request reviewer approval is pending')
        else:
            response.mustcontain('Server-side pull request merging is disabled.')

        if range_diff == "1":
            response.mustcontain('Turn off: Show the diff as commit range')

    def test_show_versions_of_pr(self, backend, csrf_token):
        commits = [
            {'message': 'initial-commit',
             'added': [FileNode('test-file.txt', 'LINE1\n')]},

            {'message': 'commit-1',
             'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\n')]},
            # Above is the initial version of PR that changes a single line

            # from now on we'll add 3x commit adding another line on each step
            {'message': 'commit-2',
             'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\n')]},

            {'message': 'commit-3',
             'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\n')]},

            {'message': 'commit-4',
             'changed': [FileNode('test-file.txt', 'LINE1\nLINE2\nLINE3\nLINE4\nLINE5\n')]},
        ]

        commit_ids = backend.create_master_repo(commits)
        target = backend.create_repo(heads=['initial-commit'])
        source = backend.create_repo(heads=['commit-1'])
        source_repo_name = source.repo_name
        target_repo_name = target.repo_name

        target_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['initial-commit'])
        source_ref = 'branch:{branch}:{commit_id}'.format(
            branch=backend.default_branch_name, commit_id=commit_ids['commit-1'])

        response = self.app.post(
            route_path('pullrequest_create', repo_name=source.repo_name),
            [
                ('source_repo', source_repo_name),
                ('source_ref', source_ref),
                ('target_repo', target_repo_name),
                ('target_ref', target_ref),
                ('common_ancestor', commit_ids['initial-commit']),
                ('pullrequest_title', 'Title'),
                ('pullrequest_desc', 'Description'),
                ('description_renderer', 'markdown'),
                ('__start__', 'review_members:sequence'),
                ('__start__', 'reviewer:mapping'),
                ('user_id', '1'),
                ('__start__', 'reasons:sequence'),
                ('reason', 'Some reason'),
                ('__end__', 'reasons:sequence'),
                ('__start__', 'rules:sequence'),
                ('__end__', 'rules:sequence'),
                ('mandatory', 'False'),
                ('__end__', 'reviewer:mapping'),
                ('__end__', 'review_members:sequence'),
                ('__start__', 'revisions:sequence'),
                ('revisions', commit_ids['commit-1']),
                ('__end__', 'revisions:sequence'),
                ('user', ''),
                ('csrf_token', csrf_token),
            ],
            status=302)
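        # Illustration (not from the original test): the ('__start__', ...) and
        # ('__end__', ...) tuples are peppercorn-style markers that let the server
        # rebuild the flat POST body into nested sequences/mappings, here one
        # reviewer entry (user_id=1) with a list of reasons and an empty rules list.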

        location = response.headers['Location']

        pull_request_id = location.rsplit('/', 1)[1]
        assert pull_request_id != 'new'
        pull_request = PullRequest.get(int(pull_request_id))

        pull_request_id = pull_request.pull_request_id

        # Show initial version of PR
        response = self.app.get(
            route_path('pullrequest_show',
                       repo_name=target_repo_name,
                       pull_request_id=pull_request_id))

        response.mustcontain('commit-1')
        response.mustcontain(no=['commit-2'])
        response.mustcontain(no=['commit-3'])
        response.mustcontain(no=['commit-4'])

        response.mustcontain('cb-addition"></span><span>LINE2</span>')
        response.mustcontain(no=['LINE3'])
        response.mustcontain(no=['LINE4'])
        response.mustcontain(no=['LINE5'])

        # update PR #1
        source_repo = Repository.get_by_repo_name(source_repo_name)
        backend.pull_heads(source_repo, heads=['commit-2'])
        response = self.app.post(
            route_path('pullrequest_update',
                       repo_name=target_repo_name, pull_request_id=pull_request_id),
            params={'update_commits': 'true', 'csrf_token': csrf_token})

        # update PR #2
        source_repo = Repository.get_by_repo_name(source_repo_name)
        backend.pull_heads(source_repo, heads=['commit-3'])
        response = self.app.post(
            route_path('pullrequest_update',
                       repo_name=target_repo_name, pull_request_id=pull_request_id),
            params={'update_commits': 'true', 'csrf_token': csrf_token})

        # update PR #3
        source_repo = Repository.get_by_repo_name(source_repo_name)
        backend.pull_heads(source_repo, heads=['commit-4'])
        response = self.app.post(
            route_path('pullrequest_update',
                       repo_name=target_repo_name, pull_request_id=pull_request_id),
            params={'update_commits': 'true', 'csrf_token': csrf_token})

        # Show final version !
        response = self.app.get(
            route_path('pullrequest_show',
                       repo_name=target_repo_name,
                       pull_request_id=pull_request_id))

        # 3 updates, and the latest == 4
234 # 3 updates, and the latest == 4
235 response.mustcontain('4 versions available for this pull request')
235 response.mustcontain('4 versions available for this pull request')
236 response.mustcontain(no=['rhodecode diff rendering error'])
236 response.mustcontain(no=['rhodecode diff rendering error'])
237
237
238 # the final version must show all 4 commits, and all 4 added lines
238 # the final version must show all 4 commits, and all 4 added lines
239 response.mustcontain('commit-1')
239 response.mustcontain('commit-1')
240 response.mustcontain('commit-2')
240 response.mustcontain('commit-2')
241 response.mustcontain('commit-3')
241 response.mustcontain('commit-3')
242 response.mustcontain('commit-4')
242 response.mustcontain('commit-4')
243
243
244 response.mustcontain('cb-addition"></span><span>LINE2</span>')
244 response.mustcontain('cb-addition"></span><span>LINE2</span>')
245 response.mustcontain('cb-addition"></span><span>LINE3</span>')
245 response.mustcontain('cb-addition"></span><span>LINE3</span>')
246 response.mustcontain('cb-addition"></span><span>LINE4</span>')
246 response.mustcontain('cb-addition"></span><span>LINE4</span>')
247 response.mustcontain('cb-addition"></span><span>LINE5</span>')
247 response.mustcontain('cb-addition"></span><span>LINE5</span>')
248
248
249 # fetch versions
249 # fetch versions
250 pr = PullRequest.get(pull_request_id)
250 pr = PullRequest.get(pull_request_id)
251 versions = [x.pull_request_version_id for x in pr.versions.all()]
251 versions = [x.pull_request_version_id for x in pr.versions.all()]
252 assert len(versions) == 3
252 assert len(versions) == 3
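# the 3 stored versions plus the live pull request state make up the
# "4 versions available" asserted earlier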
253
253
254 # show v1,v2,v3,v4
254 # show v1,v2,v3,v4
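# cb_line/cb_context rebuild the HTML fragments the diff renderer emits for
# added lines and unchanged context lines, so mustcontain can match them exactly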
255 def cb_line(text):
255 def cb_line(text):
256 return 'cb-addition"></span><span>{}</span>'.format(text)
256 return 'cb-addition"></span><span>{}</span>'.format(text)
257
257
258 def cb_context(text):
258 def cb_context(text):
259 return '<span class="cb-code"><span class="cb-action cb-context">' \
259 return '<span class="cb-code"><span class="cb-action cb-context">' \
260 '</span><span>{}</span></span>'.format(text)
260 '</span><span>{}</span></span>'.format(text)
261
261
262 commit_tests = {
262 commit_tests = {
263 # (commits expected in the response, commits expected to be absent)
263 # (commits expected in the response, commits expected to be absent)
264 1: (['commit-1'], ['commit-2', 'commit-3', 'commit-4']),
264 1: (['commit-1'], ['commit-2', 'commit-3', 'commit-4']),
265 2: (['commit-1', 'commit-2'], ['commit-3', 'commit-4']),
265 2: (['commit-1', 'commit-2'], ['commit-3', 'commit-4']),
266 3: (['commit-1', 'commit-2', 'commit-3'], ['commit-4']),
266 3: (['commit-1', 'commit-2', 'commit-3'], ['commit-4']),
267 4: (['commit-1', 'commit-2', 'commit-3', 'commit-4'], []),
267 4: (['commit-1', 'commit-2', 'commit-3', 'commit-4'], []),
268 }
268 }
269 diff_tests = {
269 diff_tests = {
270 1: (['LINE2'], ['LINE3', 'LINE4', 'LINE5']),
270 1: (['LINE2'], ['LINE3', 'LINE4', 'LINE5']),
271 2: (['LINE2', 'LINE3'], ['LINE4', 'LINE5']),
271 2: (['LINE2', 'LINE3'], ['LINE4', 'LINE5']),
272 3: (['LINE2', 'LINE3', 'LINE4'], ['LINE5']),
272 3: (['LINE2', 'LINE3', 'LINE4'], ['LINE5']),
273 4: (['LINE2', 'LINE3', 'LINE4', 'LINE5'], []),
273 4: (['LINE2', 'LINE3', 'LINE4', 'LINE5'], []),
274 }
274 }
275 for idx, ver in enumerate(versions, 1):
275 for idx, ver in enumerate(versions, 1):
276
276
277 response = self.app.get(
277 response = self.app.get(
278 route_path('pullrequest_show',
278 route_path('pullrequest_show',
279 repo_name=target_repo_name,
279 repo_name=target_repo_name,
280 pull_request_id=pull_request_id,
280 pull_request_id=pull_request_id,
281 params={'version': ver}))
281 params={'version': ver}))
282
282
283 response.mustcontain(no=['rhodecode diff rendering error'])
283 response.mustcontain(no=['rhodecode diff rendering error'])
284 response.mustcontain('Showing changes at v{}'.format(idx))
284 response.mustcontain('Showing changes at v{}'.format(idx))
285
285
286 yes, no = commit_tests[idx]
286 yes, no = commit_tests[idx]
287 for y in yes:
287 for y in yes:
288 response.mustcontain(y)
288 response.mustcontain(y)
289 for n in no:
289 for n in no:
290 response.mustcontain(no=n)
290 response.mustcontain(no=n)
291
291
292 yes, no = diff_tests[idx]
292 yes, no = diff_tests[idx]
293 for y in yes:
293 for y in yes:
294 response.mustcontain(cb_line(y))
294 response.mustcontain(cb_line(y))
295 for n in no:
295 for n in no:
296 response.mustcontain(no=n)
296 response.mustcontain(no=n)
297
297
298 # show diff between versions
298 # show diff between versions
299 diff_compare_tests = {
299 diff_compare_tests = {
300 1: (['LINE3'], ['LINE1', 'LINE2']),
300 1: (['LINE3'], ['LINE1', 'LINE2']),
301 2: (['LINE3', 'LINE4'], ['LINE1', 'LINE2']),
301 2: (['LINE3', 'LINE4'], ['LINE1', 'LINE2']),
302 3: (['LINE3', 'LINE4', 'LINE5'], ['LINE1', 'LINE2']),
302 3: (['LINE3', 'LINE4', 'LINE5'], ['LINE1', 'LINE2']),
303 }
303 }
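# each comparison is taken from v1 (versions[0]) to the next version; the last
# iteration uses the special 'latest' alias instead of a stored version id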
304 for idx, ver in enumerate(versions, 1):
304 for idx, ver in enumerate(versions, 1):
305 adds, context = diff_compare_tests[idx]
305 adds, context = diff_compare_tests[idx]
306
306
307 to_ver = ver + 1
307 to_ver = ver + 1
308 if idx == 3:
308 if idx == 3:
309 to_ver = 'latest'
309 to_ver = 'latest'
310
310
311 response = self.app.get(
311 response = self.app.get(
312 route_path('pullrequest_show',
312 route_path('pullrequest_show',
313 repo_name=target_repo_name,
313 repo_name=target_repo_name,
314 pull_request_id=pull_request_id,
314 pull_request_id=pull_request_id,
315 params={'from_version': versions[0], 'version': to_ver}))
315 params={'from_version': versions[0], 'version': to_ver}))
316
316
317 response.mustcontain(no=['rhodecode diff rendering error'])
317 response.mustcontain(no=['rhodecode diff rendering error'])
318
318
319 for a in adds:
319 for a in adds:
320 response.mustcontain(cb_line(a))
320 response.mustcontain(cb_line(a))
321 for c in context:
321 for c in context:
322 response.mustcontain(cb_context(c))
322 response.mustcontain(cb_context(c))
323
323
324 # test version v2 -> v3
324 # test version v2 -> v3
325 response = self.app.get(
325 response = self.app.get(
326 route_path('pullrequest_show',
326 route_path('pullrequest_show',
327 repo_name=target_repo_name,
327 repo_name=target_repo_name,
328 pull_request_id=pull_request_id,
328 pull_request_id=pull_request_id,
329 params={'from_version': versions[1], 'version': versions[2]}))
329 params={'from_version': versions[1], 'version': versions[2]}))
330
330
331 response.mustcontain(cb_context('LINE1'))
331 response.mustcontain(cb_context('LINE1'))
332 response.mustcontain(cb_context('LINE2'))
332 response.mustcontain(cb_context('LINE2'))
333 response.mustcontain(cb_context('LINE3'))
333 response.mustcontain(cb_context('LINE3'))
334 response.mustcontain(cb_line('LINE4'))
334 response.mustcontain(cb_line('LINE4'))
335
335
336 def test_close_status_visibility(self, pr_util, user_util, csrf_token):
336 def test_close_status_visibility(self, pr_util, user_util, csrf_token):
337 # Logout
337 # Logout
338 response = self.app.post(
338 response = self.app.post(
339 h.route_path('logout'),
339 h.route_path('logout'),
340 params={'csrf_token': csrf_token})
340 params={'csrf_token': csrf_token})
341 # Login as regular user
341 # Login as regular user
342 response = self.app.post(h.route_path('login'),
342 response = self.app.post(h.route_path('login'),
343 {'username': TEST_USER_REGULAR_LOGIN,
343 {'username': TEST_USER_REGULAR_LOGIN,
344 'password': 'test12'})
344 'password': 'test12'})
345
345
346 pull_request = pr_util.create_pull_request(
346 pull_request = pr_util.create_pull_request(
347 author=TEST_USER_REGULAR_LOGIN)
347 author=TEST_USER_REGULAR_LOGIN)
348
348
349 response = self.app.get(route_path(
349 response = self.app.get(route_path(
350 'pullrequest_show',
350 'pullrequest_show',
351 repo_name=pull_request.target_repo.scm_instance().name,
351 repo_name=pull_request.target_repo.scm_instance().name,
352 pull_request_id=pull_request.pull_request_id))
352 pull_request_id=pull_request.pull_request_id))
353
353
354 response.mustcontain('Server-side pull request merging is disabled.')
354 response.mustcontain('Server-side pull request merging is disabled.')
355
355
356 assert_response = response.assert_response()
356 assert_response = response.assert_response()
357 # for regular user without a merge permissions, we don't see it
357 # for regular user without a merge permissions, we don't see it
358 assert_response.no_element_exists('#close-pull-request-action')
358 assert_response.no_element_exists('#close-pull-request-action')
359
359
360 user_util.grant_user_permission_to_repo(
360 user_util.grant_user_permission_to_repo(
361 pull_request.target_repo,
361 pull_request.target_repo,
362 UserModel().get_by_username(TEST_USER_REGULAR_LOGIN),
362 UserModel().get_by_username(TEST_USER_REGULAR_LOGIN),
363 'repository.write')
363 'repository.write')
364 response = self.app.get(route_path(
364 response = self.app.get(route_path(
365 'pullrequest_show',
365 'pullrequest_show',
366 repo_name=pull_request.target_repo.scm_instance().name,
366 repo_name=pull_request.target_repo.scm_instance().name,
367 pull_request_id=pull_request.pull_request_id))
367 pull_request_id=pull_request.pull_request_id))
368
368
369 response.mustcontain('Server-side pull request merging is disabled.')
369 response.mustcontain('Server-side pull request merging is disabled.')
370
370
371 assert_response = response.assert_response()
371 assert_response = response.assert_response()
372 # now regular user has a merge permissions, we have CLOSE button
372 # now regular user has a merge permissions, we have CLOSE button
373 assert_response.one_element_exists('#close-pull-request-action')
373 assert_response.one_element_exists('#close-pull-request-action')
374
374
375 def test_show_invalid_commit_id(self, pr_util):
375 def test_show_invalid_commit_id(self, pr_util):
376 # Simulating invalid revisions which will cause a lookup error
376 # Simulating invalid revisions which will cause a lookup error
377 pull_request = pr_util.create_pull_request()
377 pull_request = pr_util.create_pull_request()
378 pull_request.revisions = ['invalid']
378 pull_request.revisions = ['invalid']
379 Session().add(pull_request)
379 Session().add(pull_request)
380 Session().commit()
380 Session().commit()
381
381
382 response = self.app.get(route_path(
382 response = self.app.get(route_path(
383 'pullrequest_show',
383 'pullrequest_show',
384 repo_name=pull_request.target_repo.scm_instance().name,
384 repo_name=pull_request.target_repo.scm_instance().name,
385 pull_request_id=pull_request.pull_request_id))
385 pull_request_id=pull_request.pull_request_id))
386
386
387 for commit_id in pull_request.revisions:
387 for commit_id in pull_request.revisions:
388 response.mustcontain(commit_id)
388 response.mustcontain(commit_id)
389
389
390 def test_show_invalid_source_reference(self, pr_util):
390 def test_show_invalid_source_reference(self, pr_util):
391 pull_request = pr_util.create_pull_request()
391 pull_request = pr_util.create_pull_request()
392 pull_request.source_ref = 'branch:b:invalid'
392 pull_request.source_ref = 'branch:b:invalid'
393 Session().add(pull_request)
393 Session().add(pull_request)
394 Session().commit()
394 Session().commit()
395
395
396 self.app.get(route_path(
396 self.app.get(route_path(
397 'pullrequest_show',
397 'pullrequest_show',
398 repo_name=pull_request.target_repo.scm_instance().name,
398 repo_name=pull_request.target_repo.scm_instance().name,
399 pull_request_id=pull_request.pull_request_id))
399 pull_request_id=pull_request.pull_request_id))
400
400
401 def test_edit_title_description(self, pr_util, csrf_token):
401 def test_edit_title_description(self, pr_util, csrf_token):
402 pull_request = pr_util.create_pull_request()
402 pull_request = pr_util.create_pull_request()
403 pull_request_id = pull_request.pull_request_id
403 pull_request_id = pull_request.pull_request_id
404
404
405 response = self.app.post(
405 response = self.app.post(
406 route_path('pullrequest_update',
406 route_path('pullrequest_update',
407 repo_name=pull_request.target_repo.repo_name,
407 repo_name=pull_request.target_repo.repo_name,
408 pull_request_id=pull_request_id),
408 pull_request_id=pull_request_id),
409 params={
409 params={
410 'edit_pull_request': 'true',
410 'edit_pull_request': 'true',
411 'title': 'New title',
411 'title': 'New title',
412 'description': 'New description',
412 'description': 'New description',
413 'csrf_token': csrf_token})
413 'csrf_token': csrf_token})
414
414
415 assert_session_flash(
415 assert_session_flash(
416 response, u'Pull request title & description updated.',
416 response, u'Pull request title & description updated.',
417 category='success')
417 category='success')
418
418
419 pull_request = PullRequest.get(pull_request_id)
419 pull_request = PullRequest.get(pull_request_id)
420 assert pull_request.title == 'New title'
420 assert pull_request.title == 'New title'
421 assert pull_request.description == 'New description'
421 assert pull_request.description == 'New description'
422
422
423 def test_edit_title_description_closed(self, pr_util, csrf_token):
423 def test_edit_title_description_closed(self, pr_util, csrf_token):
424 pull_request = pr_util.create_pull_request()
424 pull_request = pr_util.create_pull_request()
425 pull_request_id = pull_request.pull_request_id
425 pull_request_id = pull_request.pull_request_id
426 repo_name = pull_request.target_repo.repo_name
426 repo_name = pull_request.target_repo.repo_name
427 pr_util.close()
427 pr_util.close()
428
428
429 response = self.app.post(
429 response = self.app.post(
430 route_path('pullrequest_update',
430 route_path('pullrequest_update',
431 repo_name=repo_name, pull_request_id=pull_request_id),
431 repo_name=repo_name, pull_request_id=pull_request_id),
432 params={
432 params={
433 'edit_pull_request': 'true',
433 'edit_pull_request': 'true',
434 'title': 'New title',
434 'title': 'New title',
435 'description': 'New description',
435 'description': 'New description',
436 'csrf_token': csrf_token}, status=200)
436 'csrf_token': csrf_token}, status=200)
437 assert_session_flash(
437 assert_session_flash(
438 response, u'Cannot update closed pull requests.',
438 response, u'Cannot update closed pull requests.',
439 category='error')
439 category='error')
440
440
441 def test_update_invalid_source_reference(self, pr_util, csrf_token):
441 def test_update_invalid_source_reference(self, pr_util, csrf_token):
442 from rhodecode.lib.vcs.backends.base import UpdateFailureReason
442 from rhodecode.lib.vcs.backends.base import UpdateFailureReason
443
443
444 pull_request = pr_util.create_pull_request()
444 pull_request = pr_util.create_pull_request()
445 pull_request.source_ref = 'branch:invalid-branch:invalid-commit-id'
445 pull_request.source_ref = 'branch:invalid-branch:invalid-commit-id'
446 Session().add(pull_request)
446 Session().add(pull_request)
447 Session().commit()
447 Session().commit()
448
448
449 pull_request_id = pull_request.pull_request_id
449 pull_request_id = pull_request.pull_request_id
450
450
451 response = self.app.post(
451 response = self.app.post(
452 route_path('pullrequest_update',
452 route_path('pullrequest_update',
453 repo_name=pull_request.target_repo.repo_name,
453 repo_name=pull_request.target_repo.repo_name,
454 pull_request_id=pull_request_id),
454 pull_request_id=pull_request_id),
455 params={'update_commits': 'true', 'csrf_token': csrf_token})
455 params={'update_commits': 'true', 'csrf_token': csrf_token})
456
456
457 expected_msg = str(PullRequestModel.UPDATE_STATUS_MESSAGES[
457 expected_msg = str(PullRequestModel.UPDATE_STATUS_MESSAGES[
458 UpdateFailureReason.MISSING_SOURCE_REF])
458 UpdateFailureReason.MISSING_SOURCE_REF])
459 assert_session_flash(response, expected_msg, category='error')
459 assert_session_flash(response, expected_msg, category='error')
460
460
461 def test_missing_target_reference(self, pr_util, csrf_token):
461 def test_missing_target_reference(self, pr_util, csrf_token):
462 from rhodecode.lib.vcs.backends.base import MergeFailureReason
462 from rhodecode.lib.vcs.backends.base import MergeFailureReason
463 pull_request = pr_util.create_pull_request(
463 pull_request = pr_util.create_pull_request(
464 approved=True, mergeable=True)
464 approved=True, mergeable=True)
465 unicode_reference = u'branch:invalid-branch:invalid-commit-id'
465 unicode_reference = u'branch:invalid-branch:invalid-commit-id'
466 pull_request.target_ref = unicode_reference
466 pull_request.target_ref = unicode_reference
467 Session().add(pull_request)
467 Session().add(pull_request)
468 Session().commit()
468 Session().commit()
469
469
470 pull_request_id = pull_request.pull_request_id
470 pull_request_id = pull_request.pull_request_id
471 pull_request_url = route_path(
471 pull_request_url = route_path(
472 'pullrequest_show',
472 'pullrequest_show',
473 repo_name=pull_request.target_repo.repo_name,
473 repo_name=pull_request.target_repo.repo_name,
474 pull_request_id=pull_request_id)
474 pull_request_id=pull_request_id)
475
475
476 response = self.app.get(pull_request_url)
476 response = self.app.get(pull_request_url)
477 target_ref_id = 'invalid-branch'
477 target_ref_id = 'invalid-branch'
478 merge_resp = MergeResponse(
478 merge_resp = MergeResponse(
479 True, True, '', MergeFailureReason.MISSING_TARGET_REF,
479 True, True, '', MergeFailureReason.MISSING_TARGET_REF,
480 metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)})
480 metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)})
481 response.assert_response().element_contains(
481 response.assert_response().element_contains(
482 'div[data-role="merge-message"]', merge_resp.merge_status_message)
482 'div[data-role="merge-message"]', merge_resp.merge_status_message)
483
483
484 def test_comment_and_close_pull_request_custom_message_approved(
484 def test_comment_and_close_pull_request_custom_message_approved(
485 self, pr_util, csrf_token, xhr_header):
485 self, pr_util, csrf_token, xhr_header):
486
486
487 pull_request = pr_util.create_pull_request(approved=True)
487 pull_request = pr_util.create_pull_request(approved=True)
488 pull_request_id = pull_request.pull_request_id
488 pull_request_id = pull_request.pull_request_id
489 author = pull_request.user_id
489 author = pull_request.user_id
490 repo = pull_request.target_repo.repo_id
490 repo = pull_request.target_repo.repo_id
491
491
492 self.app.post(
492 self.app.post(
493 route_path('pullrequest_comment_create',
493 route_path('pullrequest_comment_create',
494 repo_name=pull_request.target_repo.scm_instance().name,
494 repo_name=pull_request.target_repo.scm_instance().name,
495 pull_request_id=pull_request_id),
495 pull_request_id=pull_request_id),
496 params={
496 params={
497 'close_pull_request': '1',
497 'close_pull_request': '1',
498 'text': 'Closing a PR',
498 'text': 'Closing a PR',
499 'csrf_token': csrf_token},
499 'csrf_token': csrf_token},
500 extra_environ=xhr_header,)
500 extra_environ=xhr_header,)
501
501
502 journal = UserLog.query()\
502 journal = UserLog.query()\
503 .filter(UserLog.user_id == author)\
503 .filter(UserLog.user_id == author)\
504 .filter(UserLog.repository_id == repo) \
504 .filter(UserLog.repository_id == repo) \
505 .order_by(UserLog.user_log_id.asc()) \
505 .order_by(UserLog.user_log_id.asc()) \
506 .all()
506 .all()
507 assert journal[-1].action == 'repo.pull_request.close'
507 assert journal[-1].action == 'repo.pull_request.close'
508
508
509 pull_request = PullRequest.get(pull_request_id)
509 pull_request = PullRequest.get(pull_request_id)
510 assert pull_request.is_closed()
510 assert pull_request.is_closed()
511
511
512 status = ChangesetStatusModel().get_status(
512 status = ChangesetStatusModel().get_status(
513 pull_request.source_repo, pull_request=pull_request)
513 pull_request.source_repo, pull_request=pull_request)
514 assert status == ChangesetStatus.STATUS_APPROVED
514 assert status == ChangesetStatus.STATUS_APPROVED
515 comments = ChangesetComment().query() \
515 comments = ChangesetComment().query() \
516 .filter(ChangesetComment.pull_request == pull_request) \
516 .filter(ChangesetComment.pull_request == pull_request) \
517 .order_by(ChangesetComment.comment_id.asc())\
517 .order_by(ChangesetComment.comment_id.asc())\
518 .all()
518 .all()
519 assert comments[-1].text == 'Closing a PR'
519 assert comments[-1].text == 'Closing a PR'
520
520
521 def test_comment_force_close_pull_request_rejected(
521 def test_comment_force_close_pull_request_rejected(
522 self, pr_util, csrf_token, xhr_header):
522 self, pr_util, csrf_token, xhr_header):
523 pull_request = pr_util.create_pull_request()
523 pull_request = pr_util.create_pull_request()
524 pull_request_id = pull_request.pull_request_id
524 pull_request_id = pull_request.pull_request_id
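# reviewer tuples appear to be (user_id, reasons, mandatory, role, rules)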
525 PullRequestModel().update_reviewers(
525 PullRequestModel().update_reviewers(
526 pull_request_id, [
526 pull_request_id, [
527 (1, ['reason'], False, 'reviewer', []),
527 (1, ['reason'], False, 'reviewer', []),
528 (2, ['reason2'], False, 'reviewer', [])],
528 (2, ['reason2'], False, 'reviewer', [])],
529 pull_request.author)
529 pull_request.author)
530 author = pull_request.user_id
530 author = pull_request.user_id
531 repo = pull_request.target_repo.repo_id
531 repo = pull_request.target_repo.repo_id
532
532
533 self.app.post(
533 self.app.post(
534 route_path('pullrequest_comment_create',
534 route_path('pullrequest_comment_create',
535 repo_name=pull_request.target_repo.scm_instance().name,
535 repo_name=pull_request.target_repo.scm_instance().name,
536 pull_request_id=pull_request_id),
536 pull_request_id=pull_request_id),
537 params={
537 params={
538 'close_pull_request': '1',
538 'close_pull_request': '1',
539 'csrf_token': csrf_token},
539 'csrf_token': csrf_token},
540 extra_environ=xhr_header)
540 extra_environ=xhr_header)
541
541
542 pull_request = PullRequest.get(pull_request_id)
542 pull_request = PullRequest.get(pull_request_id)
543
543
544 journal = UserLog.query()\
544 journal = UserLog.query()\
545 .filter(UserLog.user_id == author, UserLog.repository_id == repo) \
545 .filter(UserLog.user_id == author, UserLog.repository_id == repo) \
546 .order_by(UserLog.user_log_id.asc()) \
546 .order_by(UserLog.user_log_id.asc()) \
547 .all()
547 .all()
548 assert journal[-1].action == 'repo.pull_request.close'
548 assert journal[-1].action == 'repo.pull_request.close'
549
549
550 # check only the latest status, not the review status
550 # check only the latest status, not the review status
551 status = ChangesetStatusModel().get_status(
551 status = ChangesetStatusModel().get_status(
552 pull_request.source_repo, pull_request=pull_request)
552 pull_request.source_repo, pull_request=pull_request)
553 assert status == ChangesetStatus.STATUS_REJECTED
553 assert status == ChangesetStatus.STATUS_REJECTED
554
554
555 def test_comment_and_close_pull_request(
555 def test_comment_and_close_pull_request(
556 self, pr_util, csrf_token, xhr_header):
556 self, pr_util, csrf_token, xhr_header):
557 pull_request = pr_util.create_pull_request()
557 pull_request = pr_util.create_pull_request()
558 pull_request_id = pull_request.pull_request_id
558 pull_request_id = pull_request.pull_request_id
559
559
560 response = self.app.post(
560 response = self.app.post(
561 route_path('pullrequest_comment_create',
561 route_path('pullrequest_comment_create',
562 repo_name=pull_request.target_repo.scm_instance().name,
562 repo_name=pull_request.target_repo.scm_instance().name,
563 pull_request_id=pull_request.pull_request_id),
563 pull_request_id=pull_request.pull_request_id),
564 params={
564 params={
565 'close_pull_request': 'true',
565 'close_pull_request': 'true',
566 'csrf_token': csrf_token},
566 'csrf_token': csrf_token},
567 extra_environ=xhr_header)
567 extra_environ=xhr_header)
568
568
569 assert response.json
569 assert response.json
570
570
571 pull_request = PullRequest.get(pull_request_id)
571 pull_request = PullRequest.get(pull_request_id)
572 assert pull_request.is_closed()
572 assert pull_request.is_closed()
573
573
574 # check only the latest status, not the review status
574 # check only the latest status, not the review status
575 status = ChangesetStatusModel().get_status(
575 status = ChangesetStatusModel().get_status(
576 pull_request.source_repo, pull_request=pull_request)
576 pull_request.source_repo, pull_request=pull_request)
577 assert status == ChangesetStatus.STATUS_REJECTED
577 assert status == ChangesetStatus.STATUS_REJECTED
578
578
579 def test_comment_and_close_pull_request_try_edit_comment(
579 def test_comment_and_close_pull_request_try_edit_comment(
580 self, pr_util, csrf_token, xhr_header
580 self, pr_util, csrf_token, xhr_header
581 ):
581 ):
582 pull_request = pr_util.create_pull_request()
582 pull_request = pr_util.create_pull_request()
583 pull_request_id = pull_request.pull_request_id
583 pull_request_id = pull_request.pull_request_id
584 target_scm = pull_request.target_repo.scm_instance()
584 target_scm = pull_request.target_repo.scm_instance()
585 target_scm_name = target_scm.name
585 target_scm_name = target_scm.name
586
586
587 response = self.app.post(
587 response = self.app.post(
588 route_path(
588 route_path(
589 'pullrequest_comment_create',
589 'pullrequest_comment_create',
590 repo_name=target_scm_name,
590 repo_name=target_scm_name,
591 pull_request_id=pull_request_id,
591 pull_request_id=pull_request_id,
592 ),
592 ),
593 params={
593 params={
594 'close_pull_request': 'true',
594 'close_pull_request': 'true',
595 'csrf_token': csrf_token,
595 'csrf_token': csrf_token,
596 },
596 },
597 extra_environ=xhr_header)
597 extra_environ=xhr_header)
598
598
599 assert response.json
599 assert response.json
600
600
601 pull_request = PullRequest.get(pull_request_id)
601 pull_request = PullRequest.get(pull_request_id)
602 target_scm = pull_request.target_repo.scm_instance()
602 target_scm = pull_request.target_repo.scm_instance()
603 target_scm_name = target_scm.name
603 target_scm_name = target_scm.name
604 assert pull_request.is_closed()
604 assert pull_request.is_closed()
605
605
606 # check only the latest status, not the review status
606 # check only the latest status, not the review status
607 status = ChangesetStatusModel().get_status(
607 status = ChangesetStatusModel().get_status(
608 pull_request.source_repo, pull_request=pull_request)
608 pull_request.source_repo, pull_request=pull_request)
609 assert status == ChangesetStatus.STATUS_REJECTED
609 assert status == ChangesetStatus.STATUS_REJECTED
610
610
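# the comment-create response appears to be keyed by comment id,
# so the new comment id is taken from the payload keys below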
611 comment_id = response.json.get('comment_id', None)
611 for comment_id in response.json.keys():
612 test_text = 'test'
612 test_text = 'test'
613 response = self.app.post(
613 response = self.app.post(
614 route_path(
614 route_path(
615 'pullrequest_comment_edit',
615 'pullrequest_comment_edit',
616 repo_name=target_scm_name,
616 repo_name=target_scm_name,
617 pull_request_id=pull_request_id,
617 pull_request_id=pull_request_id,
618 comment_id=comment_id,
618 comment_id=comment_id,
619 ),
619 ),
620 extra_environ=xhr_header,
620 extra_environ=xhr_header,
621 params={
621 params={
622 'csrf_token': csrf_token,
622 'csrf_token': csrf_token,
623 'text': test_text,
623 'text': test_text,
624 },
624 },
625 status=403,
625 status=403,
626 )
626 )
627 assert response.status_int == 403
627 assert response.status_int == 403
628
628
629 def test_comment_and_comment_edit(self, pr_util, csrf_token, xhr_header):
629 def test_comment_and_comment_edit(self, pr_util, csrf_token, xhr_header):
630 pull_request = pr_util.create_pull_request()
630 pull_request = pr_util.create_pull_request()
631 target_scm = pull_request.target_repo.scm_instance()
631 target_scm = pull_request.target_repo.scm_instance()
632 target_scm_name = target_scm.name
632 target_scm_name = target_scm.name
633
633
634 response = self.app.post(
634 response = self.app.post(
635 route_path(
635 route_path(
636 'pullrequest_comment_create',
636 'pullrequest_comment_create',
637 repo_name=target_scm_name,
637 repo_name=target_scm_name,
638 pull_request_id=pull_request.pull_request_id),
638 pull_request_id=pull_request.pull_request_id),
639 params={
639 params={
640 'csrf_token': csrf_token,
640 'csrf_token': csrf_token,
641 'text': 'init',
641 'text': 'init',
642 },
642 },
643 extra_environ=xhr_header,
643 extra_environ=xhr_header,
644 )
644 )
645 assert response.json
645 assert response.json
646
646
647 comment_id = response.json.get('comment_id', None)
647 for comment_id in response.json.keys():
648 assert comment_id
648 assert comment_id
649 test_text = 'test'
649 test_text = 'test'
650 self.app.post(
650 self.app.post(
651 route_path(
651 route_path(
652 'pullrequest_comment_edit',
652 'pullrequest_comment_edit',
653 repo_name=target_scm_name,
653 repo_name=target_scm_name,
654 pull_request_id=pull_request.pull_request_id,
654 pull_request_id=pull_request.pull_request_id,
655 comment_id=comment_id,
655 comment_id=comment_id,
656 ),
656 ),
657 extra_environ=xhr_header,
657 extra_environ=xhr_header,
658 params={
658 params={
659 'csrf_token': csrf_token,
659 'csrf_token': csrf_token,
660 'text': test_text,
660 'text': test_text,
661 'version': '0',
661 'version': '0',
662 },
662 },
663
663
664 )
664 )
665 text_from_db = ChangesetComment.query().filter(
665 text_from_db = ChangesetComment.query().filter(
666 ChangesetComment.comment_id == comment_id).first().text
666 ChangesetComment.comment_id == comment_id).first().text
667 assert test_text == text_from_db
667 assert test_text == text_from_db
668
668
669 def test_comment_and_comment_edit_no_change(self, pr_util, csrf_token, xhr_header):
669 def test_comment_and_comment_edit_no_change(self, pr_util, csrf_token, xhr_header):
670 pull_request = pr_util.create_pull_request()
670 pull_request = pr_util.create_pull_request()
671 target_scm = pull_request.target_repo.scm_instance()
671 target_scm = pull_request.target_repo.scm_instance()
672 target_scm_name = target_scm.name
672 target_scm_name = target_scm.name
673
673
674 response = self.app.post(
674 response = self.app.post(
675 route_path(
675 route_path(
676 'pullrequest_comment_create',
676 'pullrequest_comment_create',
677 repo_name=target_scm_name,
677 repo_name=target_scm_name,
678 pull_request_id=pull_request.pull_request_id),
678 pull_request_id=pull_request.pull_request_id),
679 params={
679 params={
680 'csrf_token': csrf_token,
680 'csrf_token': csrf_token,
681 'text': 'init',
681 'text': 'init',
682 },
682 },
683 extra_environ=xhr_header,
683 extra_environ=xhr_header,
684 )
684 )
685 assert response.json
685 assert response.json
686
686
687 comment_id = response.json.get('comment_id', None)
687 for comment_id in response.json.keys():
688 assert comment_id
688 test_text = 'init'
689 test_text = 'init'
689 response = self.app.post(
690 response = self.app.post(
690 route_path(
691 route_path(
691 'pullrequest_comment_edit',
692 'pullrequest_comment_edit',
692 repo_name=target_scm_name,
693 repo_name=target_scm_name,
693 pull_request_id=pull_request.pull_request_id,
694 pull_request_id=pull_request.pull_request_id,
694 comment_id=comment_id,
695 comment_id=comment_id,
695 ),
696 ),
696 extra_environ=xhr_header,
697 extra_environ=xhr_header,
697 params={
698 params={
698 'csrf_token': csrf_token,
699 'csrf_token': csrf_token,
699 'text': test_text,
700 'text': test_text,
700 'version': '0',
701 'version': '0',
701 },
702 },
702 status=404,
703 status=404,
704
703
705 )
704 )
706 assert response.status_int == 404
705 assert response.status_int == 404
707
706
708 def test_comment_and_try_edit_already_edited(self, pr_util, csrf_token, xhr_header):
707 def test_comment_and_try_edit_already_edited(self, pr_util, csrf_token, xhr_header):
709 pull_request = pr_util.create_pull_request()
708 pull_request = pr_util.create_pull_request()
710 target_scm = pull_request.target_repo.scm_instance()
709 target_scm = pull_request.target_repo.scm_instance()
711 target_scm_name = target_scm.name
710 target_scm_name = target_scm.name
712
711
713 response = self.app.post(
712 response = self.app.post(
714 route_path(
713 route_path(
715 'pullrequest_comment_create',
714 'pullrequest_comment_create',
716 repo_name=target_scm_name,
715 repo_name=target_scm_name,
717 pull_request_id=pull_request.pull_request_id),
716 pull_request_id=pull_request.pull_request_id),
718 params={
717 params={
719 'csrf_token': csrf_token,
718 'csrf_token': csrf_token,
720 'text': 'init',
719 'text': 'init',
721 },
720 },
722 extra_environ=xhr_header,
721 extra_environ=xhr_header,
723 )
722 )
724 assert response.json
723 assert response.json
725 comment_id = response.json.get('comment_id', None)
724 for comment_id in response.json.keys():
726 assert comment_id
725 test_text = 'test'
727
726 self.app.post(
728 test_text = 'test'
727 route_path(
729 self.app.post(
728 'pullrequest_comment_edit',
730 route_path(
729 repo_name=target_scm_name,
731 'pullrequest_comment_edit',
730 pull_request_id=pull_request.pull_request_id,
732 repo_name=target_scm_name,
731 comment_id=comment_id,
733 pull_request_id=pull_request.pull_request_id,
732 ),
734 comment_id=comment_id,
733 extra_environ=xhr_header,
735 ),
734 params={
736 extra_environ=xhr_header,
735 'csrf_token': csrf_token,
737 params={
736 'text': test_text,
738 'csrf_token': csrf_token,
737 'version': '0',
739 'text': test_text,
738 },
740 'version': '0',
741 },
742
739
743 )
740 )
744 test_text_v2 = 'test_v2'
741 test_text_v2 = 'test_v2'
745 response = self.app.post(
742 response = self.app.post(
746 route_path(
743 route_path(
747 'pullrequest_comment_edit',
744 'pullrequest_comment_edit',
748 repo_name=target_scm_name,
745 repo_name=target_scm_name,
749 pull_request_id=pull_request.pull_request_id,
746 pull_request_id=pull_request.pull_request_id,
750 comment_id=comment_id,
747 comment_id=comment_id,
751 ),
748 ),
752 extra_environ=xhr_header,
749 extra_environ=xhr_header,
753 params={
750 params={
754 'csrf_token': csrf_token,
751 'csrf_token': csrf_token,
755 'text': test_text_v2,
752 'text': test_text_v2,
756 'version': '0',
753 'version': '0',
757 },
754 },
758 status=409,
755 status=409,
759 )
756 )
760 assert response.status_int == 409
757 assert response.status_int == 409
761
758
762 text_from_db = ChangesetComment.query().filter(
759 text_from_db = ChangesetComment.query().filter(
763 ChangesetComment.comment_id == comment_id).first().text
760 ChangesetComment.comment_id == comment_id).first().text
764
761
765 assert test_text == text_from_db
762 assert test_text == text_from_db
766 assert test_text_v2 != text_from_db
763 assert test_text_v2 != text_from_db
767
764
768 def test_comment_and_comment_edit_permissions_forbidden(
765 def test_comment_and_comment_edit_permissions_forbidden(
769 self, autologin_regular_user, user_regular, user_admin, pr_util,
766 self, autologin_regular_user, user_regular, user_admin, pr_util,
770 csrf_token, xhr_header):
767 csrf_token, xhr_header):
771 pull_request = pr_util.create_pull_request(
768 pull_request = pr_util.create_pull_request(
772 author=user_admin.username, enable_notifications=False)
769 author=user_admin.username, enable_notifications=False)
773 comment = CommentsModel().create(
770 comment = CommentsModel().create(
774 text='test',
771 text='test',
775 repo=pull_request.target_repo.scm_instance().name,
772 repo=pull_request.target_repo.scm_instance().name,
776 user=user_admin,
773 user=user_admin,
777 pull_request=pull_request,
774 pull_request=pull_request,
778 )
775 )
779 response = self.app.post(
776 response = self.app.post(
780 route_path(
777 route_path(
781 'pullrequest_comment_edit',
778 'pullrequest_comment_edit',
782 repo_name=pull_request.target_repo.scm_instance().name,
779 repo_name=pull_request.target_repo.scm_instance().name,
783 pull_request_id=pull_request.pull_request_id,
780 pull_request_id=pull_request.pull_request_id,
784 comment_id=comment.comment_id,
781 comment_id=comment.comment_id,
785 ),
782 ),
786 extra_environ=xhr_header,
783 extra_environ=xhr_header,
787 params={
784 params={
788 'csrf_token': csrf_token,
785 'csrf_token': csrf_token,
789 'text': 'test_text',
786 'text': 'test_text',
790 },
787 },
791 status=403,
788 status=403,
792 )
789 )
793 assert response.status_int == 403
790 assert response.status_int == 403
794
791
795 def test_create_pull_request(self, backend, csrf_token):
792 def test_create_pull_request(self, backend, csrf_token):
796 commits = [
793 commits = [
797 {'message': 'ancestor'},
794 {'message': 'ancestor'},
798 {'message': 'change'},
795 {'message': 'change'},
799 {'message': 'change2'},
796 {'message': 'change2'},
800 ]
797 ]
801 commit_ids = backend.create_master_repo(commits)
798 commit_ids = backend.create_master_repo(commits)
802 target = backend.create_repo(heads=['ancestor'])
799 target = backend.create_repo(heads=['ancestor'])
803 source = backend.create_repo(heads=['change2'])
800 source = backend.create_repo(heads=['change2'])
804
801
805 response = self.app.post(
802 response = self.app.post(
806 route_path('pullrequest_create', repo_name=source.repo_name),
803 route_path('pullrequest_create', repo_name=source.repo_name),
807 [
804 [
808 ('source_repo', source.repo_name),
805 ('source_repo', source.repo_name),
809 ('source_ref', 'branch:default:' + commit_ids['change2']),
806 ('source_ref', 'branch:default:' + commit_ids['change2']),
810 ('target_repo', target.repo_name),
807 ('target_repo', target.repo_name),
811 ('target_ref', 'branch:default:' + commit_ids['ancestor']),
808 ('target_ref', 'branch:default:' + commit_ids['ancestor']),
812 ('common_ancestor', commit_ids['ancestor']),
809 ('common_ancestor', commit_ids['ancestor']),
813 ('pullrequest_title', 'Title'),
810 ('pullrequest_title', 'Title'),
814 ('pullrequest_desc', 'Description'),
811 ('pullrequest_desc', 'Description'),
815 ('description_renderer', 'markdown'),
812 ('description_renderer', 'markdown'),
816 ('__start__', 'review_members:sequence'),
813 ('__start__', 'review_members:sequence'),
817 ('__start__', 'reviewer:mapping'),
814 ('__start__', 'reviewer:mapping'),
818 ('user_id', '1'),
815 ('user_id', '1'),
819 ('__start__', 'reasons:sequence'),
816 ('__start__', 'reasons:sequence'),
820 ('reason', 'Some reason'),
817 ('reason', 'Some reason'),
821 ('__end__', 'reasons:sequence'),
818 ('__end__', 'reasons:sequence'),
822 ('__start__', 'rules:sequence'),
819 ('__start__', 'rules:sequence'),
823 ('__end__', 'rules:sequence'),
820 ('__end__', 'rules:sequence'),
824 ('mandatory', 'False'),
821 ('mandatory', 'False'),
825 ('__end__', 'reviewer:mapping'),
822 ('__end__', 'reviewer:mapping'),
826 ('__end__', 'review_members:sequence'),
823 ('__end__', 'review_members:sequence'),
827 ('__start__', 'revisions:sequence'),
824 ('__start__', 'revisions:sequence'),
828 ('revisions', commit_ids['change']),
825 ('revisions', commit_ids['change']),
829 ('revisions', commit_ids['change2']),
826 ('revisions', commit_ids['change2']),
830 ('__end__', 'revisions:sequence'),
827 ('__end__', 'revisions:sequence'),
831 ('user', ''),
828 ('user', ''),
832 ('csrf_token', csrf_token),
829 ('csrf_token', csrf_token),
833 ],
830 ],
834 status=302)
831 status=302)
835
832
836 location = response.headers['Location']
833 location = response.headers['Location']
837 pull_request_id = location.rsplit('/', 1)[1]
834 pull_request_id = location.rsplit('/', 1)[1]
838 assert pull_request_id != 'new'
835 assert pull_request_id != 'new'
839 pull_request = PullRequest.get(int(pull_request_id))
836 pull_request = PullRequest.get(int(pull_request_id))
840
837
841 # check that we have now both revisions
838 # check that we have now both revisions
842 assert pull_request.revisions == [commit_ids['change2'], commit_ids['change']]
839 assert pull_request.revisions == [commit_ids['change2'], commit_ids['change']]
843 assert pull_request.source_ref == 'branch:default:' + commit_ids['change2']
840 assert pull_request.source_ref == 'branch:default:' + commit_ids['change2']
844 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
841 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
845 assert pull_request.target_ref == expected_target_ref
842 assert pull_request.target_ref == expected_target_ref
846
843
847 def test_reviewer_notifications(self, backend, csrf_token):
844 def test_reviewer_notifications(self, backend, csrf_token):
848 # We have to use app.post for this test so that the notifications
845 # We have to use app.post for this test so that the notifications
849 # are created properly along with the new PR
846 # are created properly along with the new PR
850 commits = [
847 commits = [
851 {'message': 'ancestor',
848 {'message': 'ancestor',
852 'added': [FileNode('file_A', content='content_of_ancestor')]},
849 'added': [FileNode('file_A', content='content_of_ancestor')]},
853 {'message': 'change',
850 {'message': 'change',
854 'added': [FileNode('file_a', content='content_of_change')]},
851 'added': [FileNode('file_a', content='content_of_change')]},
855 {'message': 'change-child'},
852 {'message': 'change-child'},
856 {'message': 'ancestor-child', 'parents': ['ancestor'],
853 {'message': 'ancestor-child', 'parents': ['ancestor'],
857 'added': [
854 'added': [
858 FileNode('file_B', content='content_of_ancestor_child')]},
855 FileNode('file_B', content='content_of_ancestor_child')]},
859 {'message': 'ancestor-child-2'},
856 {'message': 'ancestor-child-2'},
860 ]
857 ]
861 commit_ids = backend.create_master_repo(commits)
858 commit_ids = backend.create_master_repo(commits)
862 target = backend.create_repo(heads=['ancestor-child'])
859 target = backend.create_repo(heads=['ancestor-child'])
863 source = backend.create_repo(heads=['change'])
860 source = backend.create_repo(heads=['change'])
864
861
865 response = self.app.post(
862 response = self.app.post(
866 route_path('pullrequest_create', repo_name=source.repo_name),
863 route_path('pullrequest_create', repo_name=source.repo_name),
867 [
864 [
868 ('source_repo', source.repo_name),
865 ('source_repo', source.repo_name),
869 ('source_ref', 'branch:default:' + commit_ids['change']),
866 ('source_ref', 'branch:default:' + commit_ids['change']),
870 ('target_repo', target.repo_name),
867 ('target_repo', target.repo_name),
871 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
868 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
872 ('common_ancestor', commit_ids['ancestor']),
869 ('common_ancestor', commit_ids['ancestor']),
873 ('pullrequest_title', 'Title'),
870 ('pullrequest_title', 'Title'),
874 ('pullrequest_desc', 'Description'),
871 ('pullrequest_desc', 'Description'),
875 ('description_renderer', 'markdown'),
872 ('description_renderer', 'markdown'),
876 ('__start__', 'review_members:sequence'),
873 ('__start__', 'review_members:sequence'),
877 ('__start__', 'reviewer:mapping'),
874 ('__start__', 'reviewer:mapping'),
878 ('user_id', '2'),
875 ('user_id', '2'),
879 ('__start__', 'reasons:sequence'),
876 ('__start__', 'reasons:sequence'),
880 ('reason', 'Some reason'),
877 ('reason', 'Some reason'),
881 ('__end__', 'reasons:sequence'),
878 ('__end__', 'reasons:sequence'),
882 ('__start__', 'rules:sequence'),
879 ('__start__', 'rules:sequence'),
883 ('__end__', 'rules:sequence'),
880 ('__end__', 'rules:sequence'),
884 ('mandatory', 'False'),
881 ('mandatory', 'False'),
885 ('__end__', 'reviewer:mapping'),
882 ('__end__', 'reviewer:mapping'),
886 ('__end__', 'review_members:sequence'),
883 ('__end__', 'review_members:sequence'),
887 ('__start__', 'revisions:sequence'),
884 ('__start__', 'revisions:sequence'),
888 ('revisions', commit_ids['change']),
885 ('revisions', commit_ids['change']),
889 ('__end__', 'revisions:sequence'),
886 ('__end__', 'revisions:sequence'),
890 ('user', ''),
887 ('user', ''),
891 ('csrf_token', csrf_token),
888 ('csrf_token', csrf_token),
892 ],
889 ],
893 status=302)
890 status=302)
894
891
895 location = response.headers['Location']
892 location = response.headers['Location']
896
893
897 pull_request_id = location.rsplit('/', 1)[1]
894 pull_request_id = location.rsplit('/', 1)[1]
898 assert pull_request_id != 'new'
895 assert pull_request_id != 'new'
899 pull_request = PullRequest.get(int(pull_request_id))
896 pull_request = PullRequest.get(int(pull_request_id))
900
897
901 # Check that a notification was made
898 # Check that a notification was made
902 notifications = Notification.query()\
899 notifications = Notification.query()\
903 .filter(Notification.created_by == pull_request.author.user_id,
900 .filter(Notification.created_by == pull_request.author.user_id,
904 Notification.type_ == Notification.TYPE_PULL_REQUEST,
901 Notification.type_ == Notification.TYPE_PULL_REQUEST,
905 Notification.subject.contains(
902 Notification.subject.contains(
906 "requested a pull request review. !%s" % pull_request_id))
903 "requested a pull request review. !%s" % pull_request_id))
907 assert len(notifications.all()) == 1
904 assert len(notifications.all()) == 1
908
905
909 # Change reviewers and check that a notification was made
906 # Change reviewers and check that a notification was made
910 PullRequestModel().update_reviewers(
907 PullRequestModel().update_reviewers(
911 pull_request.pull_request_id, [
908 pull_request.pull_request_id, [
912 (1, [], False, 'reviewer', [])
909 (1, [], False, 'reviewer', [])
913 ],
910 ],
914 pull_request.author)
911 pull_request.author)
915 assert len(notifications.all()) == 2
912 assert len(notifications.all()) == 2
916
913
917 def test_create_pull_request_stores_ancestor_commit_id(self, backend, csrf_token):
914 def test_create_pull_request_stores_ancestor_commit_id(self, backend, csrf_token):
918 commits = [
915 commits = [
919 {'message': 'ancestor',
916 {'message': 'ancestor',
920 'added': [FileNode('file_A', content='content_of_ancestor')]},
917 'added': [FileNode('file_A', content='content_of_ancestor')]},
921 {'message': 'change',
918 {'message': 'change',
922 'added': [FileNode('file_a', content='content_of_change')]},
919 'added': [FileNode('file_a', content='content_of_change')]},
923 {'message': 'change-child'},
920 {'message': 'change-child'},
924 {'message': 'ancestor-child', 'parents': ['ancestor'],
921 {'message': 'ancestor-child', 'parents': ['ancestor'],
925 'added': [
922 'added': [
926 FileNode('file_B', content='content_of_ancestor_child')]},
923 FileNode('file_B', content='content_of_ancestor_child')]},
927 {'message': 'ancestor-child-2'},
924 {'message': 'ancestor-child-2'},
928 ]
925 ]
929 commit_ids = backend.create_master_repo(commits)
926 commit_ids = backend.create_master_repo(commits)
930 target = backend.create_repo(heads=['ancestor-child'])
927 target = backend.create_repo(heads=['ancestor-child'])
931 source = backend.create_repo(heads=['change'])
928 source = backend.create_repo(heads=['change'])
932
929
933 response = self.app.post(
930 response = self.app.post(
934 route_path('pullrequest_create', repo_name=source.repo_name),
931 route_path('pullrequest_create', repo_name=source.repo_name),
935 [
932 [
936 ('source_repo', source.repo_name),
933 ('source_repo', source.repo_name),
937 ('source_ref', 'branch:default:' + commit_ids['change']),
934 ('source_ref', 'branch:default:' + commit_ids['change']),
938 ('target_repo', target.repo_name),
935 ('target_repo', target.repo_name),
939 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
936 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
940 ('common_ancestor', commit_ids['ancestor']),
937 ('common_ancestor', commit_ids['ancestor']),
941 ('pullrequest_title', 'Title'),
938 ('pullrequest_title', 'Title'),
942 ('pullrequest_desc', 'Description'),
939 ('pullrequest_desc', 'Description'),
943 ('description_renderer', 'markdown'),
940 ('description_renderer', 'markdown'),
944 ('__start__', 'review_members:sequence'),
941 ('__start__', 'review_members:sequence'),
945 ('__start__', 'reviewer:mapping'),
942 ('__start__', 'reviewer:mapping'),
946 ('user_id', '1'),
943 ('user_id', '1'),
947 ('__start__', 'reasons:sequence'),
944 ('__start__', 'reasons:sequence'),
948 ('reason', 'Some reason'),
945 ('reason', 'Some reason'),
949 ('__end__', 'reasons:sequence'),
946 ('__end__', 'reasons:sequence'),
950 ('__start__', 'rules:sequence'),
947 ('__start__', 'rules:sequence'),
951 ('__end__', 'rules:sequence'),
948 ('__end__', 'rules:sequence'),
952 ('mandatory', 'False'),
949 ('mandatory', 'False'),
953 ('__end__', 'reviewer:mapping'),
950 ('__end__', 'reviewer:mapping'),
954 ('__end__', 'review_members:sequence'),
951 ('__end__', 'review_members:sequence'),
955 ('__start__', 'revisions:sequence'),
952 ('__start__', 'revisions:sequence'),
956 ('revisions', commit_ids['change']),
953 ('revisions', commit_ids['change']),
957 ('__end__', 'revisions:sequence'),
954 ('__end__', 'revisions:sequence'),
958 ('user', ''),
955 ('user', ''),
959 ('csrf_token', csrf_token),
956 ('csrf_token', csrf_token),
960 ],
957 ],
961 status=302)
958 status=302)
962
959
963 location = response.headers['Location']
960 location = response.headers['Location']
964
961
965 pull_request_id = location.rsplit('/', 1)[1]
962 pull_request_id = location.rsplit('/', 1)[1]
966 assert pull_request_id != 'new'
963 assert pull_request_id != 'new'
967 pull_request = PullRequest.get(int(pull_request_id))
964 pull_request = PullRequest.get(int(pull_request_id))
968
965
969 # target_ref has to point to the ancestor's commit_id in order to
966 # target_ref has to point to the ancestor's commit_id in order to
970 # show the correct diff
967 # show the correct diff
971 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
968 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
972 assert pull_request.target_ref == expected_target_ref
969 assert pull_request.target_ref == expected_target_ref
973
970
974 # Check generated diff contents
971 # Check generated diff contents
975 response = response.follow()
972 response = response.follow()
976 response.mustcontain(no=['content_of_ancestor'])
973 response.mustcontain(no=['content_of_ancestor'])
977 response.mustcontain(no=['content_of_ancestor-child'])
974 response.mustcontain(no=['content_of_ancestor-child'])
978 response.mustcontain('content_of_change')
975 response.mustcontain('content_of_change')
979
976
980 def test_merge_pull_request_enabled(self, pr_util, csrf_token):
977 def test_merge_pull_request_enabled(self, pr_util, csrf_token):
981 # Clear any previous calls to rcextensions
978 # Clear any previous calls to rcextensions
982 rhodecode.EXTENSIONS.calls.clear()
979 rhodecode.EXTENSIONS.calls.clear()
983
980
984 pull_request = pr_util.create_pull_request(
981 pull_request = pr_util.create_pull_request(
985 approved=True, mergeable=True)
982 approved=True, mergeable=True)
986 pull_request_id = pull_request.pull_request_id
983 pull_request_id = pull_request.pull_request_id
987 repo_name = pull_request.target_repo.scm_instance().name
984 repo_name = pull_request.target_repo.scm_instance().name
988
985
989 url = route_path('pullrequest_merge',
986 url = route_path('pullrequest_merge',
990 repo_name=repo_name,
987 repo_name=repo_name,
991 pull_request_id=pull_request_id)
988 pull_request_id=pull_request_id)
992 response = self.app.post(url, params={'csrf_token': csrf_token}).follow()
989 response = self.app.post(url, params={'csrf_token': csrf_token}).follow()
993
990
994 pull_request = PullRequest.get(pull_request_id)
991 pull_request = PullRequest.get(pull_request_id)
995
992
996 assert response.status_int == 200
993 assert response.status_int == 200
997 assert pull_request.is_closed()
994 assert pull_request.is_closed()
998 assert_pull_request_status(
995 assert_pull_request_status(
999 pull_request, ChangesetStatus.STATUS_APPROVED)
996 pull_request, ChangesetStatus.STATUS_APPROVED)
1000
997
1001 # Check the relevant log entries were added
998 # Check the relevant log entries were added
1002 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3)
999 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3)
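# newest-first query: the three most recent journal entries should be
# close, merge and comment.create, in that order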
1003 actions = [log.action for log in user_logs]
1000 actions = [log.action for log in user_logs]
1004 pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request)
1001 pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request)
1005 expected_actions = [
1002 expected_actions = [
1006 u'repo.pull_request.close',
1003 u'repo.pull_request.close',
1007 u'repo.pull_request.merge',
1004 u'repo.pull_request.merge',
1008 u'repo.pull_request.comment.create'
1005 u'repo.pull_request.comment.create'
1009 ]
1006 ]
1010 assert actions == expected_actions
1007 assert actions == expected_actions
1011
1008
1012 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4)
1009 user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4)
1013 actions = [log for log in user_logs]
1010 actions = [log for log in user_logs]
1014 assert actions[-1].action == 'user.push'
1011 assert actions[-1].action == 'user.push'
1015 assert actions[-1].action_data['commit_ids'] == pr_commit_ids
1012 assert actions[-1].action_data['commit_ids'] == pr_commit_ids
1016
1013
1017 # Check post_push rcextension was really executed
1014 # Check post_push rcextension was really executed
1018 push_calls = rhodecode.EXTENSIONS.calls['_push_hook']
1015 push_calls = rhodecode.EXTENSIONS.calls['_push_hook']
1019 assert len(push_calls) == 1
1016 assert len(push_calls) == 1
1020 unused_last_call_args, last_call_kwargs = push_calls[0]
1017 unused_last_call_args, last_call_kwargs = push_calls[0]
1021 assert last_call_kwargs['action'] == 'push'
1018 assert last_call_kwargs['action'] == 'push'
1022 assert last_call_kwargs['commit_ids'] == pr_commit_ids
1019 assert last_call_kwargs['commit_ids'] == pr_commit_ids
1023
1020
1024 def test_merge_pull_request_disabled(self, pr_util, csrf_token):
1021 def test_merge_pull_request_disabled(self, pr_util, csrf_token):
1025 pull_request = pr_util.create_pull_request(mergeable=False)
1022 pull_request = pr_util.create_pull_request(mergeable=False)
1026 pull_request_id = pull_request.pull_request_id
1023 pull_request_id = pull_request.pull_request_id
1027 pull_request = PullRequest.get(pull_request_id)
1024 pull_request = PullRequest.get(pull_request_id)
1028
1025
1029 response = self.app.post(
1026 response = self.app.post(
1030 route_path('pullrequest_merge',
1027 route_path('pullrequest_merge',
1031 repo_name=pull_request.target_repo.scm_instance().name,
1028 repo_name=pull_request.target_repo.scm_instance().name,
1032 pull_request_id=pull_request.pull_request_id),
1029 pull_request_id=pull_request.pull_request_id),
1033 params={'csrf_token': csrf_token}).follow()
1030 params={'csrf_token': csrf_token}).follow()
1034
1031
1035 assert response.status_int == 200
1032 assert response.status_int == 200
1036 response.mustcontain(
1033 response.mustcontain(
1037 'Merge is not currently possible because of below failed checks.')
1034 'Merge is not currently possible because of below failed checks.')
1038 response.mustcontain('Server-side pull request merging is disabled.')
1035 response.mustcontain('Server-side pull request merging is disabled.')
1039
1036
1040 @pytest.mark.skip_backends('svn')
1037 @pytest.mark.skip_backends('svn')
1041 def test_merge_pull_request_not_approved(self, pr_util, csrf_token):
1038 def test_merge_pull_request_not_approved(self, pr_util, csrf_token):
1042 pull_request = pr_util.create_pull_request(mergeable=True)
1039 pull_request = pr_util.create_pull_request(mergeable=True)
1043 pull_request_id = pull_request.pull_request_id
1040 pull_request_id = pull_request.pull_request_id
1044 repo_name = pull_request.target_repo.scm_instance().name
1041 repo_name = pull_request.target_repo.scm_instance().name
1045
1042
1046 response = self.app.post(
1043 response = self.app.post(
1047 route_path('pullrequest_merge',
1044 route_path('pullrequest_merge',
1048 repo_name=repo_name, pull_request_id=pull_request_id),
1045 repo_name=repo_name, pull_request_id=pull_request_id),
1049 params={'csrf_token': csrf_token}).follow()
1046 params={'csrf_token': csrf_token}).follow()
1050
1047
1051 assert response.status_int == 200
1048 assert response.status_int == 200
1052
1049
1053 response.mustcontain(
1050 response.mustcontain(
1054 'Merge is not currently possible because of below failed checks.')
1051 'Merge is not currently possible because of below failed checks.')
1055 response.mustcontain('Pull request reviewer approval is pending.')
1052 response.mustcontain('Pull request reviewer approval is pending.')
1056
1053
1057 def test_merge_pull_request_renders_failure_reason(
1054 def test_merge_pull_request_renders_failure_reason(
1058 self, user_regular, csrf_token, pr_util):
1055 self, user_regular, csrf_token, pr_util):
1059 pull_request = pr_util.create_pull_request(mergeable=True, approved=True)
1056 pull_request = pr_util.create_pull_request(mergeable=True, approved=True)
1060 pull_request_id = pull_request.pull_request_id
1057 pull_request_id = pull_request.pull_request_id
1061 repo_name = pull_request.target_repo.scm_instance().name
1058 repo_name = pull_request.target_repo.scm_instance().name
1062
1059
1063 merge_resp = MergeResponse(True, False, 'STUB_COMMIT_ID',
1060 merge_resp = MergeResponse(True, False, 'STUB_COMMIT_ID',
1064 MergeFailureReason.PUSH_FAILED,
1061 MergeFailureReason.PUSH_FAILED,
1065 metadata={'target': 'shadow repo',
1062 metadata={'target': 'shadow repo',
1066 'merge_commit': 'xxx'})
1063 'merge_commit': 'xxx'})
1067 model_patcher = mock.patch.multiple(
1064 model_patcher = mock.patch.multiple(
1068 PullRequestModel,
1065 PullRequestModel,
1069 merge_repo=mock.Mock(return_value=merge_resp),
1066 merge_repo=mock.Mock(return_value=merge_resp),
1070 merge_status=mock.Mock(return_value=(None, True, 'WRONG_MESSAGE')))
1067 merge_status=mock.Mock(return_value=(None, True, 'WRONG_MESSAGE')))
1071
1068
1072 with model_patcher:
1069 with model_patcher:
1073 response = self.app.post(
1070 response = self.app.post(
1074 route_path('pullrequest_merge',
1071 route_path('pullrequest_merge',
1075 repo_name=repo_name,
1072 repo_name=repo_name,
1076 pull_request_id=pull_request_id),
1073 pull_request_id=pull_request_id),
1077 params={'csrf_token': csrf_token}, status=302)
1074 params={'csrf_token': csrf_token}, status=302)
1078
1075
1079 merge_resp = MergeResponse(True, True, '', MergeFailureReason.PUSH_FAILED,
1076 merge_resp = MergeResponse(True, True, '', MergeFailureReason.PUSH_FAILED,
1080 metadata={'target': 'shadow repo',
1077 metadata={'target': 'shadow repo',
1081 'merge_commit': 'xxx'})
1078 'merge_commit': 'xxx'})
1082 assert_session_flash(response, merge_resp.merge_status_message)
1079 assert_session_flash(response, merge_resp.merge_status_message)
1083
1080
1084 def test_update_source_revision(self, backend, csrf_token):
1081 def test_update_source_revision(self, backend, csrf_token):
1085 commits = [
1082 commits = [
1086 {'message': 'ancestor'},
1083 {'message': 'ancestor'},
1087 {'message': 'change'},
1084 {'message': 'change'},
1088 {'message': 'change-2'},
1085 {'message': 'change-2'},
1089 ]
1086 ]
1090 commit_ids = backend.create_master_repo(commits)
1087 commit_ids = backend.create_master_repo(commits)
1091 target = backend.create_repo(heads=['ancestor'])
1088 target = backend.create_repo(heads=['ancestor'])
1092 source = backend.create_repo(heads=['change'])
1089 source = backend.create_repo(heads=['change'])
1093
1090
1094 # create pr from a in source to A in target
1091 # create pr from a in source to A in target
1095 pull_request = PullRequest()
1092 pull_request = PullRequest()
1096
1093
1097 pull_request.source_repo = source
1094 pull_request.source_repo = source
1098 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1095 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1099 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1096 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1100
1097
1101 pull_request.target_repo = target
1098 pull_request.target_repo = target
1102 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1099 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1103 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1100 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1104
1101
1105 pull_request.revisions = [commit_ids['change']]
1102 pull_request.revisions = [commit_ids['change']]
1106 pull_request.title = u"Test"
1103 pull_request.title = u"Test"
1107 pull_request.description = u"Description"
1104 pull_request.description = u"Description"
1108 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1105 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1109 pull_request.pull_request_state = PullRequest.STATE_CREATED
1106 pull_request.pull_request_state = PullRequest.STATE_CREATED
1110 Session().add(pull_request)
1107 Session().add(pull_request)
1111 Session().commit()
1108 Session().commit()
1112 pull_request_id = pull_request.pull_request_id
1109 pull_request_id = pull_request.pull_request_id
1113
1110
1114 # source has ancestor - change - change-2
1111 # source has ancestor - change - change-2
1115 backend.pull_heads(source, heads=['change-2'])
1112 backend.pull_heads(source, heads=['change-2'])
1116 target_repo_name = target.repo_name
1113 target_repo_name = target.repo_name
1117
1114
1118 # update PR
1115 # update PR
1119 self.app.post(
1116 self.app.post(
1120 route_path('pullrequest_update',
1117 route_path('pullrequest_update',
1121 repo_name=target_repo_name, pull_request_id=pull_request_id),
1118 repo_name=target_repo_name, pull_request_id=pull_request_id),
1122 params={'update_commits': 'true', 'csrf_token': csrf_token})
1119 params={'update_commits': 'true', 'csrf_token': csrf_token})
1123
1120
1124 response = self.app.get(
1121 response = self.app.get(
1125 route_path('pullrequest_show',
1122 route_path('pullrequest_show',
1126 repo_name=target_repo_name,
1123 repo_name=target_repo_name,
1127 pull_request_id=pull_request.pull_request_id))
1124 pull_request_id=pull_request.pull_request_id))
1128
1125
1129 assert response.status_int == 200
1126 assert response.status_int == 200
1130 response.mustcontain('Pull request updated to')
1127 response.mustcontain('Pull request updated to')
1131 response.mustcontain('with 1 added, 0 removed commits.')
1128 response.mustcontain('with 1 added, 0 removed commits.')
1132
1129
1133 # check that we have now both revisions
1130 # check that we have now both revisions
1134 pull_request = PullRequest.get(pull_request_id)
1131 pull_request = PullRequest.get(pull_request_id)
1135 assert pull_request.revisions == [commit_ids['change-2'], commit_ids['change']]
1132 assert pull_request.revisions == [commit_ids['change-2'], commit_ids['change']]
1136
1133
1137 def test_update_target_revision(self, backend, csrf_token):
1134 def test_update_target_revision(self, backend, csrf_token):
1138 commits = [
1135 commits = [
1139 {'message': 'ancestor'},
1136 {'message': 'ancestor'},
1140 {'message': 'change'},
1137 {'message': 'change'},
1141 {'message': 'ancestor-new', 'parents': ['ancestor']},
1138 {'message': 'ancestor-new', 'parents': ['ancestor']},
1142 {'message': 'change-rebased'},
1139 {'message': 'change-rebased'},
1143 ]
1140 ]
1144 commit_ids = backend.create_master_repo(commits)
1141 commit_ids = backend.create_master_repo(commits)
1145 target = backend.create_repo(heads=['ancestor'])
1142 target = backend.create_repo(heads=['ancestor'])
1146 source = backend.create_repo(heads=['change'])
1143 source = backend.create_repo(heads=['change'])
1147
1144
1148 # create pr from a in source to A in target
1145 # create pr from a in source to A in target
1149 pull_request = PullRequest()
1146 pull_request = PullRequest()
1150
1147
1151 pull_request.source_repo = source
1148 pull_request.source_repo = source
1152 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1149 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1153 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1150 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1154
1151
1155 pull_request.target_repo = target
1152 pull_request.target_repo = target
1156 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1153 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1157 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1154 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1158
1155
1159 pull_request.revisions = [commit_ids['change']]
1156 pull_request.revisions = [commit_ids['change']]
1160 pull_request.title = u"Test"
1157 pull_request.title = u"Test"
1161 pull_request.description = u"Description"
1158 pull_request.description = u"Description"
1162 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1159 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1163 pull_request.pull_request_state = PullRequest.STATE_CREATED
1160 pull_request.pull_request_state = PullRequest.STATE_CREATED
1164
1161
1165 Session().add(pull_request)
1162 Session().add(pull_request)
1166 Session().commit()
1163 Session().commit()
1167 pull_request_id = pull_request.pull_request_id
1164 pull_request_id = pull_request.pull_request_id
1168
1165
1169 # target has ancestor - ancestor-new
1166 # target has ancestor - ancestor-new
1170 # source has ancestor - ancestor-new - change-rebased
1167 # source has ancestor - ancestor-new - change-rebased
1171 backend.pull_heads(target, heads=['ancestor-new'])
1168 backend.pull_heads(target, heads=['ancestor-new'])
1172 backend.pull_heads(source, heads=['change-rebased'])
1169 backend.pull_heads(source, heads=['change-rebased'])
1173 target_repo_name = target.repo_name
1170 target_repo_name = target.repo_name
1174
1171
1175 # update PR
1172 # update PR
1176 url = route_path('pullrequest_update',
1173 url = route_path('pullrequest_update',
1177 repo_name=target_repo_name,
1174 repo_name=target_repo_name,
1178 pull_request_id=pull_request_id)
1175 pull_request_id=pull_request_id)
1179 self.app.post(url,
1176 self.app.post(url,
1180 params={'update_commits': 'true', 'csrf_token': csrf_token},
1177 params={'update_commits': 'true', 'csrf_token': csrf_token},
1181 status=200)
1178 status=200)
1182
1179
1183 # check that we have now both revisions
1180 # check that we have now both revisions
1184 pull_request = PullRequest.get(pull_request_id)
1181 pull_request = PullRequest.get(pull_request_id)
1185 assert pull_request.revisions == [commit_ids['change-rebased']]
1182 assert pull_request.revisions == [commit_ids['change-rebased']]
1186 assert pull_request.target_ref == 'branch:{branch}:{commit_id}'.format(
1183 assert pull_request.target_ref == 'branch:{branch}:{commit_id}'.format(
1187 branch=backend.default_branch_name, commit_id=commit_ids['ancestor-new'])
1184 branch=backend.default_branch_name, commit_id=commit_ids['ancestor-new'])
1188
1185
1189 response = self.app.get(
1186 response = self.app.get(
1190 route_path('pullrequest_show',
1187 route_path('pullrequest_show',
1191 repo_name=target_repo_name,
1188 repo_name=target_repo_name,
1192 pull_request_id=pull_request.pull_request_id))
1189 pull_request_id=pull_request.pull_request_id))
1193 assert response.status_int == 200
1190 assert response.status_int == 200
1194 response.mustcontain('Pull request updated to')
1191 response.mustcontain('Pull request updated to')
1195 response.mustcontain('with 1 added, 1 removed commits.')
1192 response.mustcontain('with 1 added, 1 removed commits.')
1196
1193
1197 def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token):
1194 def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token):
1198 backend = backend_git
1195 backend = backend_git
1199 commits = [
1196 commits = [
1200 {'message': 'master-commit-1'},
1197 {'message': 'master-commit-1'},
1201 {'message': 'master-commit-2-change-1'},
1198 {'message': 'master-commit-2-change-1'},
1202 {'message': 'master-commit-3-change-2'},
1199 {'message': 'master-commit-3-change-2'},
1203
1200
1204 {'message': 'feat-commit-1', 'parents': ['master-commit-1']},
1201 {'message': 'feat-commit-1', 'parents': ['master-commit-1']},
1205 {'message': 'feat-commit-2'},
1202 {'message': 'feat-commit-2'},
1206 ]
1203 ]
1207 commit_ids = backend.create_master_repo(commits)
1204 commit_ids = backend.create_master_repo(commits)
1208 target = backend.create_repo(heads=['master-commit-3-change-2'])
1205 target = backend.create_repo(heads=['master-commit-3-change-2'])
1209 source = backend.create_repo(heads=['feat-commit-2'])
1206 source = backend.create_repo(heads=['feat-commit-2'])
1210
1207
1211 # create pr from a in source to A in target
1208 # create pr from a in source to A in target
1212 pull_request = PullRequest()
1209 pull_request = PullRequest()
1213 pull_request.source_repo = source
1210 pull_request.source_repo = source
1214
1211
1215 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1212 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1216 branch=backend.default_branch_name,
1213 branch=backend.default_branch_name,
1217 commit_id=commit_ids['master-commit-3-change-2'])
1214 commit_id=commit_ids['master-commit-3-change-2'])
1218
1215
1219 pull_request.target_repo = target
1216 pull_request.target_repo = target
1220 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1217 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1221 branch=backend.default_branch_name, commit_id=commit_ids['feat-commit-2'])
1218 branch=backend.default_branch_name, commit_id=commit_ids['feat-commit-2'])
1222
1219
1223 pull_request.revisions = [
1220 pull_request.revisions = [
1224 commit_ids['feat-commit-1'],
1221 commit_ids['feat-commit-1'],
1225 commit_ids['feat-commit-2']
1222 commit_ids['feat-commit-2']
1226 ]
1223 ]
1227 pull_request.title = u"Test"
1224 pull_request.title = u"Test"
1228 pull_request.description = u"Description"
1225 pull_request.description = u"Description"
1229 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1226 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1230 pull_request.pull_request_state = PullRequest.STATE_CREATED
1227 pull_request.pull_request_state = PullRequest.STATE_CREATED
1231 Session().add(pull_request)
1228 Session().add(pull_request)
1232 Session().commit()
1229 Session().commit()
1233 pull_request_id = pull_request.pull_request_id
1230 pull_request_id = pull_request.pull_request_id
1234
1231
1235 # PR is created, now we simulate a force-push into target,
1232 # PR is created, now we simulate a force-push into target,
1236 # that drops a 2 last commits
1233 # that drops a 2 last commits
1237 vcsrepo = target.scm_instance()
1234 vcsrepo = target.scm_instance()
1238 vcsrepo.config.clear_section('hooks')
1235 vcsrepo.config.clear_section('hooks')
1239 vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2'])
1236 vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2'])
1240 target_repo_name = target.repo_name
1237 target_repo_name = target.repo_name
1241
1238
1242 # update PR
1239 # update PR
1243 url = route_path('pullrequest_update',
1240 url = route_path('pullrequest_update',
1244 repo_name=target_repo_name,
1241 repo_name=target_repo_name,
1245 pull_request_id=pull_request_id)
1242 pull_request_id=pull_request_id)
1246 self.app.post(url,
1243 self.app.post(url,
1247 params={'update_commits': 'true', 'csrf_token': csrf_token},
1244 params={'update_commits': 'true', 'csrf_token': csrf_token},
1248 status=200)
1245 status=200)
1249
1246
1250 response = self.app.get(route_path('pullrequest_new', repo_name=target_repo_name))
1247 response = self.app.get(route_path('pullrequest_new', repo_name=target_repo_name))
1251 assert response.status_int == 200
1248 assert response.status_int == 200
1252 response.mustcontain('Pull request updated to')
1249 response.mustcontain('Pull request updated to')
1253 response.mustcontain('with 0 added, 0 removed commits.')
1250 response.mustcontain('with 0 added, 0 removed commits.')
1254
1251
1255 def test_update_of_ancestor_reference(self, backend, csrf_token):
1252 def test_update_of_ancestor_reference(self, backend, csrf_token):
1256 commits = [
1253 commits = [
1257 {'message': 'ancestor'},
1254 {'message': 'ancestor'},
1258 {'message': 'change'},
1255 {'message': 'change'},
1259 {'message': 'change-2'},
1256 {'message': 'change-2'},
1260 {'message': 'ancestor-new', 'parents': ['ancestor']},
1257 {'message': 'ancestor-new', 'parents': ['ancestor']},
1261 {'message': 'change-rebased'},
1258 {'message': 'change-rebased'},
1262 ]
1259 ]
1263 commit_ids = backend.create_master_repo(commits)
1260 commit_ids = backend.create_master_repo(commits)
1264 target = backend.create_repo(heads=['ancestor'])
1261 target = backend.create_repo(heads=['ancestor'])
1265 source = backend.create_repo(heads=['change'])
1262 source = backend.create_repo(heads=['change'])
1266
1263
1267 # create pr from a in source to A in target
1264 # create pr from a in source to A in target
1268 pull_request = PullRequest()
1265 pull_request = PullRequest()
1269 pull_request.source_repo = source
1266 pull_request.source_repo = source
1270
1267
1271 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1268 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1272 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1269 branch=backend.default_branch_name, commit_id=commit_ids['change'])
1273 pull_request.target_repo = target
1270 pull_request.target_repo = target
1274 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1271 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1275 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1272 branch=backend.default_branch_name, commit_id=commit_ids['ancestor'])
1276 pull_request.revisions = [commit_ids['change']]
1273 pull_request.revisions = [commit_ids['change']]
1277 pull_request.title = u"Test"
1274 pull_request.title = u"Test"
1278 pull_request.description = u"Description"
1275 pull_request.description = u"Description"
1279 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1276 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1280 pull_request.pull_request_state = PullRequest.STATE_CREATED
1277 pull_request.pull_request_state = PullRequest.STATE_CREATED
1281 Session().add(pull_request)
1278 Session().add(pull_request)
1282 Session().commit()
1279 Session().commit()
1283 pull_request_id = pull_request.pull_request_id
1280 pull_request_id = pull_request.pull_request_id
1284
1281
1285 # target has ancestor - ancestor-new
1282 # target has ancestor - ancestor-new
1286 # source has ancestor - ancestor-new - change-rebased
1283 # source has ancestor - ancestor-new - change-rebased
1287 backend.pull_heads(target, heads=['ancestor-new'])
1284 backend.pull_heads(target, heads=['ancestor-new'])
1288 backend.pull_heads(source, heads=['change-rebased'])
1285 backend.pull_heads(source, heads=['change-rebased'])
1289 target_repo_name = target.repo_name
1286 target_repo_name = target.repo_name
1290
1287
1291 # update PR
1288 # update PR
1292 self.app.post(
1289 self.app.post(
1293 route_path('pullrequest_update',
1290 route_path('pullrequest_update',
1294 repo_name=target_repo_name, pull_request_id=pull_request_id),
1291 repo_name=target_repo_name, pull_request_id=pull_request_id),
1295 params={'update_commits': 'true', 'csrf_token': csrf_token},
1292 params={'update_commits': 'true', 'csrf_token': csrf_token},
1296 status=200)
1293 status=200)
1297
1294
1298 # Expect the target reference to be updated correctly
1295 # Expect the target reference to be updated correctly
1299 pull_request = PullRequest.get(pull_request_id)
1296 pull_request = PullRequest.get(pull_request_id)
1300 assert pull_request.revisions == [commit_ids['change-rebased']]
1297 assert pull_request.revisions == [commit_ids['change-rebased']]
1301 expected_target_ref = 'branch:{branch}:{commit_id}'.format(
1298 expected_target_ref = 'branch:{branch}:{commit_id}'.format(
1302 branch=backend.default_branch_name,
1299 branch=backend.default_branch_name,
1303 commit_id=commit_ids['ancestor-new'])
1300 commit_id=commit_ids['ancestor-new'])
1304 assert pull_request.target_ref == expected_target_ref
1301 assert pull_request.target_ref == expected_target_ref
1305
1302
1306 def test_remove_pull_request_branch(self, backend_git, csrf_token):
1303 def test_remove_pull_request_branch(self, backend_git, csrf_token):
1307 branch_name = 'development'
1304 branch_name = 'development'
1308 commits = [
1305 commits = [
1309 {'message': 'initial-commit'},
1306 {'message': 'initial-commit'},
1310 {'message': 'old-feature'},
1307 {'message': 'old-feature'},
1311 {'message': 'new-feature', 'branch': branch_name},
1308 {'message': 'new-feature', 'branch': branch_name},
1312 ]
1309 ]
1313 repo = backend_git.create_repo(commits)
1310 repo = backend_git.create_repo(commits)
1314 repo_name = repo.repo_name
1311 repo_name = repo.repo_name
1315 commit_ids = backend_git.commit_ids
1312 commit_ids = backend_git.commit_ids
1316
1313
1317 pull_request = PullRequest()
1314 pull_request = PullRequest()
1318 pull_request.source_repo = repo
1315 pull_request.source_repo = repo
1319 pull_request.target_repo = repo
1316 pull_request.target_repo = repo
1320 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1317 pull_request.source_ref = 'branch:{branch}:{commit_id}'.format(
1321 branch=branch_name, commit_id=commit_ids['new-feature'])
1318 branch=branch_name, commit_id=commit_ids['new-feature'])
1322 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1319 pull_request.target_ref = 'branch:{branch}:{commit_id}'.format(
1323 branch=backend_git.default_branch_name, commit_id=commit_ids['old-feature'])
1320 branch=backend_git.default_branch_name, commit_id=commit_ids['old-feature'])
1324 pull_request.revisions = [commit_ids['new-feature']]
1321 pull_request.revisions = [commit_ids['new-feature']]
1325 pull_request.title = u"Test"
1322 pull_request.title = u"Test"
1326 pull_request.description = u"Description"
1323 pull_request.description = u"Description"
1327 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1324 pull_request.author = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
1328 pull_request.pull_request_state = PullRequest.STATE_CREATED
1325 pull_request.pull_request_state = PullRequest.STATE_CREATED
1329 Session().add(pull_request)
1326 Session().add(pull_request)
1330 Session().commit()
1327 Session().commit()
1331
1328
1332 pull_request_id = pull_request.pull_request_id
1329 pull_request_id = pull_request.pull_request_id
1333
1330
1334 vcs = repo.scm_instance()
1331 vcs = repo.scm_instance()
1335 vcs.remove_ref('refs/heads/{}'.format(branch_name))
1332 vcs.remove_ref('refs/heads/{}'.format(branch_name))
1336 # NOTE(marcink): run GC to ensure the commits are gone
1333 # NOTE(marcink): run GC to ensure the commits are gone
1337 vcs.run_gc()
1334 vcs.run_gc()
1338
1335
1339 response = self.app.get(route_path(
1336 response = self.app.get(route_path(
1340 'pullrequest_show',
1337 'pullrequest_show',
1341 repo_name=repo_name,
1338 repo_name=repo_name,
1342 pull_request_id=pull_request_id))
1339 pull_request_id=pull_request_id))
1343
1340
1344 assert response.status_int == 200
1341 assert response.status_int == 200
1345
1342
1346 response.assert_response().element_contains(
1343 response.assert_response().element_contains(
1347 '#changeset_compare_view_content .alert strong',
1344 '#changeset_compare_view_content .alert strong',
1348 'Missing commits')
1345 'Missing commits')
1349 response.assert_response().element_contains(
1346 response.assert_response().element_contains(
1350 '#changeset_compare_view_content .alert',
1347 '#changeset_compare_view_content .alert',
1351 'This pull request cannot be displayed, because one or more'
1348 'This pull request cannot be displayed, because one or more'
1352 ' commits no longer exist in the source repository.')
1349 ' commits no longer exist in the source repository.')
1353
1350
1354 def test_strip_commits_from_pull_request(
1351 def test_strip_commits_from_pull_request(
1355 self, backend, pr_util, csrf_token):
1352 self, backend, pr_util, csrf_token):
1356 commits = [
1353 commits = [
1357 {'message': 'initial-commit'},
1354 {'message': 'initial-commit'},
1358 {'message': 'old-feature'},
1355 {'message': 'old-feature'},
1359 {'message': 'new-feature', 'parents': ['initial-commit']},
1356 {'message': 'new-feature', 'parents': ['initial-commit']},
1360 ]
1357 ]
1361 pull_request = pr_util.create_pull_request(
1358 pull_request = pr_util.create_pull_request(
1362 commits, target_head='initial-commit', source_head='new-feature',
1359 commits, target_head='initial-commit', source_head='new-feature',
1363 revisions=['new-feature'])
1360 revisions=['new-feature'])
1364
1361
1365 vcs = pr_util.source_repository.scm_instance()
1362 vcs = pr_util.source_repository.scm_instance()
1366 if backend.alias == 'git':
1363 if backend.alias == 'git':
1367 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1364 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1368 else:
1365 else:
1369 vcs.strip(pr_util.commit_ids['new-feature'])
1366 vcs.strip(pr_util.commit_ids['new-feature'])
1370
1367
1371 response = self.app.get(route_path(
1368 response = self.app.get(route_path(
1372 'pullrequest_show',
1369 'pullrequest_show',
1373 repo_name=pr_util.target_repository.repo_name,
1370 repo_name=pr_util.target_repository.repo_name,
1374 pull_request_id=pull_request.pull_request_id))
1371 pull_request_id=pull_request.pull_request_id))
1375
1372
1376 assert response.status_int == 200
1373 assert response.status_int == 200
1377
1374
1378 response.assert_response().element_contains(
1375 response.assert_response().element_contains(
1379 '#changeset_compare_view_content .alert strong',
1376 '#changeset_compare_view_content .alert strong',
1380 'Missing commits')
1377 'Missing commits')
1381 response.assert_response().element_contains(
1378 response.assert_response().element_contains(
1382 '#changeset_compare_view_content .alert',
1379 '#changeset_compare_view_content .alert',
1383 'This pull request cannot be displayed, because one or more'
1380 'This pull request cannot be displayed, because one or more'
1384 ' commits no longer exist in the source repository.')
1381 ' commits no longer exist in the source repository.')
1385 response.assert_response().element_contains(
1382 response.assert_response().element_contains(
1386 '#update_commits',
1383 '#update_commits',
1387 'Update commits')
1384 'Update commits')
1388
1385
1389 def test_strip_commits_and_update(
1386 def test_strip_commits_and_update(
1390 self, backend, pr_util, csrf_token):
1387 self, backend, pr_util, csrf_token):
1391 commits = [
1388 commits = [
1392 {'message': 'initial-commit'},
1389 {'message': 'initial-commit'},
1393 {'message': 'old-feature'},
1390 {'message': 'old-feature'},
1394 {'message': 'new-feature', 'parents': ['old-feature']},
1391 {'message': 'new-feature', 'parents': ['old-feature']},
1395 ]
1392 ]
1396 pull_request = pr_util.create_pull_request(
1393 pull_request = pr_util.create_pull_request(
1397 commits, target_head='old-feature', source_head='new-feature',
1394 commits, target_head='old-feature', source_head='new-feature',
1398 revisions=['new-feature'], mergeable=True)
1395 revisions=['new-feature'], mergeable=True)
1399 pr_id = pull_request.pull_request_id
1396 pr_id = pull_request.pull_request_id
1400 target_repo_name = pull_request.target_repo.repo_name
1397 target_repo_name = pull_request.target_repo.repo_name
1401
1398
1402 vcs = pr_util.source_repository.scm_instance()
1399 vcs = pr_util.source_repository.scm_instance()
1403 if backend.alias == 'git':
1400 if backend.alias == 'git':
1404 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1401 vcs.strip(pr_util.commit_ids['new-feature'], branch_name='master')
1405 else:
1402 else:
1406 vcs.strip(pr_util.commit_ids['new-feature'])
1403 vcs.strip(pr_util.commit_ids['new-feature'])
1407
1404
1408 url = route_path('pullrequest_update',
1405 url = route_path('pullrequest_update',
1409 repo_name=target_repo_name,
1406 repo_name=target_repo_name,
1410 pull_request_id=pr_id)
1407 pull_request_id=pr_id)
1411 response = self.app.post(url,
1408 response = self.app.post(url,
1412 params={'update_commits': 'true',
1409 params={'update_commits': 'true',
1413 'csrf_token': csrf_token})
1410 'csrf_token': csrf_token})
1414
1411
1415 assert response.status_int == 200
1412 assert response.status_int == 200
1416 assert response.body == '{"response": true, "redirect_url": null}'
1413 assert response.body == '{"response": true, "redirect_url": null}'
1417
1414
1418 # Make sure that after update, it won't raise 500 errors
1415 # Make sure that after update, it won't raise 500 errors
1419 response = self.app.get(route_path(
1416 response = self.app.get(route_path(
1420 'pullrequest_show',
1417 'pullrequest_show',
1421 repo_name=target_repo_name,
1418 repo_name=target_repo_name,
1422 pull_request_id=pr_id))
1419 pull_request_id=pr_id))
1423
1420
1424 assert response.status_int == 200
1421 assert response.status_int == 200
1425 response.assert_response().element_contains(
1422 response.assert_response().element_contains(
1426 '#changeset_compare_view_content .alert strong',
1423 '#changeset_compare_view_content .alert strong',
1427 'Missing commits')
1424 'Missing commits')
1428
1425
1429 def test_branch_is_a_link(self, pr_util):
1426 def test_branch_is_a_link(self, pr_util):
1430 pull_request = pr_util.create_pull_request()
1427 pull_request = pr_util.create_pull_request()
1431 pull_request.source_ref = 'branch:origin:1234567890abcdef'
1428 pull_request.source_ref = 'branch:origin:1234567890abcdef'
1432 pull_request.target_ref = 'branch:target:abcdef1234567890'
1429 pull_request.target_ref = 'branch:target:abcdef1234567890'
1433 Session().add(pull_request)
1430 Session().add(pull_request)
1434 Session().commit()
1431 Session().commit()
1435
1432
1436 response = self.app.get(route_path(
1433 response = self.app.get(route_path(
1437 'pullrequest_show',
1434 'pullrequest_show',
1438 repo_name=pull_request.target_repo.scm_instance().name,
1435 repo_name=pull_request.target_repo.scm_instance().name,
1439 pull_request_id=pull_request.pull_request_id))
1436 pull_request_id=pull_request.pull_request_id))
1440 assert response.status_int == 200
1437 assert response.status_int == 200
1441
1438
1442 source = response.assert_response().get_element('.pr-source-info')
1439 source = response.assert_response().get_element('.pr-source-info')
1443 source_parent = source.getparent()
1440 source_parent = source.getparent()
1444 assert len(source_parent) == 1
1441 assert len(source_parent) == 1
1445
1442
1446 target = response.assert_response().get_element('.pr-target-info')
1443 target = response.assert_response().get_element('.pr-target-info')
1447 target_parent = target.getparent()
1444 target_parent = target.getparent()
1448 assert len(target_parent) == 1
1445 assert len(target_parent) == 1
1449
1446
1450 expected_origin_link = route_path(
1447 expected_origin_link = route_path(
1451 'repo_commits',
1448 'repo_commits',
1452 repo_name=pull_request.source_repo.scm_instance().name,
1449 repo_name=pull_request.source_repo.scm_instance().name,
1453 params=dict(branch='origin'))
1450 params=dict(branch='origin'))
1454 expected_target_link = route_path(
1451 expected_target_link = route_path(
1455 'repo_commits',
1452 'repo_commits',
1456 repo_name=pull_request.target_repo.scm_instance().name,
1453 repo_name=pull_request.target_repo.scm_instance().name,
1457 params=dict(branch='target'))
1454 params=dict(branch='target'))
1458 assert source_parent.attrib['href'] == expected_origin_link
1455 assert source_parent.attrib['href'] == expected_origin_link
1459 assert target_parent.attrib['href'] == expected_target_link
1456 assert target_parent.attrib['href'] == expected_target_link
1460
1457
1461 def test_bookmark_is_not_a_link(self, pr_util):
1458 def test_bookmark_is_not_a_link(self, pr_util):
1462 pull_request = pr_util.create_pull_request()
1459 pull_request = pr_util.create_pull_request()
1463 pull_request.source_ref = 'bookmark:origin:1234567890abcdef'
1460 pull_request.source_ref = 'bookmark:origin:1234567890abcdef'
1464 pull_request.target_ref = 'bookmark:target:abcdef1234567890'
1461 pull_request.target_ref = 'bookmark:target:abcdef1234567890'
1465 Session().add(pull_request)
1462 Session().add(pull_request)
1466 Session().commit()
1463 Session().commit()
1467
1464
1468 response = self.app.get(route_path(
1465 response = self.app.get(route_path(
1469 'pullrequest_show',
1466 'pullrequest_show',
1470 repo_name=pull_request.target_repo.scm_instance().name,
1467 repo_name=pull_request.target_repo.scm_instance().name,
1471 pull_request_id=pull_request.pull_request_id))
1468 pull_request_id=pull_request.pull_request_id))
1472 assert response.status_int == 200
1469 assert response.status_int == 200
1473
1470
1474 source = response.assert_response().get_element('.pr-source-info')
1471 source = response.assert_response().get_element('.pr-source-info')
1475 assert source.text.strip() == 'bookmark:origin'
1472 assert source.text.strip() == 'bookmark:origin'
1476 assert source.getparent().attrib.get('href') is None
1473 assert source.getparent().attrib.get('href') is None
1477
1474
1478 target = response.assert_response().get_element('.pr-target-info')
1475 target = response.assert_response().get_element('.pr-target-info')
1479 assert target.text.strip() == 'bookmark:target'
1476 assert target.text.strip() == 'bookmark:target'
1480 assert target.getparent().attrib.get('href') is None
1477 assert target.getparent().attrib.get('href') is None
1481
1478
1482 def test_tag_is_not_a_link(self, pr_util):
1479 def test_tag_is_not_a_link(self, pr_util):
1483 pull_request = pr_util.create_pull_request()
1480 pull_request = pr_util.create_pull_request()
1484 pull_request.source_ref = 'tag:origin:1234567890abcdef'
1481 pull_request.source_ref = 'tag:origin:1234567890abcdef'
1485 pull_request.target_ref = 'tag:target:abcdef1234567890'
1482 pull_request.target_ref = 'tag:target:abcdef1234567890'
1486 Session().add(pull_request)
1483 Session().add(pull_request)
1487 Session().commit()
1484 Session().commit()
1488
1485
1489 response = self.app.get(route_path(
1486 response = self.app.get(route_path(
1490 'pullrequest_show',
1487 'pullrequest_show',
1491 repo_name=pull_request.target_repo.scm_instance().name,
1488 repo_name=pull_request.target_repo.scm_instance().name,
1492 pull_request_id=pull_request.pull_request_id))
1489 pull_request_id=pull_request.pull_request_id))
1493 assert response.status_int == 200
1490 assert response.status_int == 200
1494
1491
1495 source = response.assert_response().get_element('.pr-source-info')
1492 source = response.assert_response().get_element('.pr-source-info')
1496 assert source.text.strip() == 'tag:origin'
1493 assert source.text.strip() == 'tag:origin'
1497 assert source.getparent().attrib.get('href') is None
1494 assert source.getparent().attrib.get('href') is None
1498
1495
1499 target = response.assert_response().get_element('.pr-target-info')
1496 target = response.assert_response().get_element('.pr-target-info')
1500 assert target.text.strip() == 'tag:target'
1497 assert target.text.strip() == 'tag:target'
1501 assert target.getparent().attrib.get('href') is None
1498 assert target.getparent().attrib.get('href') is None
1502
1499
1503 @pytest.mark.parametrize('mergeable', [True, False])
1500 @pytest.mark.parametrize('mergeable', [True, False])
1504 def test_shadow_repository_link(
1501 def test_shadow_repository_link(
1505 self, mergeable, pr_util, http_host_only_stub):
1502 self, mergeable, pr_util, http_host_only_stub):
1506 """
1503 """
1507 Check that the pull request summary page displays a link to the shadow
1504 Check that the pull request summary page displays a link to the shadow
1508 repository if the pull request is mergeable. If it is not mergeable
1505 repository if the pull request is mergeable. If it is not mergeable
1509 the link should not be displayed.
1506 the link should not be displayed.
1510 """
1507 """
1511 pull_request = pr_util.create_pull_request(
1508 pull_request = pr_util.create_pull_request(
1512 mergeable=mergeable, enable_notifications=False)
1509 mergeable=mergeable, enable_notifications=False)
1513 target_repo = pull_request.target_repo.scm_instance()
1510 target_repo = pull_request.target_repo.scm_instance()
1514 pr_id = pull_request.pull_request_id
1511 pr_id = pull_request.pull_request_id
1515 shadow_url = '{host}/{repo}/pull-request/{pr_id}/repository'.format(
1512 shadow_url = '{host}/{repo}/pull-request/{pr_id}/repository'.format(
1516 host=http_host_only_stub, repo=target_repo.name, pr_id=pr_id)
1513 host=http_host_only_stub, repo=target_repo.name, pr_id=pr_id)
1517
1514
1518 response = self.app.get(route_path(
1515 response = self.app.get(route_path(
1519 'pullrequest_show',
1516 'pullrequest_show',
1520 repo_name=target_repo.name,
1517 repo_name=target_repo.name,
1521 pull_request_id=pr_id))
1518 pull_request_id=pr_id))
1522
1519
1523 if mergeable:
1520 if mergeable:
1524 response.assert_response().element_value_contains(
1521 response.assert_response().element_value_contains(
1525 'input.pr-mergeinfo', shadow_url)
1522 'input.pr-mergeinfo', shadow_url)
1526 response.assert_response().element_value_contains(
1523 response.assert_response().element_value_contains(
1527 'input.pr-mergeinfo ', 'pr-merge')
1524 'input.pr-mergeinfo ', 'pr-merge')
1528 else:
1525 else:
1529 response.assert_response().no_element_exists('.pr-mergeinfo')
1526 response.assert_response().no_element_exists('.pr-mergeinfo')
1530
1527
1531
1528
1532 @pytest.mark.usefixtures('app')
1529 @pytest.mark.usefixtures('app')
1533 @pytest.mark.backends("git", "hg")
1530 @pytest.mark.backends("git", "hg")
1534 class TestPullrequestsControllerDelete(object):
1531 class TestPullrequestsControllerDelete(object):
1535 def test_pull_request_delete_button_permissions_admin(
1532 def test_pull_request_delete_button_permissions_admin(
1536 self, autologin_user, user_admin, pr_util):
1533 self, autologin_user, user_admin, pr_util):
1537 pull_request = pr_util.create_pull_request(
1534 pull_request = pr_util.create_pull_request(
1538 author=user_admin.username, enable_notifications=False)
1535 author=user_admin.username, enable_notifications=False)
1539
1536
1540 response = self.app.get(route_path(
1537 response = self.app.get(route_path(
1541 'pullrequest_show',
1538 'pullrequest_show',
1542 repo_name=pull_request.target_repo.scm_instance().name,
1539 repo_name=pull_request.target_repo.scm_instance().name,
1543 pull_request_id=pull_request.pull_request_id))
1540 pull_request_id=pull_request.pull_request_id))
1544
1541
1545 response.mustcontain('id="delete_pullrequest"')
1542 response.mustcontain('id="delete_pullrequest"')
1546 response.mustcontain('Confirm to delete this pull request')
1543 response.mustcontain('Confirm to delete this pull request')
1547
1544
1548 def test_pull_request_delete_button_permissions_owner(
1545 def test_pull_request_delete_button_permissions_owner(
1549 self, autologin_regular_user, user_regular, pr_util):
1546 self, autologin_regular_user, user_regular, pr_util):
1550 pull_request = pr_util.create_pull_request(
1547 pull_request = pr_util.create_pull_request(
1551 author=user_regular.username, enable_notifications=False)
1548 author=user_regular.username, enable_notifications=False)
1552
1549
1553 response = self.app.get(route_path(
1550 response = self.app.get(route_path(
1554 'pullrequest_show',
1551 'pullrequest_show',
1555 repo_name=pull_request.target_repo.scm_instance().name,
1552 repo_name=pull_request.target_repo.scm_instance().name,
1556 pull_request_id=pull_request.pull_request_id))
1553 pull_request_id=pull_request.pull_request_id))
1557
1554
1558 response.mustcontain('id="delete_pullrequest"')
1555 response.mustcontain('id="delete_pullrequest"')
1559 response.mustcontain('Confirm to delete this pull request')
1556 response.mustcontain('Confirm to delete this pull request')
1560
1557
1561 def test_pull_request_delete_button_permissions_forbidden(
1558 def test_pull_request_delete_button_permissions_forbidden(
1562 self, autologin_regular_user, user_regular, user_admin, pr_util):
1559 self, autologin_regular_user, user_regular, user_admin, pr_util):
1563 pull_request = pr_util.create_pull_request(
1560 pull_request = pr_util.create_pull_request(
1564 author=user_admin.username, enable_notifications=False)
1561 author=user_admin.username, enable_notifications=False)
1565
1562
1566 response = self.app.get(route_path(
1563 response = self.app.get(route_path(
1567 'pullrequest_show',
1564 'pullrequest_show',
1568 repo_name=pull_request.target_repo.scm_instance().name,
1565 repo_name=pull_request.target_repo.scm_instance().name,
1569 pull_request_id=pull_request.pull_request_id))
1566 pull_request_id=pull_request.pull_request_id))
1570 response.mustcontain(no=['id="delete_pullrequest"'])
1567 response.mustcontain(no=['id="delete_pullrequest"'])
1571 response.mustcontain(no=['Confirm to delete this pull request'])
1568 response.mustcontain(no=['Confirm to delete this pull request'])
1572
1569
1573 def test_pull_request_delete_button_permissions_can_update_cannot_delete(
1570 def test_pull_request_delete_button_permissions_can_update_cannot_delete(
1574 self, autologin_regular_user, user_regular, user_admin, pr_util,
1571 self, autologin_regular_user, user_regular, user_admin, pr_util,
1575 user_util):
1572 user_util):
1576
1573
1577 pull_request = pr_util.create_pull_request(
1574 pull_request = pr_util.create_pull_request(
1578 author=user_admin.username, enable_notifications=False)
1575 author=user_admin.username, enable_notifications=False)
1579
1576
1580 user_util.grant_user_permission_to_repo(
1577 user_util.grant_user_permission_to_repo(
1581 pull_request.target_repo, user_regular,
1578 pull_request.target_repo, user_regular,
1582 'repository.write')
1579 'repository.write')
1583
1580
1584 response = self.app.get(route_path(
1581 response = self.app.get(route_path(
1585 'pullrequest_show',
1582 'pullrequest_show',
1586 repo_name=pull_request.target_repo.scm_instance().name,
1583 repo_name=pull_request.target_repo.scm_instance().name,
1587 pull_request_id=pull_request.pull_request_id))
1584 pull_request_id=pull_request.pull_request_id))
1588
1585
1589 response.mustcontain('id="open_edit_pullrequest"')
1586 response.mustcontain('id="open_edit_pullrequest"')
1590 response.mustcontain('id="delete_pullrequest"')
1587 response.mustcontain('id="delete_pullrequest"')
1591 response.mustcontain(no=['Confirm to delete this pull request'])
1588 response.mustcontain(no=['Confirm to delete this pull request'])
1592
1589
1593 def test_delete_comment_returns_404_if_comment_does_not_exist(
1590 def test_delete_comment_returns_404_if_comment_does_not_exist(
1594 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1591 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1595
1592
1596 pull_request = pr_util.create_pull_request(
1593 pull_request = pr_util.create_pull_request(
1597 author=user_admin.username, enable_notifications=False)
1594 author=user_admin.username, enable_notifications=False)
1598
1595
1599 self.app.post(
1596 self.app.post(
1600 route_path(
1597 route_path(
1601 'pullrequest_comment_delete',
1598 'pullrequest_comment_delete',
1602 repo_name=pull_request.target_repo.scm_instance().name,
1599 repo_name=pull_request.target_repo.scm_instance().name,
1603 pull_request_id=pull_request.pull_request_id,
1600 pull_request_id=pull_request.pull_request_id,
1604 comment_id=1024404),
1601 comment_id=1024404),
1605 extra_environ=xhr_header,
1602 extra_environ=xhr_header,
1606 params={'csrf_token': csrf_token},
1603 params={'csrf_token': csrf_token},
1607 status=404
1604 status=404
1608 )
1605 )
1609
1606
1610 def test_delete_comment(
1607 def test_delete_comment(
1611 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1608 self, autologin_user, pr_util, user_admin, csrf_token, xhr_header):
1612
1609
1613 pull_request = pr_util.create_pull_request(
1610 pull_request = pr_util.create_pull_request(
1614 author=user_admin.username, enable_notifications=False)
1611 author=user_admin.username, enable_notifications=False)
1615 comment = pr_util.create_comment()
1612 comment = pr_util.create_comment()
1616 comment_id = comment.comment_id
1613 comment_id = comment.comment_id
1617
1614
1618 response = self.app.post(
1615 response = self.app.post(
1619 route_path(
1616 route_path(
1620 'pullrequest_comment_delete',
1617 'pullrequest_comment_delete',
1621 repo_name=pull_request.target_repo.scm_instance().name,
1618 repo_name=pull_request.target_repo.scm_instance().name,
1622 pull_request_id=pull_request.pull_request_id,
1619 pull_request_id=pull_request.pull_request_id,
1623 comment_id=comment_id),
1620 comment_id=comment_id),
1624 extra_environ=xhr_header,
1621 extra_environ=xhr_header,
1625 params={'csrf_token': csrf_token},
1622 params={'csrf_token': csrf_token},
1626 status=200
1623 status=200
1627 )
1624 )
1628 assert response.body == 'true'
1625 assert response.body == 'true'
1629
1626
1630 @pytest.mark.parametrize('url_type', [
1627 @pytest.mark.parametrize('url_type', [
1631 'pullrequest_new',
1628 'pullrequest_new',
1632 'pullrequest_create',
1629 'pullrequest_create',
1633 'pullrequest_update',
1630 'pullrequest_update',
1634 'pullrequest_merge',
1631 'pullrequest_merge',
1635 ])
1632 ])
1636 def test_pull_request_is_forbidden_on_archived_repo(
1633 def test_pull_request_is_forbidden_on_archived_repo(
1637 self, autologin_user, backend, xhr_header, user_util, url_type):
1634 self, autologin_user, backend, xhr_header, user_util, url_type):
1638
1635
1639 # create a temporary repo
1636 # create a temporary repo
1640 source = user_util.create_repo(repo_type=backend.alias)
1637 source = user_util.create_repo(repo_type=backend.alias)
1641 repo_name = source.repo_name
1638 repo_name = source.repo_name
1642 repo = Repository.get_by_repo_name(repo_name)
1639 repo = Repository.get_by_repo_name(repo_name)
1643 repo.archived = True
1640 repo.archived = True
1644 Session().commit()
1641 Session().commit()
1645
1642
1646 response = self.app.get(
1643 response = self.app.get(
1647 route_path(url_type, repo_name=repo_name, pull_request_id=1), status=302)
1644 route_path(url_type, repo_name=repo_name, pull_request_id=1), status=302)
1648
1645
1649 msg = 'Action not supported for archived repository.'
1646 msg = 'Action not supported for archived repository.'
1650 assert_session_flash(response, msg)
1647 assert_session_flash(response, msg)
1651
1648
1652
1649
1653 def assert_pull_request_status(pull_request, expected_status):
1650 def assert_pull_request_status(pull_request, expected_status):
1654 status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request)
1651 status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request)
1655 assert status == expected_status
1652 assert status == expected_status
1656
1653
1657
1654
1658 @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create'])
1655 @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create'])
1659 @pytest.mark.usefixtures("autologin_user")
1656 @pytest.mark.usefixtures("autologin_user")
1660 def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route):
1657 def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route):
1661 app.get(route_path(route, repo_name=backend_svn.repo_name), status=404)
1658 app.get(route_path(route, repo_name=backend_svn.repo_name), status=404)
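
For orientation only: the shadow-repository assertions in test_shadow_repository_link above depend on the merge-preview clone URL following one fixed layout. The helper below is a hypothetical sketch that just captures that same format string; it is not part of this changeset.

def shadow_repo_url(host, repo_name, pull_request_id):
    # Mirrors the URL layout asserted in test_shadow_repository_link:
    #   <host>/<repo>/pull-request/<pr_id>/repository
    return '{host}/{repo}/pull-request/{pr_id}/repository'.format(
        host=host, repo=repo_name, pr_id=pull_request_id)

# Illustrative usage (made-up values):
#   shadow_repo_url('code.example.com', 'projects/backend', 42)
#   -> 'code.example.com/projects/backend/pull-request/42/repository'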
@@ -1,111 +1,113 @@
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2016-2020 RhodeCode GmbH
3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 from rhodecode.lib import helpers as h, rc_cache
21 from rhodecode.lib import helpers as h, rc_cache
22 from rhodecode.lib.utils2 import safe_int
22 from rhodecode.lib.utils2 import safe_int
23 from rhodecode.model.pull_request import get_diff_info
23 from rhodecode.model.pull_request import get_diff_info
24 from rhodecode.model.db import PullRequestReviewers
24 from rhodecode.model.db import PullRequestReviewers
25 # V3 - Reviewers, with default rules data
25 # V3 - Reviewers, with default rules data
26 # v4 - Added observers metadata
26 # v4 - Added observers metadata
27 REVIEWER_API_VERSION = 'V4'
# v5 - pr_author/commit_author include/exclude logic
REVIEWER_API_VERSION = 'V5'


def reviewer_as_json(user, reasons=None, role=None, mandatory=False, rules=None, user_group=None):
    """
    Returns json struct of a reviewer for frontend

    :param user: the reviewer
    :param reasons: list of strings of why they are reviewers
    :param mandatory: bool, to set user as mandatory
    """
    role = role or PullRequestReviewers.ROLE_REVIEWER
    if role not in PullRequestReviewers.ROLES:
        raise ValueError('role is not one of %s', PullRequestReviewers.ROLES)

    return {
        'user_id': user.user_id,
        'reasons': reasons or [],
        'rules': rules or [],
        'role': role,
        'mandatory': mandatory,
        'user_group': user_group,
        'username': user.username,
        'first_name': user.first_name,
        'last_name': user.last_name,
        'user_link': h.link_to_user(user),
        'gravatar_link': h.gravatar_url(user.email, 14),
    }


def to_reviewers(e):
    if isinstance(e, (tuple, list)):
        return map(reviewer_as_json, e)
    else:
        return reviewer_as_json(e)


def get_default_reviewers_data(current_user, source_repo, source_ref, target_repo, target_ref,
                               include_diff_info=True):
    """
    Return json for default reviewers of a repository
    """

    diff_info = {}
    if include_diff_info:
        diff_info = get_diff_info(
            source_repo, source_ref.commit_id, target_repo, target_ref.commit_id)

    reasons = ['Default reviewer', 'Repository owner']
    json_reviewers = [reviewer_as_json(
        user=target_repo.user, reasons=reasons, mandatory=False, rules=None, role=None)]

    compute_key = rc_cache.utils.compute_key_from_params(
        current_user.user_id, source_repo.repo_id, source_ref.type, source_ref.name,
        source_ref.commit_id, target_repo.repo_id, target_ref.type, target_ref.name,
        target_ref.commit_id)

    return {
        'api_ver': REVIEWER_API_VERSION, # define version for later possible schema upgrade
        'compute_key': compute_key,
        'diff_info': diff_info,
        'reviewers': json_reviewers,
        'rules': {},
        'rules_data': {},
        'rules_humanized': [],
    }


def validate_default_reviewers(review_members, reviewer_rules):
    """
    Function to validate submitted reviewers against the saved rules
    """
    reviewers = []
    reviewer_by_id = {}
    for r in review_members:
        reviewer_user_id = safe_int(r['user_id'])
        entry = (reviewer_user_id, r['reasons'], r['mandatory'], r['role'], r['rules'])

        reviewer_by_id[reviewer_user_id] = entry
        reviewers.append(entry)

    return reviewers


def validate_observers(observer_members, reviewer_rules):
    return {}
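
For orientation, a minimal sketch of the reviewer payload that reviewer_as_json() above emits for a single user; every value below is hypothetical, and only the keys (and the 14px gravatar size) are taken from the function itself.

# Illustrative only: hypothetical values, keys mirror reviewer_as_json() above.
example_reviewer = {
    'user_id': 42,
    'reasons': ['Default reviewer', 'Repository owner'],
    'rules': [],
    'role': 'reviewer',  # assumed value of PullRequestReviewers.ROLE_REVIEWER
    'mandatory': False,
    'user_group': None,
    'username': 'example-user',
    'first_name': 'Example',
    'last_name': 'User',
    'user_link': '<a href="/_profiles/example-user">example-user</a>',  # shape of h.link_to_user() output, assumed
    'gravatar_link': 'https://gravatar.example/avatar?s=14',  # placeholder for h.gravatar_url(user.email, 14)
}
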
@@ -1,791 +1,852 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2010-2020 RhodeCode GmbH
3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import logging
21 import logging
22 import collections
22 import collections
23
23
24 from pyramid.httpexceptions import (
24 from pyramid.httpexceptions import (
25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
25 HTTPNotFound, HTTPBadRequest, HTTPFound, HTTPForbidden, HTTPConflict)
26 from pyramid.view import view_config
26 from pyramid.view import view_config
27 from pyramid.renderers import render
27 from pyramid.renderers import render
28 from pyramid.response import Response
28 from pyramid.response import Response
29
29
30 from rhodecode.apps._base import RepoAppView
30 from rhodecode.apps._base import RepoAppView
31 from rhodecode.apps.file_store import utils as store_utils
31 from rhodecode.apps.file_store import utils as store_utils
32 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
32 from rhodecode.apps.file_store.exceptions import FileNotAllowedException, FileOverSizeException
33
33
34 from rhodecode.lib import diffs, codeblocks, channelstream
34 from rhodecode.lib import diffs, codeblocks, channelstream
35 from rhodecode.lib.auth import (
35 from rhodecode.lib.auth import (
36 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
36 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, CSRFRequired)
37 from rhodecode.lib.ext_json import json
37 from rhodecode.lib.ext_json import json
38 from rhodecode.lib.compat import OrderedDict
38 from rhodecode.lib.compat import OrderedDict
39 from rhodecode.lib.diffs import (
39 from rhodecode.lib.diffs import (
40 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
40 cache_diff, load_cached_diff, diff_cache_exist, get_diff_context,
41 get_diff_whitespace_flag)
41 get_diff_whitespace_flag)
42 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
42 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError, CommentVersionMismatch
43 import rhodecode.lib.helpers as h
43 import rhodecode.lib.helpers as h
44 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict
44 from rhodecode.lib.utils2 import safe_unicode, str2bool, StrictAttributeDict
45 from rhodecode.lib.vcs.backends.base import EmptyCommit
45 from rhodecode.lib.vcs.backends.base import EmptyCommit
46 from rhodecode.lib.vcs.exceptions import (
46 from rhodecode.lib.vcs.exceptions import (
47 RepositoryError, CommitDoesNotExistError)
47 RepositoryError, CommitDoesNotExistError)
48 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
48 from rhodecode.model.db import ChangesetComment, ChangesetStatus, FileStore, \
49 ChangesetCommentHistory
49 ChangesetCommentHistory
50 from rhodecode.model.changeset_status import ChangesetStatusModel
50 from rhodecode.model.changeset_status import ChangesetStatusModel
51 from rhodecode.model.comment import CommentsModel
51 from rhodecode.model.comment import CommentsModel
52 from rhodecode.model.meta import Session
52 from rhodecode.model.meta import Session
53 from rhodecode.model.settings import VcsSettingsModel
53 from rhodecode.model.settings import VcsSettingsModel
54
54
55 log = logging.getLogger(__name__)
55 log = logging.getLogger(__name__)
56
56
57
57
58 def _update_with_GET(params, request):
58 def _update_with_GET(params, request):
59 for k in ['diff1', 'diff2', 'diff']:
59 for k in ['diff1', 'diff2', 'diff']:
60 params[k] += request.GET.getall(k)
60 params[k] += request.GET.getall(k)
61
61
62
62
63 class RepoCommitsView(RepoAppView):
63 class RepoCommitsView(RepoAppView):
64 def load_default_context(self):
64 def load_default_context(self):
65 c = self._get_local_tmpl_context(include_app_defaults=True)
65 c = self._get_local_tmpl_context(include_app_defaults=True)
66 c.rhodecode_repo = self.rhodecode_vcs_repo
66 c.rhodecode_repo = self.rhodecode_vcs_repo
67
67
68 return c
68 return c
69
69
70 def _is_diff_cache_enabled(self, target_repo):
70 def _is_diff_cache_enabled(self, target_repo):
71 caching_enabled = self._get_general_setting(
71 caching_enabled = self._get_general_setting(
72 target_repo, 'rhodecode_diff_cache')
72 target_repo, 'rhodecode_diff_cache')
73 log.debug('Diff caching enabled: %s', caching_enabled)
73 log.debug('Diff caching enabled: %s', caching_enabled)
74 return caching_enabled
74 return caching_enabled
75
75
76 def _commit(self, commit_id_range, method):
76 def _commit(self, commit_id_range, method):
77 _ = self.request.translate
77 _ = self.request.translate
78 c = self.load_default_context()
78 c = self.load_default_context()
79 c.fulldiff = self.request.GET.get('fulldiff')
79 c.fulldiff = self.request.GET.get('fulldiff')
80 redirect_to_combined = str2bool(self.request.GET.get('redirect_combined'))
80
81
81 # fetch global flags of ignore ws or context lines
82 # fetch global flags of ignore ws or context lines
82 diff_context = get_diff_context(self.request)
83 diff_context = get_diff_context(self.request)
83 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
84 hide_whitespace_changes = get_diff_whitespace_flag(self.request)
84
85
85 # diff_limit will cut off the whole diff if the limit is applied
86 # diff_limit will cut off the whole diff if the limit is applied
86 # otherwise it will just hide the big files from the front-end
87 # otherwise it will just hide the big files from the front-end
87 diff_limit = c.visual.cut_off_limit_diff
88 diff_limit = c.visual.cut_off_limit_diff
88 file_limit = c.visual.cut_off_limit_file
89 file_limit = c.visual.cut_off_limit_file
89
90
90 # get ranges of commit ids if preset
91 # get ranges of commit ids if preset
91 commit_range = commit_id_range.split('...')[:2]
92 commit_range = commit_id_range.split('...')[:2]
92
93
93 try:
94 try:
94 pre_load = ['affected_files', 'author', 'branch', 'date',
95 pre_load = ['affected_files', 'author', 'branch', 'date',
95 'message', 'parents']
96 'message', 'parents']
96 if self.rhodecode_vcs_repo.alias == 'hg':
97 if self.rhodecode_vcs_repo.alias == 'hg':
97 pre_load += ['hidden', 'obsolete', 'phase']
98 pre_load += ['hidden', 'obsolete', 'phase']
98
99
99 if len(commit_range) == 2:
100 if len(commit_range) == 2:
100 commits = self.rhodecode_vcs_repo.get_commits(
101 commits = self.rhodecode_vcs_repo.get_commits(
101 start_id=commit_range[0], end_id=commit_range[1],
102 start_id=commit_range[0], end_id=commit_range[1],
102 pre_load=pre_load, translate_tags=False)
103 pre_load=pre_load, translate_tags=False)
103 commits = list(commits)
104 commits = list(commits)
104 else:
105 else:
105 commits = [self.rhodecode_vcs_repo.get_commit(
106 commits = [self.rhodecode_vcs_repo.get_commit(
106 commit_id=commit_id_range, pre_load=pre_load)]
107 commit_id=commit_id_range, pre_load=pre_load)]
107
108
108 c.commit_ranges = commits
109 c.commit_ranges = commits
109 if not c.commit_ranges:
110 if not c.commit_ranges:
110 raise RepositoryError('The commit range returned an empty result')
111 raise RepositoryError('The commit range returned an empty result')
111 except CommitDoesNotExistError as e:
112 except CommitDoesNotExistError as e:
112 msg = _('No such commit exists. Org exception: `{}`').format(e)
113 msg = _('No such commit exists. Org exception: `{}`').format(e)
113 h.flash(msg, category='error')
114 h.flash(msg, category='error')
114 raise HTTPNotFound()
115 raise HTTPNotFound()
115 except Exception:
116 except Exception:
116 log.exception("General failure")
117 log.exception("General failure")
117 raise HTTPNotFound()
118 raise HTTPNotFound()
118 single_commit = len(c.commit_ranges) == 1
119 single_commit = len(c.commit_ranges) == 1
119
120
121 if redirect_to_combined and not single_commit:
122 source_ref = getattr(c.commit_ranges[0].parents[0]
123 if c.commit_ranges[0].parents else h.EmptyCommit(), 'raw_id')
124 target_ref = c.commit_ranges[-1].raw_id
125 next_url = h.route_path(
126 'repo_compare',
127 repo_name=c.repo_name,
128 source_ref_type='rev',
129 source_ref=source_ref,
130 target_ref_type='rev',
131 target_ref=target_ref)
132 raise HTTPFound(next_url)
133
120 c.changes = OrderedDict()
134 c.changes = OrderedDict()
121 c.lines_added = 0
135 c.lines_added = 0
122 c.lines_deleted = 0
136 c.lines_deleted = 0
123
137
124 # auto collapse if we have more than limit
138 # auto collapse if we have more than limit
125 collapse_limit = diffs.DiffProcessor._collapse_commits_over
139 collapse_limit = diffs.DiffProcessor._collapse_commits_over
126 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
140 c.collapse_all_commits = len(c.commit_ranges) > collapse_limit
127
141
128 c.commit_statuses = ChangesetStatus.STATUSES
142 c.commit_statuses = ChangesetStatus.STATUSES
129 c.inline_comments = []
143 c.inline_comments = []
130 c.files = []
144 c.files = []
131
145
132 c.comments = []
146 c.comments = []
133 c.unresolved_comments = []
147 c.unresolved_comments = []
134 c.resolved_comments = []
148 c.resolved_comments = []
135
149
136 # Single commit
150 # Single commit
137 if single_commit:
151 if single_commit:
138 commit = c.commit_ranges[0]
152 commit = c.commit_ranges[0]
139 c.comments = CommentsModel().get_comments(
153 c.comments = CommentsModel().get_comments(
140 self.db_repo.repo_id,
154 self.db_repo.repo_id,
141 revision=commit.raw_id)
155 revision=commit.raw_id)
142
156
143 # comments from PR
157 # comments from PR
144 statuses = ChangesetStatusModel().get_statuses(
158 statuses = ChangesetStatusModel().get_statuses(
145 self.db_repo.repo_id, commit.raw_id,
159 self.db_repo.repo_id, commit.raw_id,
146 with_revisions=True)
160 with_revisions=True)
147
161
148 prs = set()
162 prs = set()
149 reviewers = list()
163 reviewers = list()
150 reviewers_duplicates = set() # to not have duplicates from multiple votes
164 reviewers_duplicates = set() # to not have duplicates from multiple votes
151 for c_status in statuses:
165 for c_status in statuses:
152
166
153 # extract associated pull-requests from votes
167 # extract associated pull-requests from votes
154 if c_status.pull_request:
168 if c_status.pull_request:
155 prs.add(c_status.pull_request)
169 prs.add(c_status.pull_request)
156
170
157 # extract reviewers
171 # extract reviewers
158 _user_id = c_status.author.user_id
172 _user_id = c_status.author.user_id
159 if _user_id not in reviewers_duplicates:
173 if _user_id not in reviewers_duplicates:
160 reviewers.append(
174 reviewers.append(
161 StrictAttributeDict({
175 StrictAttributeDict({
162 'user': c_status.author,
176 'user': c_status.author,
163
177
164 # fake attributed for commit, page that we don't have
178 # fake attributed for commit, page that we don't have
165 # but we share the display with PR page
179 # but we share the display with PR page
166 'mandatory': False,
180 'mandatory': False,
167 'reasons': [],
181 'reasons': [],
168 'rule_user_group_data': lambda: None
182 'rule_user_group_data': lambda: None
169 })
183 })
170 )
184 )
171 reviewers_duplicates.add(_user_id)
185 reviewers_duplicates.add(_user_id)
172
186
173 c.reviewers_count = len(reviewers)
187 c.reviewers_count = len(reviewers)
174 c.observers_count = 0
188 c.observers_count = 0
175
189
176 # from associated statuses, check the pull requests, and
190 # from associated statuses, check the pull requests, and
177 # show comments from them
191 # show comments from them
178 for pr in prs:
192 for pr in prs:
179 c.comments.extend(pr.comments)
193 c.comments.extend(pr.comments)
180
194
181 c.unresolved_comments = CommentsModel()\
195 c.unresolved_comments = CommentsModel()\
182 .get_commit_unresolved_todos(commit.raw_id)
196 .get_commit_unresolved_todos(commit.raw_id)
183 c.resolved_comments = CommentsModel()\
197 c.resolved_comments = CommentsModel()\
184 .get_commit_resolved_todos(commit.raw_id)
198 .get_commit_resolved_todos(commit.raw_id)
185
199
186 c.inline_comments_flat = CommentsModel()\
200 c.inline_comments_flat = CommentsModel()\
187 .get_commit_inline_comments(commit.raw_id)
201 .get_commit_inline_comments(commit.raw_id)
188
202
189 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
203 review_statuses = ChangesetStatusModel().aggregate_votes_by_user(
190 statuses, reviewers)
204 statuses, reviewers)
191
205
192 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
206 c.commit_review_status = ChangesetStatus.STATUS_NOT_REVIEWED
193
207
194 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
208 c.commit_set_reviewers_data_json = collections.OrderedDict({'reviewers': []})
195
209
196 for review_obj, member, reasons, mandatory, status in review_statuses:
210 for review_obj, member, reasons, mandatory, status in review_statuses:
197 member_reviewer = h.reviewer_as_json(
211 member_reviewer = h.reviewer_as_json(
198 member, reasons=reasons, mandatory=mandatory, role=None,
212 member, reasons=reasons, mandatory=mandatory, role=None,
199 user_group=None
213 user_group=None
200 )
214 )
201
215
202 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
216 current_review_status = status[0][1].status if status else ChangesetStatus.STATUS_NOT_REVIEWED
203 member_reviewer['review_status'] = current_review_status
217 member_reviewer['review_status'] = current_review_status
204 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
218 member_reviewer['review_status_label'] = h.commit_status_lbl(current_review_status)
205 member_reviewer['allowed_to_update'] = False
219 member_reviewer['allowed_to_update'] = False
206 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
220 c.commit_set_reviewers_data_json['reviewers'].append(member_reviewer)
207
221
208 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
222 c.commit_set_reviewers_data_json = json.dumps(c.commit_set_reviewers_data_json)
209
223
210 # NOTE(marcink): this uses the same voting logic as in pull-requests
224 # NOTE(marcink): this uses the same voting logic as in pull-requests
211 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
225 c.commit_review_status = ChangesetStatusModel().calculate_status(review_statuses)
212 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
226 c.commit_broadcast_channel = channelstream.comment_channel(c.repo_name, commit_obj=commit)
213
227
214 diff = None
228 diff = None
215 # Iterate over ranges (default commit view is always one commit)
229 # Iterate over ranges (default commit view is always one commit)
216 for commit in c.commit_ranges:
230 for commit in c.commit_ranges:
217 c.changes[commit.raw_id] = []
231 c.changes[commit.raw_id] = []
218
232
219 commit2 = commit
233 commit2 = commit
220 commit1 = commit.first_parent
234 commit1 = commit.first_parent
221
235
222 if method == 'show':
236 if method == 'show':
223 inline_comments = CommentsModel().get_inline_comments(
237 inline_comments = CommentsModel().get_inline_comments(
224 self.db_repo.repo_id, revision=commit.raw_id)
238 self.db_repo.repo_id, revision=commit.raw_id)
225 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
239 c.inline_cnt = len(CommentsModel().get_inline_comments_as_list(
226 inline_comments))
240 inline_comments))
227 c.inline_comments = inline_comments
241 c.inline_comments = inline_comments
228
242
229 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
243 cache_path = self.rhodecode_vcs_repo.get_create_shadow_cache_pr_path(
230 self.db_repo)
244 self.db_repo)
231 cache_file_path = diff_cache_exist(
245 cache_file_path = diff_cache_exist(
232 cache_path, 'diff', commit.raw_id,
246 cache_path, 'diff', commit.raw_id,
233 hide_whitespace_changes, diff_context, c.fulldiff)
247 hide_whitespace_changes, diff_context, c.fulldiff)
234
248
235 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
249 caching_enabled = self._is_diff_cache_enabled(self.db_repo)
236 force_recache = str2bool(self.request.GET.get('force_recache'))
250 force_recache = str2bool(self.request.GET.get('force_recache'))
237
251
238 cached_diff = None
252 cached_diff = None
239 if caching_enabled:
253 if caching_enabled:
240 cached_diff = load_cached_diff(cache_file_path)
254 cached_diff = load_cached_diff(cache_file_path)
241
255
242 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
256 has_proper_diff_cache = cached_diff and cached_diff.get('diff')
243 if not force_recache and has_proper_diff_cache:
257 if not force_recache and has_proper_diff_cache:
244 diffset = cached_diff['diff']
258 diffset = cached_diff['diff']
245 else:
259 else:
246 vcs_diff = self.rhodecode_vcs_repo.get_diff(
260 vcs_diff = self.rhodecode_vcs_repo.get_diff(
247 commit1, commit2,
261 commit1, commit2,
248 ignore_whitespace=hide_whitespace_changes,
262 ignore_whitespace=hide_whitespace_changes,
249 context=diff_context)
263 context=diff_context)
250
264
251 diff_processor = diffs.DiffProcessor(
265 diff_processor = diffs.DiffProcessor(
252 vcs_diff, format='newdiff', diff_limit=diff_limit,
266 vcs_diff, format='newdiff', diff_limit=diff_limit,
253 file_limit=file_limit, show_full_diff=c.fulldiff)
267 file_limit=file_limit, show_full_diff=c.fulldiff)
254
268
255 _parsed = diff_processor.prepare()
269 _parsed = diff_processor.prepare()
256
270
257 diffset = codeblocks.DiffSet(
271 diffset = codeblocks.DiffSet(
258 repo_name=self.db_repo_name,
272 repo_name=self.db_repo_name,
259 source_node_getter=codeblocks.diffset_node_getter(commit1),
273 source_node_getter=codeblocks.diffset_node_getter(commit1),
260 target_node_getter=codeblocks.diffset_node_getter(commit2))
274 target_node_getter=codeblocks.diffset_node_getter(commit2))
261
275
262 diffset = self.path_filter.render_patchset_filtered(
276 diffset = self.path_filter.render_patchset_filtered(
263 diffset, _parsed, commit1.raw_id, commit2.raw_id)
277 diffset, _parsed, commit1.raw_id, commit2.raw_id)
264
278
265 # save cached diff
279 # save cached diff
266 if caching_enabled:
280 if caching_enabled:
267 cache_diff(cache_file_path, diffset, None)
281 cache_diff(cache_file_path, diffset, None)
268
282
269 c.limited_diff = diffset.limited_diff
283 c.limited_diff = diffset.limited_diff
270 c.changes[commit.raw_id] = diffset
284 c.changes[commit.raw_id] = diffset
271 else:
285 else:
272 # TODO(marcink): no cache usage here...
286 # TODO(marcink): no cache usage here...
273 _diff = self.rhodecode_vcs_repo.get_diff(
287 _diff = self.rhodecode_vcs_repo.get_diff(
274 commit1, commit2,
288 commit1, commit2,
275 ignore_whitespace=hide_whitespace_changes, context=diff_context)
289 ignore_whitespace=hide_whitespace_changes, context=diff_context)
276 diff_processor = diffs.DiffProcessor(
290 diff_processor = diffs.DiffProcessor(
277 _diff, format='newdiff', diff_limit=diff_limit,
291 _diff, format='newdiff', diff_limit=diff_limit,
278 file_limit=file_limit, show_full_diff=c.fulldiff)
292 file_limit=file_limit, show_full_diff=c.fulldiff)
279 # downloads/raw we only need RAW diff nothing else
293 # downloads/raw we only need RAW diff nothing else
280 diff = self.path_filter.get_raw_patch(diff_processor)
294 diff = self.path_filter.get_raw_patch(diff_processor)
281 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
295 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
282
296
283 # sort comments by how they were generated
297 # sort comments by how they were generated
284 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
298 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
285 c.at_version_num = None
299 c.at_version_num = None
286
300
287 if len(c.commit_ranges) == 1:
301 if len(c.commit_ranges) == 1:
288 c.commit = c.commit_ranges[0]
302 c.commit = c.commit_ranges[0]
289 c.parent_tmpl = ''.join(
303 c.parent_tmpl = ''.join(
290 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
304 '# Parent %s\n' % x.raw_id for x in c.commit.parents)
291
305
292 if method == 'download':
306 if method == 'download':
293 response = Response(diff)
307 response = Response(diff)
294 response.content_type = 'text/plain'
308 response.content_type = 'text/plain'
295 response.content_disposition = (
309 response.content_disposition = (
296 'attachment; filename=%s.diff' % commit_id_range[:12])
310 'attachment; filename=%s.diff' % commit_id_range[:12])
297 return response
311 return response
298 elif method == 'patch':
312 elif method == 'patch':
299 c.diff = safe_unicode(diff)
313 c.diff = safe_unicode(diff)
300 patch = render(
314 patch = render(
301 'rhodecode:templates/changeset/patch_changeset.mako',
315 'rhodecode:templates/changeset/patch_changeset.mako',
302 self._get_template_context(c), self.request)
316 self._get_template_context(c), self.request)
303 response = Response(patch)
317 response = Response(patch)
304 response.content_type = 'text/plain'
318 response.content_type = 'text/plain'
305 return response
319 return response
306 elif method == 'raw':
320 elif method == 'raw':
307 response = Response(diff)
321 response = Response(diff)
308 response.content_type = 'text/plain'
322 response.content_type = 'text/plain'
309 return response
323 return response
310 elif method == 'show':
324 elif method == 'show':
311 if len(c.commit_ranges) == 1:
325 if len(c.commit_ranges) == 1:
312 html = render(
326 html = render(
313 'rhodecode:templates/changeset/changeset.mako',
327 'rhodecode:templates/changeset/changeset.mako',
314 self._get_template_context(c), self.request)
328 self._get_template_context(c), self.request)
315 return Response(html)
329 return Response(html)
316 else:
330 else:
317 c.ancestor = None
331 c.ancestor = None
318 c.target_repo = self.db_repo
332 c.target_repo = self.db_repo
319 html = render(
333 html = render(
320 'rhodecode:templates/changeset/changeset_range.mako',
334 'rhodecode:templates/changeset/changeset_range.mako',
321 self._get_template_context(c), self.request)
335 self._get_template_context(c), self.request)
322 return Response(html)
336 return Response(html)
323
337
324 raise HTTPBadRequest()
338 raise HTTPBadRequest()
325
339
326 @LoginRequired()
340 @LoginRequired()
327 @HasRepoPermissionAnyDecorator(
341 @HasRepoPermissionAnyDecorator(
328 'repository.read', 'repository.write', 'repository.admin')
342 'repository.read', 'repository.write', 'repository.admin')
329 @view_config(
343 @view_config(
330 route_name='repo_commit', request_method='GET',
344 route_name='repo_commit', request_method='GET',
331 renderer=None)
345 renderer=None)
332 def repo_commit_show(self):
346 def repo_commit_show(self):
333 commit_id = self.request.matchdict['commit_id']
347 commit_id = self.request.matchdict['commit_id']
334 return self._commit(commit_id, method='show')
348 return self._commit(commit_id, method='show')
335
349
336 @LoginRequired()
350 @LoginRequired()
337 @HasRepoPermissionAnyDecorator(
351 @HasRepoPermissionAnyDecorator(
338 'repository.read', 'repository.write', 'repository.admin')
352 'repository.read', 'repository.write', 'repository.admin')
339 @view_config(
353 @view_config(
340 route_name='repo_commit_raw', request_method='GET',
354 route_name='repo_commit_raw', request_method='GET',
341 renderer=None)
355 renderer=None)
342 @view_config(
356 @view_config(
343 route_name='repo_commit_raw_deprecated', request_method='GET',
357 route_name='repo_commit_raw_deprecated', request_method='GET',
344 renderer=None)
358 renderer=None)
345 def repo_commit_raw(self):
359 def repo_commit_raw(self):
346 commit_id = self.request.matchdict['commit_id']
360 commit_id = self.request.matchdict['commit_id']
347 return self._commit(commit_id, method='raw')
361 return self._commit(commit_id, method='raw')
348
362
349 @LoginRequired()
363 @LoginRequired()
350 @HasRepoPermissionAnyDecorator(
364 @HasRepoPermissionAnyDecorator(
351 'repository.read', 'repository.write', 'repository.admin')
365 'repository.read', 'repository.write', 'repository.admin')
352 @view_config(
366 @view_config(
353 route_name='repo_commit_patch', request_method='GET',
367 route_name='repo_commit_patch', request_method='GET',
354 renderer=None)
368 renderer=None)
355 def repo_commit_patch(self):
369 def repo_commit_patch(self):
356 commit_id = self.request.matchdict['commit_id']
370 commit_id = self.request.matchdict['commit_id']
357 return self._commit(commit_id, method='patch')
371 return self._commit(commit_id, method='patch')
358
372
359 @LoginRequired()
373 @LoginRequired()
360 @HasRepoPermissionAnyDecorator(
374 @HasRepoPermissionAnyDecorator(
361 'repository.read', 'repository.write', 'repository.admin')
375 'repository.read', 'repository.write', 'repository.admin')
362 @view_config(
376 @view_config(
363 route_name='repo_commit_download', request_method='GET',
377 route_name='repo_commit_download', request_method='GET',
364 renderer=None)
378 renderer=None)
365 def repo_commit_download(self):
379 def repo_commit_download(self):
366 commit_id = self.request.matchdict['commit_id']
380 commit_id = self.request.matchdict['commit_id']
367 return self._commit(commit_id, method='download')
381 return self._commit(commit_id, method='download')
368
382
383 def _commit_comments_create(self, commit_id, comments):
384 _ = self.request.translate
385 data = {}
386 if not comments:
387 return
388
389 commit = self.db_repo.get_commit(commit_id)
390
391 all_drafts = len([x for x in comments if str2bool(x['is_draft'])]) == len(comments)
392 for entry in comments:
393 c = self.load_default_context()
394 comment_type = entry['comment_type']
395 text = entry['text']
396 status = entry['status']
397 is_draft = str2bool(entry['is_draft'])
398 resolves_comment_id = entry['resolves_comment_id']
399 f_path = entry['f_path']
400 line_no = entry['line']
401 target_elem_id = 'file-{}'.format(h.safeid(h.safe_unicode(f_path)))
402
403 if status:
404 text = text or (_('Status change %(transition_icon)s %(status)s')
405 % {'transition_icon': '>',
406 'status': ChangesetStatus.get_status_lbl(status)})
407
408 comment = CommentsModel().create(
409 text=text,
410 repo=self.db_repo.repo_id,
411 user=self._rhodecode_db_user.user_id,
412 commit_id=commit_id,
413 f_path=f_path,
414 line_no=line_no,
415 status_change=(ChangesetStatus.get_status_lbl(status)
416 if status else None),
417 status_change_type=status,
418 comment_type=comment_type,
419 is_draft=is_draft,
420 resolves_comment_id=resolves_comment_id,
421 auth_user=self._rhodecode_user,
422 send_email=not is_draft, # skip notification for draft comments
423 )
424 is_inline = comment.is_inline
425
426 # get status if set !
427 if status:
428 # `dont_allow_on_closed_pull_request = True` means
429 # if latest status was from pull request and it's closed
430 # disallow changing status !
431
432 try:
433 ChangesetStatusModel().set_status(
434 self.db_repo.repo_id,
435 status,
436 self._rhodecode_db_user.user_id,
437 comment,
438 revision=commit_id,
439 dont_allow_on_closed_pull_request=True
440 )
441 except StatusChangeOnClosedPullRequestError:
442 msg = _('Changing the status of a commit associated with '
443 'a closed pull request is not allowed')
444 log.exception(msg)
445 h.flash(msg, category='warning')
446 raise HTTPFound(h.route_path(
447 'repo_commit', repo_name=self.db_repo_name,
448 commit_id=commit_id))
449
450 Session().flush()
451 # this is somehow required to get access to some relationship
452 # loaded on comment
453 Session().refresh(comment)
454
455 # skip notifications for drafts
456 if not is_draft:
457 CommentsModel().trigger_commit_comment_hook(
458 self.db_repo, self._rhodecode_user, 'create',
459 data={'comment': comment, 'commit': commit})
460
461 comment_id = comment.comment_id
462 data[comment_id] = {
463 'target_id': target_elem_id
464 }
465 Session().flush()
466
467 c.co = comment
468 c.at_version_num = 0
469 c.is_new = True
470 rendered_comment = render(
471 'rhodecode:templates/changeset/changeset_comment_block.mako',
472 self._get_template_context(c), self.request)
473
474 data[comment_id].update(comment.get_dict())
475 data[comment_id].update({'rendered_text': rendered_comment})
476
477 # finalize, commit and redirect
478 Session().commit()
479
480 # skip channelstream for draft comments
481 if not all_drafts:
482 comment_broadcast_channel = channelstream.comment_channel(
483 self.db_repo_name, commit_obj=commit)
484
485 comment_data = data
486 posted_comment_type = 'inline' if is_inline else 'general'
487 if len(data) == 1:
488 msg = _('posted {} new {} comment').format(len(data), posted_comment_type)
489 else:
490 msg = _('posted {} new {} comments').format(len(data), posted_comment_type)
491
492 channelstream.comment_channelstream_push(
493 self.request, comment_broadcast_channel, self._rhodecode_user, msg,
494 comment_data=comment_data)
495
496 return data
497
369 @LoginRequired()
498 @LoginRequired()
370 @NotAnonymous()
499 @NotAnonymous()
371 @HasRepoPermissionAnyDecorator(
500 @HasRepoPermissionAnyDecorator(
372 'repository.read', 'repository.write', 'repository.admin')
501 'repository.read', 'repository.write', 'repository.admin')
373 @CSRFRequired()
502 @CSRFRequired()
374 @view_config(
503 @view_config(
375 route_name='repo_commit_comment_create', request_method='POST',
504 route_name='repo_commit_comment_create', request_method='POST',
376 renderer='json_ext')
505 renderer='json_ext')
377 def repo_commit_comment_create(self):
506 def repo_commit_comment_create(self):
378 _ = self.request.translate
507 _ = self.request.translate
379 commit_id = self.request.matchdict['commit_id']
508 commit_id = self.request.matchdict['commit_id']
380
509
381 c = self.load_default_context()
382 status = self.request.POST.get('changeset_status', None)
383 text = self.request.POST.get('text')
384 comment_type = self.request.POST.get('comment_type')
385 resolves_comment_id = self.request.POST.get('resolves_comment_id', None)
386
387 if status:
388 text = text or (_('Status change %(transition_icon)s %(status)s')
389 % {'transition_icon': '>',
390 'status': ChangesetStatus.get_status_lbl(status)})
391
392 multi_commit_ids = []
510 multi_commit_ids = []
393 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
511 for _commit_id in self.request.POST.get('commit_ids', '').split(','):
394 if _commit_id not in ['', None, EmptyCommit.raw_id]:
512 if _commit_id not in ['', None, EmptyCommit.raw_id]:
395 if _commit_id not in multi_commit_ids:
513 if _commit_id not in multi_commit_ids:
396 multi_commit_ids.append(_commit_id)
514 multi_commit_ids.append(_commit_id)
397
515
398 commit_ids = multi_commit_ids or [commit_id]
516 commit_ids = multi_commit_ids or [commit_id]
399
517
400 comment = None
518 data = []
519 # Multiple comments for each passed commit id
401 for current_id in filter(None, commit_ids):
520 for current_id in filter(None, commit_ids):
402 comment = CommentsModel().create(
521 comment_data = {
403 text=text,
522 'comment_type': self.request.POST.get('comment_type'),
404 repo=self.db_repo.repo_id,
523 'text': self.request.POST.get('text'),
405 user=self._rhodecode_db_user.user_id,
524 'status': self.request.POST.get('changeset_status', None),
406 commit_id=current_id,
525 'is_draft': self.request.POST.get('draft'),
407 f_path=self.request.POST.get('f_path'),
526 'resolves_comment_id': self.request.POST.get('resolves_comment_id', None),
408 line_no=self.request.POST.get('line'),
527 'close_pull_request': self.request.POST.get('close_pull_request'),
409 status_change=(ChangesetStatus.get_status_lbl(status)
528 'f_path': self.request.POST.get('f_path'),
410 if status else None),
529 'line': self.request.POST.get('line'),
411 status_change_type=status,
530 }
412 comment_type=comment_type,
531 comment = self._commit_comments_create(commit_id=current_id, comments=[comment_data])
413 resolves_comment_id=resolves_comment_id,
532 data.append(comment)
414 auth_user=self._rhodecode_user
415 )
416 is_inline = comment.is_inline
417
418 # get status if set !
419 if status:
420 # if latest status was from pull request and it's closed
421 # disallow changing status !
422 # dont_allow_on_closed_pull_request = True !
423
533
424 try:
534 return data if len(data) > 1 else data[0]
425 ChangesetStatusModel().set_status(
426 self.db_repo.repo_id,
427 status,
428 self._rhodecode_db_user.user_id,
429 comment,
430 revision=current_id,
431 dont_allow_on_closed_pull_request=True
432 )
433 except StatusChangeOnClosedPullRequestError:
434 msg = _('Changing the status of a commit associated with '
435 'a closed pull request is not allowed')
436 log.exception(msg)
437 h.flash(msg, category='warning')
438 raise HTTPFound(h.route_path(
439 'repo_commit', repo_name=self.db_repo_name,
440 commit_id=current_id))
441
442 commit = self.db_repo.get_commit(current_id)
443 CommentsModel().trigger_commit_comment_hook(
444 self.db_repo, self._rhodecode_user, 'create',
445 data={'comment': comment, 'commit': commit})
446
447 # finalize, commit and redirect
448 Session().commit()
449
450 data = {
451 'target_id': h.safeid(h.safe_unicode(
452 self.request.POST.get('f_path'))),
453 }
454 if comment:
455 c.co = comment
456 c.at_version_num = 0
457 rendered_comment = render(
458 'rhodecode:templates/changeset/changeset_comment_block.mako',
459 self._get_template_context(c), self.request)
460
461 data.update(comment.get_dict())
462 data.update({'rendered_text': rendered_comment})
463
464 comment_broadcast_channel = channelstream.comment_channel(
465 self.db_repo_name, commit_obj=commit)
466
467 comment_data = data
468 comment_type = 'inline' if is_inline else 'general'
469 channelstream.comment_channelstream_push(
470 self.request, comment_broadcast_channel, self._rhodecode_user,
471 _('posted a new {} comment').format(comment_type),
472 comment_data=comment_data)
473
474 return data
475
535
476 @LoginRequired()
536 @LoginRequired()
477 @NotAnonymous()
537 @NotAnonymous()
478 @HasRepoPermissionAnyDecorator(
538 @HasRepoPermissionAnyDecorator(
479 'repository.read', 'repository.write', 'repository.admin')
539 'repository.read', 'repository.write', 'repository.admin')
480 @CSRFRequired()
540 @CSRFRequired()
481 @view_config(
541 @view_config(
482 route_name='repo_commit_comment_preview', request_method='POST',
542 route_name='repo_commit_comment_preview', request_method='POST',
483 renderer='string', xhr=True)
543 renderer='string', xhr=True)
484 def repo_commit_comment_preview(self):
544 def repo_commit_comment_preview(self):
485 # Technically a CSRF token is not needed as no state changes with this
545 # Technically a CSRF token is not needed as no state changes with this
486 # call. However, as this is a POST is better to have it, so automated
546 # call. However, as this is a POST is better to have it, so automated
487 # tools don't flag it as potential CSRF.
547 # tools don't flag it as potential CSRF.
488 # Post is required because the payload could be bigger than the maximum
548 # Post is required because the payload could be bigger than the maximum
489 # allowed by GET.
549 # allowed by GET.
490
550
491 text = self.request.POST.get('text')
551 text = self.request.POST.get('text')
492 renderer = self.request.POST.get('renderer') or 'rst'
552 renderer = self.request.POST.get('renderer') or 'rst'
493 if text:
553 if text:
494 return h.render(text, renderer=renderer, mentions=True,
554 return h.render(text, renderer=renderer, mentions=True,
495 repo_name=self.db_repo_name)
555 repo_name=self.db_repo_name)
496 return ''
556 return ''
497
557
498 @LoginRequired()
558 @LoginRequired()
499 @HasRepoPermissionAnyDecorator(
559 @HasRepoPermissionAnyDecorator(
500 'repository.read', 'repository.write', 'repository.admin')
560 'repository.read', 'repository.write', 'repository.admin')
501 @CSRFRequired()
561 @CSRFRequired()
502 @view_config(
562 @view_config(
503 route_name='repo_commit_comment_history_view', request_method='POST',
563 route_name='repo_commit_comment_history_view', request_method='POST',
504 renderer='string', xhr=True)
564 renderer='string', xhr=True)
505 def repo_commit_comment_history_view(self):
565 def repo_commit_comment_history_view(self):
506 c = self.load_default_context()
566 c = self.load_default_context()
507
567
508 comment_history_id = self.request.matchdict['comment_history_id']
568 comment_history_id = self.request.matchdict['comment_history_id']
509 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
569 comment_history = ChangesetCommentHistory.get_or_404(comment_history_id)
510 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
570 is_repo_comment = comment_history.comment.repo.repo_id == self.db_repo.repo_id
511
571
512 if is_repo_comment:
572 if is_repo_comment:
513 c.comment_history = comment_history
573 c.comment_history = comment_history
514
574
515 rendered_comment = render(
575 rendered_comment = render(
516 'rhodecode:templates/changeset/comment_history.mako',
576 'rhodecode:templates/changeset/comment_history.mako',
517 self._get_template_context(c)
577 self._get_template_context(c)
518 , self.request)
578 , self.request)
519 return rendered_comment
579 return rendered_comment
520 else:
580 else:
521 log.warning('No permissions for user %s to show comment_history_id: %s',
581 log.warning('No permissions for user %s to show comment_history_id: %s',
522 self._rhodecode_db_user, comment_history_id)
582 self._rhodecode_db_user, comment_history_id)
523 raise HTTPNotFound()
583 raise HTTPNotFound()
524
584
525 @LoginRequired()
585 @LoginRequired()
526 @NotAnonymous()
586 @NotAnonymous()
527 @HasRepoPermissionAnyDecorator(
587 @HasRepoPermissionAnyDecorator(
528 'repository.read', 'repository.write', 'repository.admin')
588 'repository.read', 'repository.write', 'repository.admin')
529 @CSRFRequired()
589 @CSRFRequired()
530 @view_config(
590 @view_config(
531 route_name='repo_commit_comment_attachment_upload', request_method='POST',
591 route_name='repo_commit_comment_attachment_upload', request_method='POST',
532 renderer='json_ext', xhr=True)
592 renderer='json_ext', xhr=True)
533 def repo_commit_comment_attachment_upload(self):
593 def repo_commit_comment_attachment_upload(self):
534 c = self.load_default_context()
594 c = self.load_default_context()
535 upload_key = 'attachment'
595 upload_key = 'attachment'
536
596
537 file_obj = self.request.POST.get(upload_key)
597 file_obj = self.request.POST.get(upload_key)
538
598
539 if file_obj is None:
599 if file_obj is None:
540 self.request.response.status = 400
600 self.request.response.status = 400
541 return {'store_fid': None,
601 return {'store_fid': None,
542 'access_path': None,
602 'access_path': None,
543 'error': '{} data field is missing'.format(upload_key)}
603 'error': '{} data field is missing'.format(upload_key)}
544
604
545 if not hasattr(file_obj, 'filename'):
605 if not hasattr(file_obj, 'filename'):
546 self.request.response.status = 400
606 self.request.response.status = 400
547 return {'store_fid': None,
607 return {'store_fid': None,
548 'access_path': None,
608 'access_path': None,
549 'error': 'filename cannot be read from the data field'}
609 'error': 'filename cannot be read from the data field'}
550
610
551 filename = file_obj.filename
611 filename = file_obj.filename
552 file_display_name = filename
612 file_display_name = filename
553
613
554 metadata = {
614 metadata = {
555 'user_uploaded': {'username': self._rhodecode_user.username,
615 'user_uploaded': {'username': self._rhodecode_user.username,
556 'user_id': self._rhodecode_user.user_id,
616 'user_id': self._rhodecode_user.user_id,
557 'ip': self._rhodecode_user.ip_addr}}
617 'ip': self._rhodecode_user.ip_addr}}
558
618
559 # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
619 # TODO(marcink): allow .ini configuration for allowed_extensions, and file-size
560 allowed_extensions = [
620 allowed_extensions = [
561 'gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
621 'gif', '.jpeg', '.jpg', '.png', '.docx', '.gz', '.log', '.pdf',
562 '.pptx', '.txt', '.xlsx', '.zip']
622 '.pptx', '.txt', '.xlsx', '.zip']
563 max_file_size = 10 * 1024 * 1024 # 10MB, also validated via dropzone.js
623 max_file_size = 10 * 1024 * 1024 # 10MB, also validated via dropzone.js
564
624
565 try:
625 try:
566 storage = store_utils.get_file_storage(self.request.registry.settings)
626 storage = store_utils.get_file_storage(self.request.registry.settings)
567 store_uid, metadata = storage.save_file(
627 store_uid, metadata = storage.save_file(
568 file_obj.file, filename, extra_metadata=metadata,
628 file_obj.file, filename, extra_metadata=metadata,
569 extensions=allowed_extensions, max_filesize=max_file_size)
629 extensions=allowed_extensions, max_filesize=max_file_size)
570 except FileNotAllowedException:
630 except FileNotAllowedException:
571 self.request.response.status = 400
631 self.request.response.status = 400
572 permitted_extensions = ', '.join(allowed_extensions)
632 permitted_extensions = ', '.join(allowed_extensions)
573 error_msg = 'File `{}` is not allowed. ' \
633 error_msg = 'File `{}` is not allowed. ' \
574 'Only following extensions are permitted: {}'.format(
634 'Only following extensions are permitted: {}'.format(
575 filename, permitted_extensions)
635 filename, permitted_extensions)
576 return {'store_fid': None,
636 return {'store_fid': None,
577 'access_path': None,
637 'access_path': None,
578 'error': error_msg}
638 'error': error_msg}
579 except FileOverSizeException:
639 except FileOverSizeException:
580 self.request.response.status = 400
640 self.request.response.status = 400
581 limit_mb = h.format_byte_size_binary(max_file_size)
641 limit_mb = h.format_byte_size_binary(max_file_size)
582 return {'store_fid': None,
642 return {'store_fid': None,
583 'access_path': None,
643 'access_path': None,
584 'error': 'File {} is exceeding allowed limit of {}.'.format(
644 'error': 'File {} is exceeding allowed limit of {}.'.format(
585 filename, limit_mb)}
645 filename, limit_mb)}
586
646
587 try:
647 try:
588 entry = FileStore.create(
648 entry = FileStore.create(
589 file_uid=store_uid, filename=metadata["filename"],
649 file_uid=store_uid, filename=metadata["filename"],
590 file_hash=metadata["sha256"], file_size=metadata["size"],
650 file_hash=metadata["sha256"], file_size=metadata["size"],
591 file_display_name=file_display_name,
651 file_display_name=file_display_name,
592 file_description=u'comment attachment `{}`'.format(safe_unicode(filename)),
652 file_description=u'comment attachment `{}`'.format(safe_unicode(filename)),
593 hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
653 hidden=True, check_acl=True, user_id=self._rhodecode_user.user_id,
594 scope_repo_id=self.db_repo.repo_id
654 scope_repo_id=self.db_repo.repo_id
595 )
655 )
596 Session().add(entry)
656 Session().add(entry)
597 Session().commit()
657 Session().commit()
598 log.debug('Stored upload in DB as %s', entry)
658 log.debug('Stored upload in DB as %s', entry)
599 except Exception:
659 except Exception:
600 log.exception('Failed to store file %s', filename)
660 log.exception('Failed to store file %s', filename)
601 self.request.response.status = 400
661 self.request.response.status = 400
602 return {'store_fid': None,
662 return {'store_fid': None,
603 'access_path': None,
663 'access_path': None,
604 'error': 'File {} failed to store in DB.'.format(filename)}
664 'error': 'File {} failed to store in DB.'.format(filename)}
605
665
606 Session().commit()
666 Session().commit()
607
667
608 return {
668 return {
609 'store_fid': store_uid,
669 'store_fid': store_uid,
610 'access_path': h.route_path(
670 'access_path': h.route_path(
611 'download_file', fid=store_uid),
671 'download_file', fid=store_uid),
612 'fqn_access_path': h.route_url(
672 'fqn_access_path': h.route_url(
613 'download_file', fid=store_uid),
673 'download_file', fid=store_uid),
614 'repo_access_path': h.route_path(
674 'repo_access_path': h.route_path(
615 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
675 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
616 'repo_fqn_access_path': h.route_url(
676 'repo_fqn_access_path': h.route_url(
617 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
677 'repo_artifacts_get', repo_name=self.db_repo_name, uid=store_uid),
618 }
678 }
619
679
620 @LoginRequired()
680 @LoginRequired()
621 @NotAnonymous()
681 @NotAnonymous()
622 @HasRepoPermissionAnyDecorator(
682 @HasRepoPermissionAnyDecorator(
623 'repository.read', 'repository.write', 'repository.admin')
683 'repository.read', 'repository.write', 'repository.admin')
624 @CSRFRequired()
684 @CSRFRequired()
625 @view_config(
685 @view_config(
626 route_name='repo_commit_comment_delete', request_method='POST',
686 route_name='repo_commit_comment_delete', request_method='POST',
627 renderer='json_ext')
687 renderer='json_ext')
628 def repo_commit_comment_delete(self):
688 def repo_commit_comment_delete(self):
629 commit_id = self.request.matchdict['commit_id']
689 commit_id = self.request.matchdict['commit_id']
630 comment_id = self.request.matchdict['comment_id']
690 comment_id = self.request.matchdict['comment_id']
631
691
632 comment = ChangesetComment.get_or_404(comment_id)
692 comment = ChangesetComment.get_or_404(comment_id)
633 if not comment:
693 if not comment:
634 log.debug('Comment with id:%s not found, skipping', comment_id)
694 log.debug('Comment with id:%s not found, skipping', comment_id)
635 # comment already deleted in another call probably
695 # comment already deleted in another call probably
636 return True
696 return True
637
697
638 if comment.immutable:
698 if comment.immutable:
639 # don't allow deleting comments that are immutable
699 # don't allow deleting comments that are immutable
640 raise HTTPForbidden()
700 raise HTTPForbidden()
641
701
642 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
702 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
643 super_admin = h.HasPermissionAny('hg.admin')()
703 super_admin = h.HasPermissionAny('hg.admin')()
644 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
704 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
645 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
705 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
646 comment_repo_admin = is_repo_admin and is_repo_comment
706 comment_repo_admin = is_repo_admin and is_repo_comment
647
707
648 if super_admin or comment_owner or comment_repo_admin:
708 if super_admin or comment_owner or comment_repo_admin:
649 CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
709 CommentsModel().delete(comment=comment, auth_user=self._rhodecode_user)
650 Session().commit()
710 Session().commit()
651 return True
711 return True
652 else:
712 else:
653 log.warning('No permissions for user %s to delete comment_id: %s',
713 log.warning('No permissions for user %s to delete comment_id: %s',
654 self._rhodecode_db_user, comment_id)
714 self._rhodecode_db_user, comment_id)
655 raise HTTPNotFound()
715 raise HTTPNotFound()
656
716
657 @LoginRequired()
717 @LoginRequired()
658 @NotAnonymous()
718 @NotAnonymous()
659 @HasRepoPermissionAnyDecorator(
719 @HasRepoPermissionAnyDecorator(
660 'repository.read', 'repository.write', 'repository.admin')
720 'repository.read', 'repository.write', 'repository.admin')
661 @CSRFRequired()
721 @CSRFRequired()
662 @view_config(
722 @view_config(
663 route_name='repo_commit_comment_edit', request_method='POST',
723 route_name='repo_commit_comment_edit', request_method='POST',
664 renderer='json_ext')
724 renderer='json_ext')
665 def repo_commit_comment_edit(self):
725 def repo_commit_comment_edit(self):
666 self.load_default_context()
726 self.load_default_context()
667
727
728 commit_id = self.request.matchdict['commit_id']
668 comment_id = self.request.matchdict['comment_id']
729 comment_id = self.request.matchdict['comment_id']
669 comment = ChangesetComment.get_or_404(comment_id)
730 comment = ChangesetComment.get_or_404(comment_id)
670
731
671 if comment.immutable:
732 if comment.immutable:
672 # don't allow deleting comments that are immutable
733 # don't allow deleting comments that are immutable
673 raise HTTPForbidden()
734 raise HTTPForbidden()
674
735
675 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
736 is_repo_admin = h.HasRepoPermissionAny('repository.admin')(self.db_repo_name)
676 super_admin = h.HasPermissionAny('hg.admin')()
737 super_admin = h.HasPermissionAny('hg.admin')()
677 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
738 comment_owner = (comment.author.user_id == self._rhodecode_db_user.user_id)
678 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
739 is_repo_comment = comment.repo.repo_id == self.db_repo.repo_id
679 comment_repo_admin = is_repo_admin and is_repo_comment
740 comment_repo_admin = is_repo_admin and is_repo_comment
680
741
681 if super_admin or comment_owner or comment_repo_admin:
742 if super_admin or comment_owner or comment_repo_admin:
682 text = self.request.POST.get('text')
743 text = self.request.POST.get('text')
683 version = self.request.POST.get('version')
744 version = self.request.POST.get('version')
684 if text == comment.text:
745 if text == comment.text:
685 log.warning(
746 log.warning(
686 'Comment(repo): '
747 'Comment(repo): '
687 'Trying to create new version '
748 'Trying to create new version '
688 'with the same comment body {}'.format(
749 'with the same comment body {}'.format(
689 comment_id,
750 comment_id,
690 )
751 )
691 )
752 )
692 raise HTTPNotFound()
753 raise HTTPNotFound()
693
754
694 if version.isdigit():
755 if version.isdigit():
                 version = int(version)
             else:
                 log.warning(
                     'Comment(repo): Wrong version type {} {} '
                     'for comment {}'.format(
                         version,
                         type(version),
                         comment_id,
                     )
                 )
                 raise HTTPNotFound()

             try:
                 comment_history = CommentsModel().edit(
                     comment_id=comment_id,
                     text=text,
                     auth_user=self._rhodecode_user,
                     version=version,
                 )
             except CommentVersionMismatch:
                 raise HTTPConflict()

             if not comment_history:
                 raise HTTPNotFound()

-            commit_id = self.request.matchdict['commit_id']
-            commit = self.db_repo.get_commit(commit_id)
-            CommentsModel().trigger_commit_comment_hook(
-                self.db_repo, self._rhodecode_user, 'edit',
-                data={'comment': comment, 'commit': commit})
+            if not comment.draft:
+                commit = self.db_repo.get_commit(commit_id)
+                CommentsModel().trigger_commit_comment_hook(
+                    self.db_repo, self._rhodecode_user, 'edit',
+                    data={'comment': comment, 'commit': commit})

             Session().commit()
             return {
                 'comment_history_id': comment_history.comment_history_id,
                 'comment_id': comment.comment_id,
                 'comment_version': comment_history.version,
                 'comment_author_username': comment_history.author.username,
                 'comment_author_gravatar': h.gravatar_url(comment_history.author.email, 16),
                 'comment_created_on': h.age_component(comment_history.created_on,
                                                       time_is_local=True),
             }
         else:
             log.warning('No permissions for user %s to edit comment_id: %s',
                         self._rhodecode_db_user, comment_id)
             raise HTTPNotFound()

     @LoginRequired()
     @HasRepoPermissionAnyDecorator(
         'repository.read', 'repository.write', 'repository.admin')
     @view_config(
         route_name='repo_commit_data', request_method='GET',
         renderer='json_ext', xhr=True)
     def repo_commit_data(self):
         commit_id = self.request.matchdict['commit_id']
         self.load_default_context()

         try:
             return self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
         except CommitDoesNotExistError as e:
             return EmptyCommit(message=str(e))

     @LoginRequired()
     @HasRepoPermissionAnyDecorator(
         'repository.read', 'repository.write', 'repository.admin')
     @view_config(
         route_name='repo_commit_children', request_method='GET',
         renderer='json_ext', xhr=True)
     def repo_commit_children(self):
         commit_id = self.request.matchdict['commit_id']
         self.load_default_context()

         try:
             commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
             children = commit.children
         except CommitDoesNotExistError:
             children = []

         result = {"results": children}
         return result

     @LoginRequired()
     @HasRepoPermissionAnyDecorator(
         'repository.read', 'repository.write', 'repository.admin')
     @view_config(
         route_name='repo_commit_parents', request_method='GET',
         renderer='json_ext')
     def repo_commit_parents(self):
         commit_id = self.request.matchdict['commit_id']
         self.load_default_context()

         try:
             commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
             parents = commit.parents
         except CommitDoesNotExistError:
             parents = []
         result = {"results": parents}
         return result
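The hunk above changes one thing in the comment-edit view: the commit-comment hook now fires only when the edited comment is not a draft. Around that change, the edit path relies on an optimistic version check, where the client submits the comment version it last saw and a stale version is rejected as HTTP 409 Conflict via CommentVersionMismatch. Below is a minimal, self-contained sketch of that versioning pattern; the Comment class and its edit() method are illustrative stand-ins, not RhodeCode's CommentsModel API.

# Minimal sketch of optimistic comment versioning (illustrative only, not
# RhodeCode's CommentsModel API): each edit must carry the version the client
# last saw; a stale version raises CommentVersionMismatch, which a web view
# would translate into HTTP 409 Conflict.


class CommentVersionMismatch(Exception):
    """Raised when the submitted version is older than the stored one."""


class Comment(object):
    def __init__(self, text, draft=False):
        self.text = text
        self.draft = draft
        self.version = 0
        self.history = []

    def edit(self, new_text, version):
        # Reject stale edits: someone else already saved a newer version.
        if version != self.version:
            raise CommentVersionMismatch(
                'expected v{}, stored v{}'.format(version, self.version))
        self.history.append((self.version, self.text))
        self.text = new_text
        self.version += 1
        return self.version


comment = Comment('initial review note', draft=True)
new_version = comment.edit('reworded review note', version=0)

try:
    # Stale client: still believes the comment is at version 0.
    comment.edit('conflicting edit', version=0)
except CommentVersionMismatch:
    pass  # the view above maps this case to HTTPConflict (409)

# Mirrors the new guard in the hunk: hooks fire only for non-draft comments.
if not comment.draft:
    print('would trigger the commit-comment hook for version', new_version)

Carrying the last-seen version with every edit keeps two reviewers (or two browser tabs) from silently overwriting each other's changes; on a 409 the client re-fetches the latest comment version and retries.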
Additional modified files are not shown: their content was truncated in the diff view.